Giuseppe Alessio D’Inverno (University of Siena)
Nov 18, 2020 – 11:45 AM – 12:30 PM
Neural networks are nowadays one of the hottest topics in applied mathematics, and their use spreads over several scientific disciplines, from bioinformatics to artificial intelligence. Besides the analytical approach, which is predominant in the literature, a new geometric point of view has been developed recently. Tools from projective geometry, algebraic geometry, and combinatorics, applied to statistics, provide a new way to manage models with several variables.
In the thesis, we study the algebraic statistics of Boolean neural networks, i.e. neural networks in which a signal can only be 0 or 1, with an interference, represented by a Jukes-Cantor matrix, which affects the transmission of the signal from one node to another.
In this work we are interested in a particular class of Boolean neural networks, which we call funnel networks, where the first n-1 nodes are connected in a chain (we call this particular subnetwork a Markov JC network) and every node of this chain sends a signal to the last node, which acts as the neck of the funnel.
Assuming that this special node returns a 0 or a 1 signal according to a specific threshold, our aim is to retrieve the threshold, for generic funnel networks with n nodes, by analysing the statistical distribution of the network, which can be represented by a tensor. We exploit parametric and algebraic equations of a general funnel network, for any value of the threshold, and prove how the threshold can be retrieved from the resulting tensors.
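To make the setup concrete, the following is a minimal simulation sketch (not the method of the thesis, which works with parametric and algebraic equations): the binary Jukes-Cantor interference is modeled as a symmetric bit-flip channel with a hypothetical flip probability e, the chain of n-1 nodes forms the Markov JC subnetwork, and the last node fires 1 when it receives at least `threshold` ones. The empirical joint distribution is the 2 x ... x 2 tensor the abstract refers to; all function names and parameters here are illustrative assumptions.

```python
import itertools
import random

def jc_flip(bit, e):
    # Binary Jukes-Cantor channel (assumption): flip the bit with probability e
    return bit ^ (random.random() < e)

def funnel_sample(n, e, threshold):
    """One sample from a hypothetical n-node funnel network.

    Nodes 1..n-1 form a Markov chain, each receiving the previous node's
    signal through a JC channel; every chain node also sends its signal,
    again through a JC channel, to node n, which outputs 1 iff it
    receives at least `threshold` ones.
    """
    chain = [random.randint(0, 1)]          # signal at the root node
    for _ in range(n - 2):                  # propagate along the chain
        chain.append(jc_flip(chain[-1], e))
    received = sum(jc_flip(b, e) for b in chain)
    funnel = int(received >= threshold)     # threshold node at the neck
    return tuple(chain) + (funnel,)

def empirical_tensor(n, e, threshold, trials=50_000):
    # Empirical joint distribution of all n node states: a 2 x ... x 2
    # tensor, stored as a dict from outcome tuples to frequencies
    counts = {s: 0 for s in itertools.product((0, 1), repeat=n)}
    for _ in range(trials):
        counts[funnel_sample(n, e, threshold)] += 1
    return {s: c / trials for s, c in counts.items()}

random.seed(0)
T = empirical_tensor(n=4, e=0.1, threshold=2)
print(sum(T.values()))  # the entries of the tensor sum to 1
```

In this toy model, comparing the empirical tensors produced by different `threshold` values is the naive analogue of the identifiability question studied in the thesis: whether the threshold can be read off from the distribution tensor alone.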
Applications to more general types of networks are outlined.