Simplicity and elegance have always been valuable criteria for developing successful theories of natural phenomena. Variational methods frame these parsimony principles as precise mathematical statements. In this thesis we showed how learning theories can be formulated using the calculus of variations. Despite the natural way in which learning problems can be cast in variational terms, we faced and analyzed the problem of causality, which quickly arises when an evolution problem is formalized through integral functional indexes. We studied how the theory of Lagrange multipliers can be applied to reformulate learning problems for neural networks as constrained variational problems. Finally, we provided a variational formulation of the problem of unsupervised extraction of visual features from videos: in particular, we used regularity, maximization of mutual information, and the enforcement of a dynamical consistency constraint to define an index based on convolutional features, from which the dynamics of the convolutional filters are derived.
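As a generic illustration of the constrained variational viewpoint mentioned above (the notation here is a standard sketch, not the thesis's own), learning over a time horizon $[0,T]$ can be posed as the minimization of an integral functional subject to a constraint, handled via a Lagrange multiplier:

```latex
% Minimize a learning functional over weight trajectories w(t),
% subject to a constraint G(w(t)) = 0 (e.g. a consistency condition):
\min_{w(\cdot)} \; \int_{0}^{T} L\bigl(t, w(t), \dot{w}(t)\bigr)\, dt
\quad \text{s.t.} \quad G\bigl(w(t)\bigr) = 0 .

% Introducing a multiplier \lambda(t), the augmented Lagrangian is
L^{*}\bigl(t, w, \dot{w}, \lambda\bigr)
  = L\bigl(t, w, \dot{w}\bigr) + \lambda(t)\, G\bigl(w\bigr),

% and stationarity of the augmented functional yields the
% Euler--Lagrange equations governing the weight dynamics:
\frac{d}{dt}\,\frac{\partial L^{*}}{\partial \dot{w}}
  - \frac{\partial L^{*}}{\partial w} = 0 .
```

In this picture, the Euler-Lagrange equations play the role of "laws of learning": the weight dynamics are derived from stationarity of the functional rather than prescribed directly.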
A Variational Framework for Laws of Learning
1 April 2020
| Category: PhD Theses