A Variational Framework for Laws of Learning

Simplicity and elegance have long been useful criteria for developing successful theories of natural phenomena. Variational methods frame this parsimony principle as precise mathematical statements. In this thesis we showed how learning theories can be formulated using the calculus of variations. Despite the natural way in which learning problems can be formulated […]

Read More »

On the Integration of Logic and Learning

Giannini’s thesis. A key point in the success of machine learning, and in particular deep learning, has been the availability of high-performance computing architectures that allow large amounts of data to be processed. However, this requirement potentially prevents a wider application of machine learning in real-world settings, where the collection of training data is often a […]

Read More »