Constrained Affective Computing

Author: Lisa Graziani
Date: May 2021
Topics: Affective Computing, Learning from Constraints, Facial Expression Recognition, Text Emotion Recognition, Speech Emotion Recognition, Facial Expression Generation.

Abstract: Emotions play an important role in daily life, influencing decision-making, human interaction, perception, attention, and self-regulation. They have been studied since ancient times; philosophers have always been interested in analyzing human […]


Language Models for Text Understanding and Generation

Author: Andrea Zugarini
Date: May 2021
Topics: Language Modeling; Language Generation; Language Understanding; Information Extraction.

Abstract: The ability to understand and generate language is one of the most fascinating and peculiar aspects of humankind. We can discuss facts, events, stories, or the most abstract aspects of our existence with other individuals only because of […]


Local Propagation in Neural Network Learning by Architectural Constraints

Author: Matteo Tiezzi
Date: March 2021
Topics: Learning by Constraints; Constraint Optimization; Graph Neural Networks; Lifelong Learning.

Abstract: The feed-forward propagation of signals has played a crucial role in the success of the Artificial Neural Network (ANN) processing scheme. The input patterns undergo a series of stacked parametrized transformations, which foster deep feature […]
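The stacked feed-forward scheme mentioned in the abstract can be sketched in a few lines. This is a purely illustrative toy example (the layer sizes, the ReLU nonlinearity, and the `feed_forward` helper are assumptions for exposition, not Tiezzi's implementation): each layer applies a parametrized affine transform followed by a nonlinearity, x_{l+1} = sigma(W_l x_l + b_l).

```python
import numpy as np

def relu(x):
    # Elementwise nonlinearity applied after each affine transform.
    return np.maximum(0.0, x)

def feed_forward(x, layers):
    """Propagate input x through a list of (W, b) layer parameters."""
    for W, b in layers:
        x = relu(W @ x + b)
    return x

# Two stacked layers: R^3 -> R^4 -> R^2 (sizes chosen arbitrarily).
rng = np.random.default_rng(0)
layers = [(rng.standard_normal((4, 3)), np.zeros(4)),
          (rng.standard_normal((2, 4)), np.zeros(2))]
out = feed_forward(np.ones(3), layers)
print(out.shape)  # (2,)
```

Architectural-constraint approaches such as the one in the thesis revisit exactly this pipeline, treating the layer-to-layer propagation as constraints to be satisfied rather than as a fixed sequential computation.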


Towards Laws of Visual Attention

Ph.D. Thesis
Author: Dario Zanca
Date: March 2019
Topics: Computational modeling of visual attention; Computer vision; Machine Learning.

Abstract: Visual attention is a crucial process for humans and foveated animals in general. The ability to select relevant locations in the visual field greatly simplifies the problem of vision. It allows a parsimonious management of the […]


A Variational Framework for Laws of Learning

Simplicity and elegance have always been incredibly useful criteria for the development of successful theories that describe natural phenomena. Variational methods frame these parsimony principles as precise mathematical statements. In this thesis we show how learning theories can be formulated using the calculus of variations. Despite the natural way in which learning problems can be formulated […]
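As a purely illustrative sketch of how the calculus of variations can turn a parsimony principle into a precise statement (a generic formulation, not the specific functional developed in the thesis), one can cast learning as minimizing an action over weight trajectories $w(t)$:

```latex
% Illustrative only: a generic variational formulation of learning.
% V is a data-fitting potential; the kinetic term penalizes abrupt changes.
S[w] = \int_{0}^{T} \Big( V\big(w(t), t\big)
       + \tfrac{\lambda}{2}\,\|\dot{w}(t)\|^{2} \Big)\, dt,
% whose stationary trajectories satisfy the Euler--Lagrange equations
\lambda\, \ddot{w}(t) = \nabla_{w} V\big(w(t), t\big).
```

The first term drives the weights toward configurations that fit the data, while the second enforces a parsimonious, smooth evolution; the Euler–Lagrange equations then play the role that update rules play in ordinary learning algorithms.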


On the Integration of Logic and Learning

Giannini’s thesis

A key point in the success of machine learning, and in particular of deep learning, has been the availability of high-performance computing architectures that allow the processing of large amounts of data. However, this requirement potentially prevents a wider application of machine learning in real-world settings, where the collection of training data is often a […]
