Dec 2, 2020 – 11:00 – 11:45 AM
Emotions play an important role in daily life, influencing decision-making, human interaction, perception, attention, and self-regulation. In the early 1970s, the psychologist Paul Ekman defined six universal emotions: anger, disgust, fear, happiness, sadness, and surprise. This categorization has since been adopted in many different studies. In the late 1990s, Affective Computing emerged as a new interdisciplinary field spanning computer science, psychology, and cognitive science. Affective Computing aims to develop intelligent systems able to recognize, interpret, process, and simulate human emotions. It has a wide range of applications, such as healthcare, education, games, entertainment, marketing, automated driver assistance, robotics, and many others. Emotions can be detected through different channels, such as facial expressions, body gestures, speech, text, and physiological signals.
With the great success of deep learning, deep architectures have also been employed for many Affective Computing tasks.
In this thesis, a detailed study of emotions has been carried out using deep learning techniques for various tasks, such as facial expression recognition, text and speech emotion recognition, facial expression generation, and sketch generation. However, deep learning methods generally require great computing power and large collections of labeled data to perform properly. To overcome these limitations, prior knowledge can be injected into the learning problem following the framework of Learning from Constraints. This is achieved by requiring the satisfaction of a set of constraints during the learning process.
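As an illustrative sketch (not the exact formulation used in the thesis), a constraint can be expressed as a differentiable penalty added to the supervised loss, so that violating the prior knowledge increases the training objective. Here the class indices, the mutual-exclusion rule between two emotion classes, and the weight `lam` are assumptions made only for this example:

```python
import numpy as np

def softmax(z):
    # numerically stable softmax over the last axis
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(p, y):
    # mean negative log-likelihood of the true classes
    return -np.log(p[np.arange(len(y)), y]).mean()

def exclusion_penalty(p, i, j):
    # penalize the product p_i * p_j: assigning high probability
    # to both classes at once violates the mutual-exclusion rule
    return (p[:, i] * p[:, j]).mean()

# hypothetical 6-class emotion logits for a small batch
rng = np.random.default_rng(0)
logits = rng.normal(size=(4, 6))
labels = np.array([0, 3, 4, 5])
probs = softmax(logits)

lam = 0.5  # constraint weight, a hyperparameter
loss = cross_entropy(probs, labels) + lam * exclusion_penalty(probs, 3, 4)
```

In a real training loop the penalty would be minimized jointly with the supervised loss by gradient descent; the key point is that the constraint shapes the solution even on unlabeled data, where only the penalty term applies.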
In this work, several constraints have been employed for different purposes.