On the Integration of Logic and Learning

Giannini’s thesis

A key factor in the success of machine learning, and in particular of deep learning, has been the availability of high-performance computing architectures that allow large amounts of data to be processed. However, this reliance on data can prevent a wider adoption of machine learning in real-world applications, where collecting training data is often a slow and expensive process requiring extensive human intervention. This suggests looking for ways to overcome the limitation, for instance by injecting prior knowledge into a learning problem to express desired behaviors of the functions to be learned.
In this thesis, we consider prior knowledge expressed by means of first-order logic formulas that is integrated into a learning problem. First, the formulas are converted into real-valued functions by means of t-norm fuzzy logic operators. Then, a loss component (a constraint) is assigned to each function representing a formula, and all these components are aggregated (e.g. summed) together with any other loss components, such as a regularization term or loss terms associated with supervisions, when these are available for the functions to be learned. Both the functional representation of a formula and its mapping into a loss component are investigated, and some theoretical results are discussed that give insight into how different learning schemata can benefit.
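As a concrete illustration of this translation (our own sketch, not code from the thesis), the snippet below evaluates a simple formula under the Łukasiewicz connectives and turns it into a loss; the predicate values, the mean-based universal quantifier, and the "one minus truth degree" mapping are illustrative choices rather than the thesis's exact construction.

    import numpy as np

    def luk_implies(x, y):
        # Lukasiewicz residuated implication on truth degrees in [0, 1]: min(1, 1 - x + y)
        return np.minimum(1.0, 1.0 - x + y)

    def forall(truth_values):
        # Universal quantifier aggregated as the mean truth degree (one possible choice)
        return np.mean(truth_values)

    # Hypothetical outputs of two learnable predicates a(x), b(x) on a batch of points
    a = np.array([0.9, 0.7, 0.2])
    b = np.array([0.8, 0.4, 0.1])

    # Truth degree of "forall x: a(x) -> b(x)" and the associated loss component
    truth = forall(luk_implies(a, b))
    constraint_loss = 1.0 - truth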
In particular, we define a fragment of Łukasiewicz logic that is guaranteed to yield convex functional constraints for any knowledge base made of first-order logic formulas. The convexity of these constraints is exploited to formulate collective classification as a quadratic optimization problem, and some experimental results are discussed. In addition, we extend classical Support Vector Machines with logical constraints while still preserving a quadratic programming formulation.
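To make the convexity property concrete with an illustrative example of ours (not a formula quoted from the thesis): under the Łukasiewicz implication, a formula such as a(x) -> b(x) receives truth degree min(1, 1 - a(x) + b(x)), so the corresponding loss, one minus this truth degree, equals max(0, a(x) - b(x)), a convex, piecewise-linear function of the predicate outputs, which can then be handled within a quadratic programming formulation.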
Since formulas may logically depend on each other, some of the constraints may turn out to be unnecessary for the learning process. This suggests generalizing the notion of support vector to that of support constraint, and we provide both logical and algebraic criteria to identify the unnecessary constraints. Finally, we present LYRICS, a general interface implemented in TensorFlow that integrates deep learning architectures with a first-order logic representation of knowledge for a learning problem. In particular, we show several learning tasks that can be addressed in LYRICS, with a special discussion of the case of visual generation.
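The following is a minimal, generic TensorFlow sketch of the overall scheme, not the actual LYRICS interface: a supervised loss is combined with a constraint loss obtained from the formula "forall x: a(x) -> b(x)" via the Łukasiewicz implication; the models, data, and weighting factor are all hypothetical.

    import tensorflow as tf

    # Two learnable predicates a(x), b(x) with outputs in [0, 1] (hypothetical models)
    model_a = tf.keras.Sequential([tf.keras.layers.Dense(1, activation="sigmoid")])
    model_b = tf.keras.Sequential([tf.keras.layers.Dense(1, activation="sigmoid")])

    x_unsup = tf.random.normal((32, 4))   # unlabeled batch used by the constraint
    x_sup = tf.random.normal((16, 4))     # labeled batch for predicate a
    y_sup = tf.cast(tf.random.uniform((16, 1)) > 0.5, tf.float32)

    optimizer = tf.keras.optimizers.Adam(1e-3)
    bce = tf.keras.losses.BinaryCrossentropy()

    with tf.GradientTape() as tape:
        a_out, b_out = model_a(x_unsup), model_b(x_unsup)
        # Constraint loss: one minus the mean truth degree of "a(x) -> b(x)"
        implication = tf.minimum(1.0, 1.0 - a_out + b_out)
        constraint_loss = 1.0 - tf.reduce_mean(implication)
        # Supervised loss on the labels available for a
        supervised_loss = bce(y_sup, model_a(x_sup))
        loss = supervised_loss + 0.5 * constraint_loss   # 0.5 is an arbitrary weight

    variables = model_a.trainable_variables + model_b.trainable_variables
    grads = tape.gradient(loss, variables)
    optimizer.apply_gradients(zip(grads, variables))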
