Francesco Giannini (DIISM, University of Siena)
March 29, 2018 – 9:30 AM
DIISM, Artificial Intelligence laboratory (room 201), Siena SI
The success of support vector machines lies in the fact that only a small portion of the data is significant in determining the maximum-margin hyperplane separating two opposite classes of labeled examples: the so-called support vectors. Likewise, in a general learning-from-constraints problem, where several constraints may be enforced, some of them can turn out to be unnecessary for the learning process. In this work, we extend the notion of support vector to that of support constraint, and we provide conditions under which the number of constraints to be enforced can be reduced while still yielding the same optimal solution of the learning problem.
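The classical property the abstract builds on can be seen in a few lines of code: if we refit an SVM on its support vectors alone, we recover the same separating hyperplane. This is a minimal sketch of that fact (using scikit-learn's `SVC` on a toy dataset, not material from the talk itself):

```python
# Toy illustration: only the support vectors determine the
# maximum-margin hyperplane, so refitting on them alone
# reproduces the same classifier.
import numpy as np
from sklearn.svm import SVC

# A small linearly separable dataset with two classes
X = np.array([[0., 0.], [0., 1.], [1., 0.],
              [3., 3.], [3., 4.], [4., 3.]])
y = np.array([-1, -1, -1, 1, 1, 1])

# A large C approximates the hard-margin SVM
clf = SVC(kernel="linear", C=1e6).fit(X, y)
print("support vectors:", len(clf.support_), "of", len(X))

# Retrain on the support vectors only: same hyperplane (up to tolerance)
sv = clf.support_
clf_sv = SVC(kernel="linear", C=1e6).fit(X[sv], y[sv])
print("same hyperplane:", np.allclose(clf.coef_, clf_sv.coef_, atol=1e-4))
```

The talk's contribution is the analogous reduction at the level of constraints rather than data points: identifying which constraints can be discarded without changing the optimum.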