Learning with Hard Constraints

Gori, Marco; Melacci, Stefano
2013-01-01

Abstract

A learning paradigm is proposed in which one has both classical supervised examples and constraints that cannot be violated, called here "hard constraints", such as those enforcing the probabilistic normalization of a density function or imposing coherent decisions among classifiers acting on different views of the same pattern. In contrast, supervised examples can be violated at the cost of some penalization (quantified by the choice of a suitable loss function) and so play the role of "soft constraints". Constrained variational calculus is exploited to derive a representation theorem which provides a description of the "optimal body of the agent", i.e. the functional structure of the solution to the proposed learning problem. It is shown that the solution can be represented in terms of a set of "support constraints", thus extending the well-known notion of "support vectors".
2013
ISBN 978-3-642-40727-7
ISBN 978-3-642-40728-4
Gnecco, G., Gori, M., Melacci, S., Sanguineti, M. (2013). Learning with Hard Constraints. In Artificial Neural Networks and Machine Learning - ICANN 2013 (pp.146-153). Berlin : SPRINGER-VERLAG [10.1007/978-3-642-40728-4_19].
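The hard/soft distinction described in the abstract can be illustrated with a toy sketch (not from the paper, which uses constrained variational calculus): fit a discrete density to noisy supervised targets, where the squared loss on the targets is a soft penalty, while probabilistic normalization (nonnegative weights summing to one) is enforced exactly at every step by projection onto the probability simplex. The names `project_simplex` and `fit_density` are hypothetical, and projected gradient descent stands in for the paper's variational machinery.

```python
import numpy as np

def project_simplex(v):
    # Euclidean projection onto {w : w >= 0, sum(w) = 1}.
    # This is the HARD constraint: it is never violated.
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    idx = np.arange(1, len(v) + 1)
    rho = np.nonzero(u + (1.0 - css) / idx > 0)[0][-1]
    theta = (1.0 - css[rho]) / (rho + 1.0)
    return np.maximum(v + theta, 0.0)

def fit_density(targets, steps=500, lr=0.1):
    # Supervised targets act as a SOFT constraint: deviations are
    # penalized by a squared loss but are allowed.
    w = np.full(len(targets), 1.0 / len(targets))
    for _ in range(steps):
        grad = 2.0 * (w - targets)          # gradient of the soft penalty
        w = project_simplex(w - lr * grad)  # hard constraint restored exactly
    return w

# Noisy supervised targets that do not sum to 1 (sum = 1.1).
targets = np.array([0.5, 0.2, 0.2, 0.2])
w = fit_density(targets)
print(w, w.sum())  # normalized density closest to the targets
```

The solution ends up as close to the supervised targets as the hard normalization constraint allows, mirroring the trade-off the paper formalizes: soft constraints are traded against a loss, hard ones shape the feasible set itself.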
File in this record:
melacci_ICANN2013b.pdf (Adobe PDF, 189.05 kB) — not available; license: NOT PUBLIC, private/restricted access.
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11365/974369