Integrating prior knowledge into deep learning

Diligenti M.; Gori M.
2018-01-01

Abstract

Deep learning allows feature representations and classification models to be developed and trained in a fully integrated way. However, training deep networks is quite hard, and they improve over shallow architectures only when a large amount of training data is available. Injecting prior knowledge into the learner is a principled way to reduce the amount of training data required, as the learner does not need to induce that knowledge from the data itself. In this paper we propose a general and principled way to integrate prior knowledge when training deep networks. Semantic Based Regularization (SBR) is used as the underlying framework to represent the prior knowledge, expressed as a collection of first-order logic (FOL) clauses, where each task to be learned corresponds to a predicate in the knowledge base. The knowledge base correlates the tasks to be learned and is translated into a set of constraints that are integrated into the learning process via backpropagation. The experimental results show how integrating the prior knowledge boosts the accuracy of a state-of-the-art deep network on an image classification task.
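To make the idea of translating FOL clauses into trainable constraints concrete, the sketch below shows one possible instantiation under assumptions that are not taken from the paper: a PyTorch setup, two invented predicates cat(x) and animal(x) sharing a small network, a Łukasiewicz relaxation of the clause forall x: cat(x) -> animal(x), toy random data, and an illustrative constraint weight lam. It is a minimal illustration of the general SBR recipe (supervised loss plus a differentiable penalty for violated clauses, optimized jointly by backpropagation), not the authors' implementation.

```python
import torch
import torch.nn as nn

# Hypothetical two-predicate network: a shared backbone with one output head
# per predicate (cat(x) and animal(x)); illustrative only, not the paper's model.
class PredicateNet(nn.Module):
    def __init__(self, in_dim=128):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU())
        self.cat = nn.Linear(64, 1)      # predicate cat(x)
        self.animal = nn.Linear(64, 1)   # predicate animal(x)

    def forward(self, x):
        h = self.backbone(x)
        return torch.sigmoid(self.cat(h)), torch.sigmoid(self.animal(h))

def clause_loss(p_cat, p_animal):
    """Fuzzy relaxation of the FOL clause: forall x, cat(x) -> animal(x).

    With the Lukasiewicz implication t(a -> b) = min(1, 1 - a + b), the
    penalty 1 - t is zero when the clause is satisfied and grows with the
    degree of violation; it is averaged over an (unlabeled) batch.
    """
    implication = torch.clamp(1.0 - p_cat + p_animal, max=1.0)
    return (1.0 - implication).mean()

model = PredicateNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
bce = nn.BCELoss()

x_sup = torch.randn(32, 128)                      # labeled batch (toy data)
y_cat = torch.randint(0, 2, (32, 1)).float()
y_animal = torch.randint(0, 2, (32, 1)).float()
x_unsup = torch.randn(64, 128)                    # unlabeled batch, used only by the constraint
lam = 0.5                                         # illustrative constraint weight

optimizer.zero_grad()
p_cat, p_animal = model(x_sup)
supervised = bce(p_cat, y_cat) + bce(p_animal, y_animal)
u_cat, u_animal = model(x_unsup)
loss = supervised + lam * clause_loss(u_cat, u_animal)
loss.backward()                                   # constraint gradients flow through backprop
optimizer.step()
```

In this sketch the clause acts exactly like a regularizer: it can be evaluated on unlabeled examples as well, which is why the constraint term is computed on a separate unsupervised batch.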
2018
978-153861417-4
Diligenti, M., Roychowdhury, S., Gori, M. (2018). Integrating prior knowledge into deep learning. In Proceedings of the 16th IEEE International Conference on Machine Learning and Applications (ICMLA 2017) (pp. 920-923). New York: Institute of Electrical and Electronics Engineers Inc. [10.1109/ICMLA.2017.00-37].
Files in this item:
There are no files associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11365/1082459