Betti, A., Gori, M., Marullo, S., Melacci, S. (2020). Developing constrained neural units over time. In Proceedings of the International Joint Conference on Neural Networks. New York : Institute of Electrical and Electronics Engineers Inc. [10.1109/IJCNN48605.2020.9207028].

Developing constrained neural units over time

Marco Gori; Stefano Melacci
2020-01-01

Abstract

In this paper we present a foundational study on a constrained method that defines learning problems with Neural Networks in the context of the principle of Least Cognitive Action, which closely resembles the Least Action Principle in mechanics. Starting from a general approach to enforcing constraints in the dynamical laws of learning, this work focuses on an alternative way of defining Neural Networks that differs from the majority of existing approaches. In particular, the structure of the neural architecture is defined by means of a special class of constraints that also extend to the interaction with data, leading to "architectural" and "input-related" constraints, respectively. The proposed theory is cast in the time domain, in which data are presented to the network in an ordered manner, making this study an important step toward alternative ways of processing continuous streams of data with Neural Networks. The connection with the classic Backpropagation-based update rule for the network weights is discussed, showing that there are conditions under which our approach degenerates to Backpropagation. Moreover, the theory is experimentally evaluated on a simple problem that allows us to study several aspects of the theory in depth and to show the soundness of the model.
2020
978-1-7281-6926-2
Files in this item:

File: melacci_marullo_IJCNN2020.pdf
Availability: not available
Type: Publisher's PDF
License: NOT PUBLIC - Private/restricted access
Size: 1.5 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11365/1106285