
Cognitive Action Laws: The Case of Visual Features

Betti, Alessandro; Gori, Marco; Melacci, Stefano
2020-01-01

Abstract

This paper proposes a theory for understanding perceptual learning processes within the general framework of the laws of nature. Artificial neural networks are regarded as systems whose connections are Lagrangian variables, namely, functions of time. They are used to minimize the cognitive action, an appropriate functional index that measures the agent's interactions with the environment. The cognitive action contains a potential and a kinetic term that closely resemble the classic formulation of regularization in machine learning. A special choice of the functional index, which leads to fourth-order differential equations--the Cognitive Action Laws (CAL)--exhibits a structure that mirrors the classic formulation of machine learning. In particular, unlike the action of mechanics, the stationarity condition coincides with the global minimum. Moreover, it is proven that typical asymptotic learning conditions on the weights can coexist with the initialization, provided that the system dynamics is driven by a policy referred to as information overloading control. Finally, the theory is experimentally evaluated on the problem of feature extraction in computer vision.
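The abstract's claim that the variational formulation yields fourth-order differential equations follows from a kinetic term involving second time derivatives of the weights. The sketch below is a generic illustration of this mechanism, not the paper's actual cognitive action: the coefficients α, β and the potential V are hypothetical stand-ins.

```latex
% Generic action over a weight trajectory w(t): a kinetic term with first
% and second derivatives plus a potential (data-dependent) term V:
S[w] = \int_0^T \Big( \tfrac{\alpha}{2}\,\ddot{w}(t)^2
                    + \tfrac{\beta}{2}\,\dot{w}(t)^2
                    + V\big(w(t), t\big) \Big)\, dt

% Euler--Lagrange condition for a Lagrangian L(w, \dot{w}, \ddot{w}):
\frac{\partial L}{\partial w}
  - \frac{d}{dt}\frac{\partial L}{\partial \dot{w}}
  + \frac{d^2}{dt^2}\frac{\partial L}{\partial \ddot{w}} = 0

% Because L depends on \ddot{w}, the last term contributes two extra
% time derivatives, giving a fourth-order ODE in w:
V_w\big(w(t), t\big) - \beta\,\ddot{w}(t) + \alpha\, w^{(4)}(t) = 0
```

When α = 0 the equation drops to second order, recovering the familiar mechanics-like case; the second-derivative kinetic term is what raises the order to four.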
Betti, A., Gori, M., Melacci, S. (2020). Cognitive Action Laws: The Case of Visual Features. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 31(3), 938-949 [10.1109/TNNLS.2019.2911174].
Files in this record:
File: melacci_TNNLS2020.pdf (not available for download)
Type: publisher's PDF
License: NON-PUBLIC - private/restricted access
Size: 2.47 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11365/1082499