Betti, A., Gori, M. (2016). The principle of least cognitive action. THEORETICAL COMPUTER SCIENCE, 633, 83-99 [10.1016/j.tcs.2015.06.042].

The principle of least cognitive action

Betti, Alessandro; Gori, Marco
2016-01-01

Abstract

By and large, the interpretation of learning as a computational process taking place in both humans and machines is primarily provided within the framework of statistics. In this paper, we propose a radically different perspective, in which the emergence of learning is regarded as the outcome of laws of nature that govern the interactions of intelligent agents with their environment. We introduce a natural learning theory based on the principle of least cognitive action, which is inspired by the related mechanical principle and by the Hamiltonian framework for modeling the motion of particles. The introduction of kinetic and potential energy leads to a surprisingly natural interpretation of learning as a dissipative process. The kinetic energy reflects the temporal variation of the synaptic connections, while the potential energy is a penalty term that describes the degree of satisfaction of the environmental constraints. The theory gives a picture of learning in terms of energy-balancing mechanisms, in which the novel notions of boundary and bartering energies are introduced. Finally, as an example of application of the theory, we show that the supervised machine learning scheme can be framed within the proposed theory; in particular, we show that the Euler-Lagrange differential equations of learning collapse to the classic gradient algorithm on the supervised pairs.
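The collapse of the Euler-Lagrange equations onto the gradient algorithm can be illustrated with a standard dissipative variational argument; this is a hedged sketch, not the paper's own derivation, and the symbols $w$ (weights), $V$ (potential/penalty), and $\theta$ (dissipation coefficient) are introduced here purely for illustration.

```latex
% Illustrative cognitive action with an exponential dissipation factor:
% the synaptic weights w(t) play the role of generalized coordinates.
\[
  \mathcal{A}[w] \;=\; \int_{0}^{T} e^{\theta t}
  \left( \tfrac{1}{2}\,\|\dot{w}(t)\|^{2} \;-\; V\!\big(w(t)\big) \right) dt .
\]
% Stationarity of the action yields the Euler-Lagrange equation
\[
  \ddot{w}(t) + \theta\,\dot{w}(t) \;=\; -\nabla V\!\big(w(t)\big).
\]
% In the strongly dissipative regime (large theta) the inertial term
% becomes negligible, leaving the gradient flow
\[
  \dot{w}(t) \;\approx\; -\tfrac{1}{\theta}\,\nabla V\!\big(w(t)\big),
\]
% whose forward-Euler discretization with step size Delta t is the
% classic gradient update on the supervised pairs:
\[
  w_{k+1} \;=\; w_{k} - \eta\,\nabla V(w_{k}), \qquad \eta = \tfrac{\Delta t}{\theta}.
\]
```

Under these assumptions, the gradient algorithm appears as the heavily damped limit of the second-order learning dynamics.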
2016
Files in this record:

File: 1-s2.0-S0304397515005526-main.pdf
Access: open access
Type: publisher's PDF
License: Creative Commons
Size: 416.93 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11365/983443