
Di Marco, M., Forti, M., Tesi, A. (2003). On absolute stability of convergence for nonlinear neural network models. In New Trends in Nonlinear Dynamics and Control, and Their Applications (pp. 209-220). Berlin: Springer Verlag.

On absolute stability of convergence for nonlinear neural network models

Di Marco, Mauro; Forti, Mauro; Tesi, A.
2003-01-01

Abstract

This paper deals with a class of large-scale nonlinear dynamical systems, namely the additive neural networks. It is well known that convergence of neural network trajectories towards equilibrium points is a fundamental dynamical property, especially in view of the increasing number of applications involving the solution of signal processing tasks in real time. In particular, an additive neural network is said to be absolutely stable if it is convergent for all parameters and all nonlinear functions belonging to some specified, well-characterized sets, including situations where the network possesses infinitely many non-isolated equilibrium points. The main result in this paper is that additive neural networks enjoy the property of absolute stability of convergence within the set of diagonal self-inhibition matrices, the set of symmetric neuron interconnection matrices, and the set of sigmoidal piecewise-analytic neuron activations. The result is proved by generalizing a method for neural network convergence introduced in a recent paper, which is based on showing that the length of each forward trajectory of the neural network is finite. The advantages of the result in this paper over previous ones on neural network convergence established by means of the LaSalle approach are discussed.
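As a rough illustration of the system class discussed in the abstract, the sketch below simulates one trajectory of a standard additive (Hopfield-type) network, dx/dt = -Dx + Ws(x) + I, with diagonal self-inhibition D, symmetric interconnection matrix W, and a sigmoidal activation. This model form and the forward-Euler integration are assumptions for illustration only; they are not taken from the paper, whose result concerns convergence for all parameters in these sets, not any particular simulation.

```python
import numpy as np

def simulate(W, D, I, x0, dt=0.01, steps=20000):
    """Forward-Euler integration of dx/dt = -D x + W s(x) + I,
    with s = tanh as a sigmoidal, piecewise-analytic activation."""
    s = np.tanh
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x = x + dt * (-D @ x + W @ s(x) + I)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
W = (A + A.T) / 2          # symmetric neuron interconnections
D = np.eye(3)              # diagonal self-inhibition matrix
I = np.zeros(3)            # constant external inputs

x_final = simulate(W, D, I, x0=rng.standard_normal(3))

# Near an equilibrium point, the vector field should be close to zero:
residual = np.linalg.norm(-D @ x_final + W @ np.tanh(x_final) + I)
```

With symmetric W the trajectory settles towards an equilibrium, so the residual norm of the vector field at the final state becomes small; this merely visualizes the convergence property that the paper proves rigorously for the whole parameter class.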
2003
ISBN: 3540404740
Files for this record:
There are no files associated with this record.

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11365/25363
Warning! The displayed data have not been validated by the university.