
Forti, M., Nistri, P. (2003). Global convergence of neural networks with discontinuous neuron activations. IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS I. FUNDAMENTAL THEORY AND APPLICATIONS, 50(11), 1421-1435 [10.1109/TCSI.2003.818614].

Global convergence of neural networks with discontinuous neuron activations

Forti M.; Nistri P.
2003-01-01

Abstract

The paper introduces a general class of neural networks where the neuron activations are modeled by discontinuous functions. The neural networks have an additive interconnecting structure and include as particular cases the Hopfield neural networks (HNNs) and the standard cellular neural networks (CNNs) in the limiting situation where the HNNs and CNNs possess neurons with infinite gain. Conditions are derived which ensure the existence of a unique equilibrium point and a unique output equilibrium point, which are globally attractive for the state and the output trajectories of the neural network, respectively. These conditions, which are applicable to general nonsymmetric neural networks, are based on the concept of Lyapunov diagonally stable neuron interconnection matrices, and they can be thought of as a generalization to the discontinuous case of previous results established for neural networks possessing smooth neuron activations. Moreover, by suitably exploiting the presence of sliding modes, entirely new conditions are obtained which ensure global convergence in finite time, where the convergence time can be easily estimated on the basis of the relevant neural-network parameters. The analysis in the paper employs results from the theory of differential equations with discontinuous right-hand side, as introduced by Filippov. In particular, global convergence is addressed by using a Lyapunov-like approach based on the concept of monotone trajectories of a differential inclusion.
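The additive model discussed in the abstract can be illustrated numerically. The sketch below (with hypothetical parameters, not taken from the paper) simulates a two-neuron network dx/dt = -x + T g(x) + I with the discontinuous activation g = sign, i.e., the infinite-gain limit of a Hopfield network. The interconnection matrix T is chosen symmetric negative definite, so -T is Lyapunov diagonally stable and the paper's conditions predict a unique globally attractive equilibrium; a simple forward-Euler integration is used as a crude approximation of the Filippov solution.

```python
import numpy as np

# Hypothetical 2-neuron additive network: dx/dt = -x + T g(x) + I,
# with discontinuous activation g = sign (infinite-gain limit).
T = np.array([[-0.5, 0.2],
              [0.2, -0.5]])   # symmetric negative definite, so -T is
                              # Lyapunov diagonally stable
I = np.array([1.0, 1.0])      # constant input vector

def g(x):
    """Discontinuous neuron activation (sign nonlinearity)."""
    return np.sign(x)

def simulate(x0, dt=0.01, steps=4000):
    """Crude forward-Euler integration of the discontinuous dynamics."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x += dt * (-x + T @ g(x) + I)
    return x

# If the equilibrium output is g(x*) = (1, 1), the state equilibrium is
# x* = T @ (1, 1) + I = (0.7, 0.7), which is consistent since x* > 0.
x_final = simulate([-2.0, 3.0])
```

In this example the trajectory crosses the discontinuity surface {x1 = 0} once and then settles at the unique equilibrium; with parameters placing the equilibrium on a discontinuity surface, sliding modes of the kind exploited in the paper would appear instead.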
Files in this record: n.76.pdf — Post-print, Adobe PDF, 1.13 MB. License: NOT PUBLIC - private/restricted access (copy available on request).

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11365/2619