Papini, D., Taddei, V. (2005). Global exponential stability of the periodic solution of a delayed neural network with discontinuous activations. Physics Letters A, 343(1-3), 117-128 [10.1016/j.physleta.2005.06.015].
Global exponential stability of the periodic solution of a delayed neural network with discontinuous activations
PAPINI, DUCCIO; TADDEI, V.
2005-01-01
Abstract
We study the stability of a delayed Hopfield neural network with periodic coefficients and inputs and an arbitrary constant delay. We consider non-decreasing activation functions which may have jump discontinuities, in order to model the ideal situation in which the gain of the neuron amplifiers is very high and tends to infinity. In particular, we drop the assumption of Lipschitz continuity on the activation functions, which most papers in the literature require. Under suitable assumptions on the interconnection matrices, we prove that the delayed neural network has a unique periodic solution that is globally exponentially stable, independently of the size of the delay. The assumptions we exploit come from the theory of M-matrices and are easy to check. Because of the possible discontinuities of the activation functions, the convergence of the output of the neural network is also studied by means of a suitable notion of limit. The existence, uniqueness and continuability of the solutions of suitable initial value problems are also proved.
File | Type | License | Size | Format
---|---|---|---|---
37902_UPLOAD.pdf (not available) | Post-print | NOT PUBLIC - Private/restricted access | 169.74 kB | Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/11365/22994