Global exponential stability and global convergence in finite time of delayed neural networks with infinite gain

Forti, Mauro; Nistri, Paolo; Papini, Duccio
2005-01-01

Abstract

This paper introduces a general class of neural networks with arbitrary constant delays in the neuron interconnections, and neuron activations belonging to the set of discontinuous monotone increasing and (possibly) unbounded functions. The discontinuities in the activations are an ideal model of the situation where the gain of the neuron amplifiers is very high and tends to infinity, while the delay accounts for the finite switching speed of the neuron amplifiers, or the finite signal propagation speed. It is known that the delay in combination with high-gain nonlinearities is a particularly harmful source of potential instability. The goal of this paper is to single out a subclass of the considered discontinuous neural networks for which stability is instead insensitive to the presence of a delay. More precisely, conditions are given under which there is a unique equilibrium point of the neural network, which is globally exponentially stable for the states, with a known convergence rate. The conditions are easily testable and independent of the delay. Moreover, global convergence in finite time of the state and output is investigated. In doing so, new interesting dynamical phenomena are highlighted with respect to the case without delay, which make the study of convergence in finite time significantly more difficult. The obtained results extend previous work on global stability of delayed neural networks with Lipschitz continuous neuron activations, and neural networks with discontinuous neuron activations but without delays.
Forti, M., Nistri, P., Papini, D. (2005). Global exponential stability and global convergence in finite time of delayed neural networks with infinite gain. IEEE TRANSACTIONS ON NEURAL NETWORKS, 16(6), 1449-1463 [10.1109/TNN.2005.852862].
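
The abstract does not state the model equations, but papers in this line of work typically study delayed Hopfield-type systems with sign-like (infinite-gain) activations. The sketch below is an illustrative assumption, not the paper's formulation: a scalar system x'(t) = -d·x(t) + a·g(x(t)) + b·g(x(t-τ)) + I with g = sign, simulated by forward Euler with a history buffer, under a dominance condition d > |a| + |b| of the delay-independent kind the abstract describes.

```python
import numpy as np

# Hypothetical scalar instance of a delayed neural network with a
# discontinuous activation (the paper treats the general n-dimensional
# case; the parameters below are illustrative assumptions):
#   x'(t) = -d*x(t) + a*g(x(t)) + b*g(x(t - tau)) + I,  with g = sign.
d, a, b, I, tau = 1.0, -0.5, 0.3, 1.2, 1.0

def g(x):
    # Discontinuous, monotone nondecreasing, infinite-gain activation.
    return np.sign(x)

dt = 0.01
steps = int(20.0 / dt)
delay_steps = int(tau / dt)
hist = [-2.0] * (delay_steps + 1)  # constant initial history on [-tau, 0]

for _ in range(steps):
    x = hist[-1]
    x_delayed = hist[-1 - delay_steps]
    hist.append(x + dt * (-d * x + a * g(x) + b * g(x_delayed) + I))

# With d > |a| + |b|, the state converges to the unique equilibrium
# x* = (a + b + I) / d = 1.0, regardless of the value of the delay tau.
print(round(hist[-1], 3))
```

Changing `tau` does not alter the limit, only the transient, which mirrors the delay-independence of the stability conditions claimed in the abstract; with |b| large relative to d the same scheme can instead oscillate, illustrating why delay plus high gain is a source of instability.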
Files in this record:
37900_UPLOAD.pdf — Post-print, 548.5 kB, Adobe PDF. Restricted access (copy available on request).

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11365/23650