Generalized Lyapunov approach for convergence of neural networks with discontinuous or non-Lipschitz activations

Forti, Mauro; Nistri, Paolo; Pancioni, Luca
2006-01-01

Abstract

The paper considers a class of additive neural networks where the neuron activations are modeled by discontinuous functions or by continuous non-Lipschitz functions. Tools are developed that enable a Lyapunov-like approach to differential equations with discontinuous right-hand side modeling the neural network dynamics. The tools include a chain rule for computing the time derivative, along the neural network solutions, of a nondifferentiable Lyapunov function, and a comparison principle for this time derivative, which yields conditions for exponential convergence or convergence in finite time. By means of the Lyapunov-like approach, a general result is proved on global exponential convergence of the neural network solutions toward a unique equilibrium point. Moreover, new results on global convergence in finite time are established, which are applicable to neuron activations with jump discontinuities, or neuron activations modeled by continuous (non-Lipschitz) Hölder functions.
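To make the setting concrete, the following is a minimal numerical sketch (not taken from the paper) of the standard additive model x' = -Dx + Tg(x) + I with the jump-discontinuous activation g = sign. The parameter values D, T, I below are purely illustrative; they are chosen so the sign term drives the state to the equilibrium x* = 0 in finite time, the kind of behavior the paper's finite-time convergence results address.

```python
import numpy as np

def simulate(x0, D, T, I, dt=1e-3, t_end=5.0):
    """Forward-Euler integration of the additive network
    x' = -D x + T g(x) + I with the discontinuous activation g = sign."""
    x = np.array(x0, dtype=float)
    traj = [x.copy()]
    for _ in range(int(t_end / dt)):
        x = x + dt * (-D @ x + T @ np.sign(x) + I)
        traj.append(x.copy())
    return np.array(traj)

# Illustrative parameters: each neuron obeys x_i' = -x_i - sign(x_i),
# so |x_i| decreases at rate at least 1 and hits zero in finite time.
D = np.eye(2)
T = -np.eye(2)
I = np.zeros(2)
traj = simulate([1.5, -0.8], D, T, I)
# After convergence, forward-Euler chattering around 0 stays within O(dt).
print(np.abs(traj[-1]).max())
```

With an exact solver the state reaches zero at a finite time (here about ln 2.5 for the first component); the discrete simulation instead chatters around the equilibrium with amplitude on the order of the step size, a well-known artifact of integrating discontinuous right-hand sides with a fixed step.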
Forti, M., Grazzini, M., Nistri, P., Pancioni, L. (2006). Generalized Lyapunov approach for convergence of neural networks with discontinuous or non-Lipschitz activations. Physica D: Nonlinear Phenomena, 214, 88-99. doi:10.1016/j.physd.2005.12.006
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11365/26473