Forti, M., Grazzini, M., Nistri, P., Pancioni, L. (2006). A Result on Global Convergence in Finite Time for Nonsmooth Neural Networks. In IEEE International Symposium on Circuits and Systems, 2006 (ISCAS 2006), pp. 759-762. New York: IEEE. doi:10.1109/ISCAS.2006.1692696.
A Result on Global Convergence in Finite Time for Nonsmooth Neural Networks
Forti M.; Nistri P.; Pancioni L.
2006-01-01
Abstract
The paper considers a large class of additive neural networks where the neuron activations are modeled by discontinuous functions or by non-Lipschitz functions. A result is established guaranteeing that the state solutions and output solutions of the neural network are globally convergent in finite time toward a unique equilibrium point. The obtained result, which generalizes previous results on convergence in finite time in the literature, is of interest for designing neural networks aimed at solving global optimization problems in real time.
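To make the class of networks concrete, the minimal numerical sketch below simulates an additive network of the usual form dx/dt = -Dx + T g(x) + I with a discontinuous hard-limiter activation g(x) = sign(x). All parameter values (D, T, I), the symmetric choice of T, and the integration step are illustrative assumptions, not taken from the paper, and the specific conditions the paper imposes to guarantee finite-time convergence are not reproduced here.

```python
import numpy as np

# Illustrative additive neural network with a discontinuous activation:
#   dx/dt = -D x + T g(x) + I,   g(x) = sign(x)  (hard limiter)
# All numerical values below are assumptions chosen for demonstration only.

D = np.diag([1.0, 1.0, 1.0])                 # positive self-inhibition rates
T = np.array([[-2.0, 0.5, 0.2],
              [ 0.5, -2.0, 0.3],
              [ 0.2,  0.3, -2.0]])           # assumed symmetric interconnection matrix
I = np.array([0.4, -0.6, 0.1])               # constant external inputs

def g(x):
    """Discontinuous neuron activation: hard limiter (sign)."""
    return np.sign(x)

x = np.array([2.0, -1.5, 1.0])               # arbitrary initial state
dt = 1e-3
for _ in range(20000):                       # simulate 20 s of network time
    x = x + dt * (-D @ x + T @ g(x) + I)     # forward-Euler integration

print("state after 20 s:", x)
print("output g(x):", g(x))
```

Because the right-hand side is discontinuous, a fixed-step explicit scheme like the one above may chatter in a small neighborhood of the equilibrium; a smaller step or an event-aware integrator reduces this artifact but does not change the qualitative convergence behavior.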
| File | Type | License | Size | Format | Availability |
|---|---|---|---|---|---|
| 442110-U-GOV.pdf | Post-print | Not public (private/restricted access) | 381.17 kB | Adobe PDF | Not available; copy on request |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/11365/19023