Di Marco, M., Forti, M., Grazzini, M., & Pancioni, L. (2008). Extended LaSalle's invariance principle for full-range cellular neural networks. In Proceedings of the IEEE 11th International Workshop on Cellular Neural Networks and Their Applications (CNNA 2008) (pp. 46-51). https://doi.org/10.1109/CNNA.2008.4588648

Extended LaSalle’s invariance principle for full-range cellular neural networks

Di Marco, Mauro; Forti, Mauro; Pancioni, Luca
2008

Abstract

The paper develops a Lyapunov method, based on a generalized version of LaSalle's invariance principle, for studying convergence and stability of the differential inclusions that model the dynamics of the full-range (FR) model of cellular neural networks (CNNs). The method is applied to yield a rigorous proof of convergence for symmetric FR-CNNs. The proof, which is a direct consequence of the fact that a symmetric FR-CNN admits a strict Lyapunov function, is much simpler than the corresponding proof of convergence for symmetric standard CNNs.
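
For readers unfamiliar with the model, a commonly used formulation of the FR-CNN dynamics in the related literature is the constrained differential inclusion sketched below. The notation is an assumption for illustration (interconnection matrix T, bias vector I, normal cone N_K to the hypercube K = [-1,1]^n); the exact statement and Lyapunov function in the paper may differ.

\[
\dot{x}(t) \in -x(t) + T\,x(t) + I - N_K\big(x(t)\big), \qquad K = [-1,1]^n,
\]
where $N_K(x)$ denotes the normal cone to $K$ at $x$, so solutions are forced to remain in the hypercube. In the symmetric case $T = T^{\top}$, a candidate energy of the kind typically employed for symmetric CNNs is the quadratic function
\[
V(x) = -\tfrac{1}{2}\, x^{\top} T x - I^{\top} x,
\]
whose monotone decrease along solutions is the property that an extended invariance principle for differential inclusions can exploit to conclude convergence.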
ISBN: 9781424420902
Files in this record:
There are no files associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: http://hdl.handle.net/11365/2579