
Optimal convergence of on-line backpropagation

Gori, Marco; Maggini, Marco
1996-01-01

Abstract

Many researchers are quite skeptical about the actual behavior of neural network learning algorithms like backpropagation. One of the major problems is the lack of clear theoretical results on optimal convergence, particularly for pattern mode algorithms. In this paper, we prove the companion of Rosenblatt's PC (perceptron convergence) theorem (1960) for feedforward networks, stating that pattern mode backpropagation converges to an optimal solution for linearly separable patterns.
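To make the setting concrete, the following is a minimal sketch of pattern mode (on-line) training of a small feedforward network on a linearly separable toy problem. The network shape, learning rate, error function, and stopping rule are illustrative assumptions, not taken from the paper.

# A minimal sketch of pattern mode (on-line) backpropagation on a
# linearly separable toy problem. All hyperparameters here (network
# size, learning rate, epoch count) are illustrative assumptions,
# not the construction used by Gori & Maggini (1996).
import numpy as np

rng = np.random.default_rng(0)

# Linearly separable 2-D patterns: class given by the sign of x0 + x1.
X = rng.uniform(-1.0, 1.0, size=(100, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# One hidden layer of sigmoid units, small random initial weights.
W1 = rng.normal(scale=0.5, size=(2, 4))
b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=4)
b2 = 0.0
lr = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(200):
    # Pattern mode: weights are updated after every single pattern,
    # rather than once per full sweep as in batch mode.
    for x, t in zip(X, y):
        h = sigmoid(x @ W1 + b1)              # hidden activations
        o = sigmoid(h @ W2 + b2)              # network output
        delta_o = (o - t) * o * (1 - o)       # output error term (squared error)
        delta_h = delta_o * W2 * h * (1 - h)  # error backpropagated to hidden layer
        W2 -= lr * delta_o * h
        b2 -= lr * delta_o
        W1 -= lr * np.outer(x, delta_h)
        b1 -= lr * delta_h

preds = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5
print("training accuracy:", (preds == y.astype(bool)).mean())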
Gori, M., Maggini, M. (1996). Optimal convergence of on-line backpropagation. IEEE Transactions on Neural Networks, 7(1), 251-254. doi:10.1109/72.478415
Files in this product:

TNN96.pdf
Type: Post-print
License: NOT PUBLIC - Private/restricted access
Size: 421.02 kB
Format: Adobe PDF
Availability: not available; a copy can be requested

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11365/7633
Warning: the displayed data have not been validated by the university.