Gori, M., & Tesi, A. (1992). On the problem of local minima in backpropagation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 14(1), 76-86. https://doi.org/10.1109/34.107014
On the problem of local minima in backpropagation
Gori M.; Tesi A.
1992-01-01
Abstract
Supervised learning in multilayered neural networks (MLNs) has recently been approached through the well-known backpropagation (BP) algorithm. This is a gradient method that can get stuck in local minima, as simple examples show. In this paper, some conditions on the network architecture and the learning environment that ensure the convergence of the BP algorithm are proposed. In particular, it is proven that convergence holds if the classes are linearly separable. In this case, the experience gained in several experiments shows that MLNs exceed perceptrons in generalization to new examples.
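The abstract's central claim can be illustrated with plain gradient-descent backpropagation on a small, linearly separable problem. The following is a minimal sketch, not the authors' construction: the network size, dataset, learning rate, and number of epochs are illustrative assumptions. It trains a one-hidden-layer sigmoid network with batch gradient descent on a quadratic cost and reports the final loss and accuracy, which should approach zero error on a linearly separable training set.

```python
# Minimal backpropagation sketch (illustrative assumptions, not the paper's setup):
# one hidden layer, sigmoid units, quadratic cost, batch gradient descent.
import numpy as np

rng = np.random.default_rng(0)

# Linearly separable training set: label is 1 when x1 + x2 > 0.
X = rng.uniform(-1.0, 1.0, size=(40, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Weights: 2 inputs -> 2 hidden units -> 1 output unit.
W1 = rng.normal(scale=0.5, size=(2, 2)); b1 = np.zeros(2)
W2 = rng.normal(scale=0.5, size=(2, 1)); b2 = np.zeros(1)

lr = 1.0  # learning rate (illustrative choice)
for epoch in range(2000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)           # hidden activations
    out = sigmoid(h @ W2 + b2)         # network output
    err = out - y
    loss = 0.5 * np.mean(err ** 2)     # quadratic cost

    # Backward pass: deltas and gradient-descent updates.
    d_out = err * out * (1.0 - out)            # delta at the output layer
    d_h = (d_out @ W2.T) * h * (1.0 - h)       # delta at the hidden layer
    W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X);   b1 -= lr * d_h.mean(axis=0)

print(f"final loss: {loss:.4f}")
print("training accuracy:", np.mean((out > 0.5) == (y > 0.5)))
```

On this separable data the cost typically decreases monotonically and the network classifies the training set correctly, which is the behavior the paper's convergence conditions are meant to guarantee; on non-separable or poorly structured problems the same procedure can stall in a local minimum.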
https://hdl.handle.net/11365/34977