Bianchini, M., Gori, M., Maggini, M., et al. (1995). A Heuristic Global Optimisation Algorithm for Training Recurrent Neural Networks. In Proceedings of ICANN 1995 (pp. 601-605).
A Heuristic Global Optimisation Algorithm for Training Recurrent Neural Networks
Bianchini, Monica; Gori, Marco; Maggini, Marco
1995-01-01
Abstract
Recurrent neural networks are an effective tool for automatic speech recognition. Moreover, for isolated word recognition with locally recurrent architectures, the Reduced Gradient (RG) algorithm is known to outperform Backpropagation. RG guarantees convergence to a Kuhn-Tucker point which, however, is in general neither the global minimum nor a satisfactory sub-optimal solution. In this paper we experiment with a new heuristic method based on the combined use of RG and a deterministic global algorithm for unconstrained optimisation. Preliminary numerical experiments with this method have shown very promising results.
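The abstract describes alternating a local optimiser (RG) with a deterministic global search to escape unsatisfactory Kuhn-Tucker points. As an illustrative sketch only (the paper's actual algorithm is not reproduced here; the restart scheme, function names, and the use of plain gradient descent in place of RG are all assumptions), a deterministic multistart strategy might look like:

```python
import numpy as np

def local_gradient_descent(f, grad, w0, lr=0.1, steps=200):
    """Plain gradient descent, standing in for the RG local optimiser."""
    w = w0.astype(float).copy()
    for _ in range(steps):
        w -= lr * grad(w)
    return w, f(w)

def heuristic_global_search(f, grad, bounds, n_starts=5):
    """Illustrative deterministic multistart (NOT the paper's method):
    restart the local optimiser from an evenly spaced grid of points
    and keep the best stationary point found."""
    lo, hi = bounds
    best_w, best_val = None, np.inf
    for start in np.linspace(lo, hi, n_starts):  # deterministic restarts
        w0 = np.array([start, start])
        w, val = local_gradient_descent(f, grad, w0)
        if val < best_val:
            best_w, best_val = w, val
    return best_w, best_val

# Toy multimodal objective with global minima at w_i = +/-1;
# a single local run started at the origin would stall at a stationary point.
f = lambda w: np.sum((w**2 - 1.0)**2)
grad = lambda w: 4.0 * w * (w**2 - 1.0)
w_star, val = heuristic_global_search(f, grad, bounds=(-2.0, 2.0))
```

The grid of restart points makes the search deterministic and reproducible, unlike random multistart; each restart can at worst rediscover the same Kuhn-Tucker point, and at best find a better basin.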
File: ICANN95.pdf (Post-print; licence: restricted access; 450.13 kB; Adobe PDF; copy available on request)
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/11365/18133
Note: the data shown have not been validated by the university.