Inductive inference from noisy examples using the hybrid finite state filter

Gori, Marco; Maggini, Marco; Martinelli, Enrico; Soda, G.
1998-01-01

Abstract

Recurrent neural networks processing symbolic strings can be regarded as adaptive neural parsers. Given a set of positive and negative examples drawn from a given language, adaptive neural parsers can effectively be trained to infer the language's grammar. In this paper, we use adaptive neural parsers to address the problem of inferring grammars from examples corrupted by a kind of noise that simply changes their membership. We propose a training algorithm, referred to as the hybrid finite state filter (HFF), which is based on a parsimony principle that penalizes the development of complex rules. We report very promising experimental results showing that the proposed inductive inference scheme is indeed capable of capturing the rules while removing the noise.
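To make the setting concrete, the sketch below shows a generic recurrent acceptor trained on strings whose membership labels have been randomly flipped, with an L2 weight penalty standing in for a parsimony term. This is not the HFF algorithm described in the paper; the toy language, the noise rate, and all names are assumptions chosen only to illustrate the problem of learning a grammar from membership-noisy examples.

```python
# Illustrative sketch only: NOT the paper's HFF algorithm. A plain Elman-style
# recurrent acceptor is trained on membership-noisy strings; weight decay is a
# crude stand-in for the parsimony principle that penalizes complex rules.
import random
import torch
import torch.nn as nn

ALPHABET = "ab"                       # assumed two-symbol alphabet
random.seed(0); torch.manual_seed(0)

def in_language(s):
    # Toy target language (assumption): strings with an even number of 'a's.
    return s.count("a") % 2 == 0

def make_dataset(n=400, max_len=10, flip_prob=0.1):
    """Positive/negative strings whose membership label is flipped with prob flip_prob."""
    data = []
    for _ in range(n):
        s = "".join(random.choice(ALPHABET) for _ in range(random.randint(1, max_len)))
        label = in_language(s)
        if random.random() < flip_prob:   # membership-changing noise
            label = not label
        data.append((s, float(label)))
    return data

def encode(s):
    # One-hot encode each symbol: shape (1, len(s), |alphabet|)
    x = torch.zeros(1, len(s), len(ALPHABET))
    for t, c in enumerate(s):
        x[0, t, ALPHABET.index(c)] = 1.0
    return x

class Acceptor(nn.Module):
    """Recurrent net whose final hidden state is mapped to an accept/reject logit."""
    def __init__(self, hidden=8):
        super().__init__()
        self.rnn = nn.RNN(len(ALPHABET), hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)
    def forward(self, x):
        _, h = self.rnn(x)               # h: (num_layers, batch, hidden)
        return self.out(h[-1]).squeeze()

model = Acceptor()
# weight_decay acts here as a simple complexity penalty on the learned rules
opt = torch.optim.Adam(model.parameters(), lr=1e-2, weight_decay=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for epoch in range(30):
    for s, y in make_dataset():
        opt.zero_grad()
        loss = loss_fn(model(encode(s)), torch.tensor(y))
        loss.backward()
        opt.step()

# Evaluate against the *clean* language to see whether the label noise was filtered out.
test = ["".join(random.choice(ALPHABET) for _ in range(random.randint(1, 10))) for _ in range(200)]
acc = sum((model(encode(s)).item() > 0) == in_language(s) for s in test) / len(test)
print(f"accuracy on clean labels: {acc:.2f}")
```

If the regularization is strong enough relative to the noise rate, the acceptor tends to recover the underlying regular language rather than memorizing the flipped labels, which is the effect the paper's parsimony-based scheme is designed to achieve in a principled way.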
Gori, M., Maggini, M., Martinelli, E., Soda, G. (1998). Inductive inference from noisy examples using the hybrid finite state filter. IEEE TRANSACTIONS ON NEURAL NETWORKS, 9(3), 571-575 [10.1109/72.668898].
Files in this item:
File: TNN98.pdf
Type: Post-print
License: NOT PUBLIC - Private/restricted access
Size: 164.02 kB
Format: Adobe PDF
Availability: not available (copy on request)

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11365/30800