An unsupervised character-aware neural approach to word and context representation learning

Marra, Giuseppe; Zugarini, Andrea; Melacci, Stefano; Maggini, Marco
2018-01-01

Abstract

In the last few years, neural networks have been widely used to develop meaningful distributed representations of words and of the contexts around them. When these representations, also known as “embeddings”, are learned from large unsupervised corpora, they can be transferred to different tasks with a positive effect on performance, especially when only a few supervised examples are available. In this work, we further extend this idea and present an unsupervised neural architecture that jointly learns word and context embeddings, processing each word as a sequence of characters. This allows our model to capture regularities due to word morphology and removes the need for a fixed-size input vocabulary of words. We show that we can learn compact encoders that, despite their relatively small number of parameters, achieve strong performance on downstream tasks, comparing favorably with related state-of-the-art approaches and with fully supervised methods.
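
To make the character-level encoding idea concrete, the following is a minimal sketch of a word encoder that reads a word as a sequence of characters, written in PyTorch. It assumes a bidirectional LSTM over character embeddings; the class name, vocabulary, and layer sizes are illustrative assumptions for exposition, not the exact architecture described in the paper.

# Illustrative sketch of a character-aware word encoder (assumed design,
# not the paper's exact model): a BiLSTM reads the word character by
# character and the final hidden states form the word embedding.
import torch
import torch.nn as nn

class CharWordEncoder(nn.Module):
    def __init__(self, n_chars=128, char_dim=16, word_dim=64):
        super().__init__()
        self.char_emb = nn.Embedding(n_chars, char_dim)
        # Bidirectional LSTM over the character sequence of a single word.
        self.lstm = nn.LSTM(char_dim, word_dim // 2,
                            batch_first=True, bidirectional=True)

    def forward(self, char_ids):
        # char_ids: (batch, max_word_len) integer character indices
        x = self.char_emb(char_ids)
        _, (h, _) = self.lstm(x)
        # Concatenate the final forward and backward hidden states
        # into a single fixed-size word vector.
        return torch.cat([h[0], h[1]], dim=-1)  # (batch, word_dim)

# Usage example: encode the word "cat" via its byte values
# (an illustrative character vocabulary).
enc = CharWordEncoder()
ids = torch.tensor([[ord(c) for c in "cat"]])
print(enc(ids).shape)  # torch.Size([1, 64])

Because the encoder operates on characters rather than on a closed word list, morphologically related or out-of-vocabulary words still receive meaningful vectors, which is the property the abstract highlights.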
2018
ISBN: 9783030014230
Marra, G., Zugarini, A., Melacci, S., & Maggini, M. (2018). An unsupervised character-aware neural approach to word and context representation learning. In Artificial Neural Networks and Machine Learning – ICANN 2018 (pp. 126-136). Berlin: Springer. https://doi.org/10.1007/978-3-030-01424-7_13
Files in this record:
melacci_ICANN2018b.pdf (Adobe PDF, 346.37 kB): publisher's PDF. License: NOT PUBLIC, private/restricted access; file not available.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11365/1065981