
Embedding Words and Senses Together via Joint Knowledge-Enhanced Training / Mancini, Massimiliano; Camacho Collados, José; Iacobacci, Ignacio Javier; Navigli, Roberto. - ELECTRONIC. - (2017), pp. 100-111. (Paper presented at the 21st Conference on Computational Natural Language Learning (CoNLL 2017), held in Vancouver, Canada, 3-4 August 2017) [10.18653/v1/K17-1012].

Embedding Words and Senses Together via Joint Knowledge-Enhanced Training

Mancini, Massimiliano;
2017-01-01

Abstract

Word embeddings are widely used in Natural Language Processing, mainly due to their success in capturing semantic information from massive corpora. However, their creation process does not allow the different meanings of a word to be automatically separated, as it conflates them into a single vector. We address this issue by proposing a new model which learns word and sense embeddings jointly. Our model exploits large corpora and knowledge from semantic networks in order to produce a unified vector space of word and sense embeddings. We evaluate the main features of our approach both qualitatively and quantitatively in a variety of tasks, highlighting the advantages of the proposed method in comparison to state-of-the-art word- and sense-based models.
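The abstract's key point is that word and sense embeddings live in one shared vector space, so words and senses can be compared directly. As a hedged illustration only — the vectors, sense keys (e.g. `bank#finance`), and the disambiguation heuristic below are invented toy data, not the paper's model or its output — this sketch shows what querying such a unified space looks like:

```python
import math

# Toy unified space: word vectors and sense vectors coexist.
# All vectors and sense labels are invented illustrative data,
# NOT produced by the paper's model.
space = {
    "bank":         [0.90, 0.40, 0.10],  # word vector (meanings conflated)
    "bank#finance": [1.00, 0.10, 0.00],  # sense vector
    "bank#river":   [0.20, 0.90, 0.30],  # sense vector
    "money":        [0.95, 0.05, 0.05],  # word vector used as context
}

def cosine(u, v):
    """Cosine similarity between two dense vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Because words and senses share one space, a word vector ("money")
# can be compared against sense vectors directly: pick the sense of
# "bank" closest to the context word.
context_vec = space["money"]
best_sense = max(["bank#finance", "bank#river"],
                 key=lambda s: cosine(space[s], context_vec))
print(best_sense)  # the finance sense is closest to "money"
```

Direct word-to-sense comparability is exactly what a conflated single-vector representation of "bank" cannot provide, which is the motivation the abstract states.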
2017
CoNLL 2017. The 21st Conference on Computational Natural Language Learning. Proceedings of the Conference, August 3 - August 4, 2017 Vancouver, Canada
Stroudsburg PA (USA)
The Association for Computational Linguistics
978-1-945626-54-8
Mancini, Massimiliano; Camacho Collados, José; Iacobacci, Ignacio Javier; Navigli, Roberto
Files in this item:
There are no files associated with this item.

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11572/385009
Warning! The displayed data have not been validated by the university.

Citations
  • PMC: N/A
  • Scopus: 49
  • Web of Science (ISI): N/A
  • OpenAlex: N/A