Learning Contextual Embeddings for Structural Semantic Similarity using Categorical Information

Massimo Nicosia; Alessandro Moschitti
2017-01-01

Abstract

Tree kernels (TKs) and neural networks are two effective approaches for automatic feature engineering. In this paper, we combine them by modeling context word similarity in semantic TKs. This way, the latter can perform subtree matching by applying neural-based similarity on tree lexical nodes. We study how to learn representations for the words in context such that TKs can exploit more focused information. We found that neural embeddings produced by current methods do not provide a suitable contextual similarity. Thus, we define a new approach based on a Siamese Network, which produces word representations while learning a binary text similarity. We set the latter by considering examples in the same category as similar. The experiments on question and sentiment classification show that our semantic TK highly improves previous results.
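To illustrate the idea described in the abstract, the following is a minimal, hypothetical sketch (not the authors' code) of a Siamese network that learns word embeddings while classifying whether two texts belong to the same category; the architecture details, layer sizes, and toy data are assumptions for illustration only.

```python
# Hypothetical sketch: learn word embeddings via a Siamese binary text-similarity task.
# Pairs of texts from the same category are labeled 1 (similar), otherwise 0.
import torch
import torch.nn as nn

class SiameseEncoder(nn.Module):
    def __init__(self, vocab_size, dim=50):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, dim)  # word vectors to be learned
        self.classifier = nn.Linear(dim, 1)             # binary similarity head

    def encode(self, token_ids):
        # Shared branch: represent a text as the average of its word embeddings.
        return self.embedding(token_ids).mean(dim=1)

    def forward(self, left_ids, right_ids):
        # Compare the two text representations and score their similarity.
        diff = torch.abs(self.encode(left_ids) - self.encode(right_ids))
        return self.classifier(diff).squeeze(-1)

# Toy training loop on random data (for illustration only).
model = SiameseEncoder(vocab_size=100)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

left = torch.randint(0, 100, (8, 10))       # batch of 8 texts, 10 token ids each
right = torch.randint(0, 100, (8, 10))
labels = torch.randint(0, 2, (8,)).float()  # 1 = same category, 0 = different

for _ in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(left, right), labels)
    loss.backward()
    optimizer.step()

# After training, model.embedding.weight holds word vectors that could feed a
# lexical-node similarity inside a semantic tree kernel.
```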
2017
Proceedings of the 21st Conference on Computational Natural Language Learning
USA
Association for Computational Linguistics (ACL)
978-1-945626-54-8
Nicosia, Massimo; Moschitti, Alessandro
Learning Contextual Embeddings for Structural Semantic Similarity using Categorical Information / Nicosia, Massimo; Moschitti, Alessandro. - ELECTRONIC. - (2017), pp. 260-270. (Paper presented at CoNLL 2017, held in Vancouver, Canada, July 2017) [10.18653/v1/K17-1027].
Files in this item:

File: 2017_CoNLL.pdf
  Access: open access
  Type: Refereed author's post-print manuscript
  License: Creative Commons
  Size: 265.36 kB
  Format: Adobe PDF

File: K17-1027.pdf
  Access: open access
  Type: Publisher's version (publisher's layout)
  License: Creative Commons
  Size: 221.4 kB
  Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11572/195339
Citations
  • PMC: not available
  • Scopus: 4
  • Web of Science: not available
  • OpenAlex: not available