
Deep Attention Diffusion Graph Neural Networks for Text Classification / Liu, Yonghao; Guan, Renchu; Giunchiglia, Fausto; Liang, Yanchun; Feng, Xiaoyue. - (2021), pp. 8142-8152. (Paper presented at the EMNLP conference, held online and in Punta Cana, Dominican Republic, 7-11 November 2021) [10.18653/v1/2021.emnlp-main.642].

Deep Attention Diffusion Graph Neural Networks for Text Classification

Giunchiglia, Fausto; Liang, Yanchun
2021-01-01

Abstract

Text classification is a fundamental task with broad applications in natural language processing. Recently, graph neural networks (GNNs) have attracted much attention due to their powerful representation ability. However, most existing GNN-based methods for text classification consider only one-hop neighborhoods and low-frequency information within texts, which prevents them from fully exploiting the rich contextual information of documents. Moreover, these models suffer from over-smoothing when many graph layers are stacked. In this paper, a Deep Attention Diffusion Graph Neural Network (DADGNN) model is proposed to learn text representations, bridging the gap that makes it difficult for a word to interact with its distant neighbors. Experimental results on various standard benchmark datasets demonstrate the superior performance of the proposed approach.
2021
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
S.l.
Association for Computational Linguistics
9781955917094
Liu, Yonghao; Guan, Renchu; Giunchiglia, Fausto; Liang, Yanchun; Feng, Xiaoyue
Deep Attention Diffusion Graph Neural Networks for Text Classification / Liu, Yonghao; Guan, Renchu; Giunchiglia, Fausto; Liang, Yanchun; Feng, Xiaoyue. - (2021), pp. 8142-8152. (Intervento presentato al convegno EMNLP tenutosi a Online and Punta Cana, Dominican Republic nel 7-11 November 2021) [10.18653/v1/2021.emnlp-main.642].
Files in this record:

2021.emnlp-main.642.pdf
Access: open access
Type: Publisher's version (publisher's layout)
License: Creative Commons
Size: 639.25 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/443990
Citations
  • PMC: n/a
  • Scopus: n/a
  • Web of Science: 26
  • OpenAlex: n/a