Open Access
ARTICLE
Enhanced Topic-Aware Summarization Using Statistical Graph Neural Networks
1 Department of Computer Science, University of Agriculture Faisalabad, Faisalabad, Punjab, 37300, Pakistan
2 School of Computing, Faculty of Technology, University of Portsmouth, Southsea, Portsmouth, PO1 2UP, UK
3 Department of Mathematics and Statistics, University of Agriculture Faisalabad, Faisalabad, Punjab, 37300, Pakistan
* Corresponding Author: Fahad Ahmad. Email:
Computers, Materials & Continua 2024, 80(2), 3221-3242. https://doi.org/10.32604/cmc.2024.053488
Received 01 May 2024; Accepted 22 July 2024; Issue published 15 August 2024
Abstract
The rapid expansion of online content and big data has created an urgent need for efficient summarization techniques that allow readers to comprehend vast textual documents quickly without compromising their original integrity. Current approaches to Extractive Text Summarization (ETS) rely on modeling inter-sentence relationships, a task of paramount importance for producing coherent summaries. This study introduces a model that integrates Graph Attention Networks (GATs) with Bidirectional Encoder Representations from Transformers (BERT) and Latent Dirichlet Allocation (LDA), further enhanced by Term Frequency-Inverse Document Frequency (TF-IDF) values, to improve sentence selection by capturing comprehensive topical information. Our approach constructs a graph whose nodes represent sentences, words, and topics, thereby increasing interconnectivity and enabling a more refined understanding of text structure. The model is extended from Single-Document Summarization to Multi-Document Summarization (MDS), offering significant improvements over existing models such as THGS-GMM and Topic-GraphSum, as demonstrated by empirical evaluations on benchmark news datasets including Cable News Network (CNN)/Daily Mail (DM) and Multi-News. The results consistently show superior performance, underscoring the model's robustness in handling complex summarization tasks across single- and multi-document contexts. This research not only advances the integration of BERT and LDA within a GAT framework but also demonstrates the model's capacity to effectively manage global information and adapt to diverse summarization challenges.
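To make the graph construction described in the abstract concrete, the following is a minimal, hypothetical sketch (not the authors' implementation): a toy heterogeneous graph with sentence, word, and topic nodes, plus a single GAT-style attention aggregation into the sentence nodes. Random vectors stand in for BERT sentence embeddings, TF-IDF-weighted word features, and LDA topic vectors; the dot-product attention score is a simplification of the learned attention used in real GATs.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8

# Node features: random stand-ins for BERT sentence embeddings ("s*"),
# TF-IDF-weighted word embeddings ("w*"), and LDA topic vectors ("t*").
nodes = {
    "s0": rng.normal(size=dim), "s1": rng.normal(size=dim),  # sentences
    "w0": rng.normal(size=dim), "w1": rng.normal(size=dim),  # words
    "t0": rng.normal(size=dim),                              # one topic
}

# Undirected typed edges: sentences connect to the words they contain and
# to the topics they express (sentence-word edges could carry TF-IDF weights).
edges = [("s0", "w0"), ("s0", "t0"), ("s1", "w0"), ("s1", "w1"), ("s1", "t0")]

neighbours = {n: [] for n in nodes}
for a, b in edges:
    neighbours[a].append(b)
    neighbours[b].append(a)

def gat_step(node):
    """One single-head attention pass: score neighbours by dot product,
    softmax the scores, and return the attention-weighted neighbour sum."""
    h = nodes[node]
    nbrs = neighbours[node]
    scores = np.array([h @ nodes[m] for m in nbrs])
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()  # softmax over this node's neighbours
    return sum(a * nodes[m] for a, m in zip(alpha, nbrs))

# Updated sentence representations now mix word- and topic-level signal;
# extractive scores would come from a classifier over these vectors.
updated = {n: gat_step(n) for n in ["s0", "s1"]}
```

In the full model, multiple attention heads and learned projection matrices replace the raw dot product, and the sentence representations feed a selection layer that picks summary sentences.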
This work is licensed under a Creative Commons Attribution 4.0 International License , which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.