Search Results (6)
  • Open Access

    ARTICLE

    Topic-Aware Abstractive Summarization Based on Heterogeneous Graph Attention Networks for Chinese Complaint Reports

    Yan Li1, Xiaoguang Zhang1,*, Tianyu Gong1, Qi Dong1, Hailong Zhu1, Tianqiang Zhang1, Yanji Jiang2,3

    CMC-Computers, Materials & Continua, Vol.76, No.3, pp. 3691-3705, 2023, DOI:10.32604/cmc.2023.040492 - 08 October 2023

    Abstract Automatic text summarization (ATS) plays a significant role in Natural Language Processing (NLP). Abstractive summarization produces summaries by identifying and compressing the most important information in a document. However, relatively few comprehensively evaluated abstractive summarization models work well for specific types of reports, owing to their unstructured, colloquial language. In particular, Chinese complaint reports, submitted by urban complainants and collected by government employees, describe problems that residents encounter in daily life, and the reported problems require a speedy response. Therefore, automatic summarization tasks for these reports have been… More >

  • Open Access

    ARTICLE

    Weakly Supervised Abstractive Summarization with Enhancing Factual Consistency for Chinese Complaint Reports

    Ren Tao, Chen Shuang*

    CMC-Computers, Materials & Continua, Vol.75, No.3, pp. 6201-6217, 2023, DOI:10.32604/cmc.2023.036178 - 29 April 2023

    Abstract A large variety of complaint reports reflect subjective information expressed by citizens. A key challenge of text summarization for complaint reports is ensuring the factual consistency of the generated summary. Therefore, in this paper, a simple, weakly supervised framework that considers factual consistency is proposed to generate summaries of city-based complaint reports without pre-labeled sentences/words. Furthermore, it considers the importance of entities in complaint reports to ensure the factual consistency of the summary. Experimental results on customer review datasets (Yelp and Amazon) and a complaint report dataset (complaint reports of Shenyang, China) show that the… More >
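
    The factual-consistency idea above can be pictured with a simple entity-overlap check: every entity mentioned in the generated summary should also appear in the source report. The Python sketch below is hypothetical and is not the authors' weakly supervised framework; the entity lists are assumed to come from any off-the-shelf NER tool, and the function name entity_precision is invented for illustration.

        # Hypothetical sketch: entity-overlap check for factual consistency.
        # Entity lists are assumed to come from any NER tool; they are passed
        # in directly so the example stays self-contained.
        def entity_precision(summary_entities, source_entities):
            """Fraction of summary entities that also occur in the source.

            A low value suggests the summary mentions entities (names, places,
            departments) that the source report never stated, i.e. a potential
            factual-consistency problem.
            """
            if not summary_entities:
                return 1.0  # nothing in the summary can contradict the source
            source = {e.lower() for e in source_entities}
            hits = sum(1 for e in summary_entities if e.lower() in source)
            return hits / len(summary_entities)

        if __name__ == "__main__":
            src = ["Shenyang", "Heping District", "heating company"]
            gen = ["Shenyang", "water company"]  # "water company" is unsupported
            print(f"entity precision: {entity_precision(gen, src):.2f}")  # 0.50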

  • Open Access

    ARTICLE

    Ext-ICAS: A Novel Self-Normalized Extractive Intra Cosine Attention Similarity Summarization

    P. Sharmila1,*, C. Deisy1, S. Parthasarathy2

    Computer Systems Science and Engineering, Vol.45, No.1, pp. 377-393, 2023, DOI:10.32604/csse.2023.027481 - 16 August 2022

    Abstract With the continuous growth of online news articles, the need arises for an efficient abstractive summarization technique to address the problem of information overload. Abstractive summarization is highly complex and requires deeper understanding and proper reasoning to produce its own summary outline. The abstractive summarization task is framed as sequence-to-sequence (seq2seq) modeling. Existing seq2seq methods perform well on short sequences; for long sequences, however, performance degrades because of the high computational cost. Hence, a two-phase self-normalized deep neural document summarization model, consisting of improvised extractive cosine-normalization and seq2seq abstractive phases, has been proposed… More >
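
    As a rough, hypothetical illustration of scoring sentences by intra-document cosine similarity (not the Ext-ICAS model itself), the Python sketch below represents sentences as bag-of-words vectors and ranks each sentence by its average cosine similarity to the others; the helpers bow_vectors and rank_sentences are invented for this example.

        # Hypothetical sketch of cosine-similarity centrality ranking for an
        # extractive phase; not the authors' Ext-ICAS implementation.
        from collections import Counter
        import numpy as np

        def bow_vectors(sentences):
            # Build a bag-of-words matrix: one row per sentence.
            tokens = [[w.strip(".,").lower() for w in s.split()] for s in sentences]
            vocab = sorted({w for toks in tokens for w in toks})
            index = {w: i for i, w in enumerate(vocab)}
            mat = np.zeros((len(sentences), len(vocab)))
            for row, toks in enumerate(tokens):
                for w, c in Counter(toks).items():
                    mat[row, index[w]] = c
            return mat

        def rank_sentences(sentences, top_k=2):
            x = bow_vectors(sentences)
            x = x / np.clip(np.linalg.norm(x, axis=1, keepdims=True), 1e-9, None)
            sim = x @ x.T                  # pairwise cosine similarity
            np.fill_diagonal(sim, 0.0)     # ignore self-similarity
            scores = sim.mean(axis=1)      # centrality of each sentence
            keep = sorted(np.argsort(-scores)[:top_k])
            return [sentences[i] for i in keep]   # preserve document order

        if __name__ == "__main__":
            doc = ["The council approved a new transit budget.",
                   "The transit budget funds new bus routes.",
                   "Residents also discussed park maintenance."]
            print(rank_sentences(doc, top_k=2))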

  • Open Access

    ARTICLE

    An Intelligent Tree Extractive Text Summarization Deep Learning

    Abeer Abdulaziz AlArfaj, Hanan Ahmed Hosni Mahmoud*

    CMC-Computers, Materials & Continua, Vol.73, No.2, pp. 4231-4244, 2022, DOI:10.32604/cmc.2022.030090 - 16 June 2022

    Abstract In recent research, deep learning algorithms have provided effective representation learning models for natural languages. Deep learning-based models create better data representations than classical models and are capable of automatically extracting distributed representations of texts. In this research, we introduce a new tree-based extractive text summarization model characterized by fitting the text structure representation into the knowledge-base training module; it also addresses memory issues that were not addressed before. The proposed model employs a tree-structured mechanism to generate the phrase and text embeddings. The proposed architecture mimics the tree configuration of… More >
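
    As a toy illustration of tree-structured composition (a hypothetical sketch, not the paper's architecture), a phrase embedding can be built bottom-up by combining the embeddings of its children, here simply by averaging; the helper compose and the random word vectors are invented for this example.

        # Hypothetical sketch: bottom-up embedding composition over a tiny parse tree.
        import numpy as np

        def compose(node, word_vectors):
            """node is either a word (str) or a tuple of child nodes."""
            if isinstance(node, str):
                return word_vectors[node]
            children = [compose(child, word_vectors) for child in node]
            return np.mean(children, axis=0)   # parent = average of its children

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            vecs = {w: rng.normal(size=8) for w in ["the", "road", "was", "repaired"]}
            tree = (("the", "road"), ("was", "repaired"))   # tiny binary parse
            print(compose(tree, vecs).shape)                # (8,)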

  • Open Access

    ARTICLE

    A Semantic Supervision Method for Abstractive Summarization

    Sunqiang Hu1, Xiaoyu Li1, Yu Deng1,*, Yu Peng1, Bin Lin2, Shan Yang3

    CMC-Computers, Materials & Continua, Vol.69, No.1, pp. 145-158, 2021, DOI:10.32604/cmc.2021.017441 - 04 June 2021

    Abstract In recent years, many text summarization models based on pre-training methods have achieved very good results. However, in these models, semantic deviations easily arise between the original input representation and the representation produced after passing through the multi-layer encoder, which may result in inconsistencies between the generated summary and the source text. The Bidirectional Encoder Representations from Transformers (BERT) model improves the performance of many tasks in Natural Language Processing (NLP). Although BERT has a strong capability to encode context, it lacks fine-grained semantic representation. To address these two problems, we propose a… More >
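
    One way to picture the "semantic deviation" the abstract describes is to compare a token's embedding-layer representation with its final-layer representation in BERT. The sketch below is a hypothetical illustration (not the paper's semantic-supervision method) using the Hugging Face transformers library; the bert-base-uncased checkpoint is an assumed stand-in for whichever encoder the authors used.

        # Hypothetical sketch: token-level drift between BERT's embedding layer
        # and its last encoder layer (lower cosine similarity = larger deviation).
        import torch
        from transformers import AutoModel, AutoTokenizer

        tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
        model = AutoModel.from_pretrained("bert-base-uncased", output_hidden_states=True)
        model.eval()

        inputs = tokenizer("The reported pipe leak was repaired within two days.",
                           return_tensors="pt")
        with torch.no_grad():
            outputs = model(**inputs)

        first = outputs.hidden_states[0]    # embedding-layer representation
        last = outputs.hidden_states[-1]    # representation after all encoder layers
        drift = torch.nn.functional.cosine_similarity(first, last, dim=-1)

        for tok, d in zip(tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist()),
                          drift[0]):
            print(f"{tok:>12s}  {d.item():.3f}")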

  • Open Access

    ARTICLE

    An Abstractive Summarization Technique with Variable Length Keywords as per Document Diversity

    Muhammad Yahya Saeed1, Muhammad Awais1, Muhammad Younas1, Muhammad Arif Shah2,*, Atif Khan3, M. Irfan Uddin4, Marwan Mahmoud5

    CMC-Computers, Materials & Continua, Vol.66, No.3, pp. 2409-2423, 2021, DOI:10.32604/cmc.2021.014330 - 28 December 2020

    Abstract Text summarization is an essential area of text mining that provides procedures for text extraction. In natural language processing, text summarization maps documents to a representative set of descriptive words; the objective of text extraction is therefore to obtain reduced, expressive content from text documents. Text summarization has two main areas: abstractive and extractive summarization. Extractive text summarization, in turn, has two approaches: the first applies a sentence-scoring algorithm, and the second follows word-embedding principles. All such text extraction methods have limitations in providing the basic… More >
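
    The idea of letting the number of keywords vary with document diversity can be illustrated with a small, hypothetical sketch (not the paper's algorithm): the keyword budget grows with the document's type-token ratio, and the most frequent non-stopword terms are returned. The function variable_length_keywords, the stopword list, and the weighting constants are all invented for this example.

        # Hypothetical sketch: keyword count scales with lexical diversity.
        from collections import Counter

        STOPWORDS = {"the", "a", "an", "of", "to", "and", "in", "is", "for", "on"}

        def variable_length_keywords(text, base=3, diversity_weight=10):
            words = [w.strip(".,").lower() for w in text.split()]
            words = [w for w in words if w and w not in STOPWORDS]
            if not words:
                return []
            diversity = len(set(words)) / len(words)        # type-token ratio in [0, 1]
            k = base + round(diversity_weight * diversity)  # more diverse -> more keywords
            return [w for w, _ in Counter(words).most_common(k)]

        if __name__ == "__main__":
            report = ("Text summarization maps documents to a representative set of "
                      "descriptive words, and extractive methods score sentences "
                      "before selecting the most informative ones.")
            print(variable_length_keywords(report))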
