Search Results (3)
  • Open Access

    ARTICLE

    Research on the Classification of Digital Cultural Texts Based on ASSC-TextRCNN Algorithm

    Zixuan Guo1, Houbin Wang2, Sameer Kumar1,*, Yuanfang Chen3

    CMC-Computers, Materials & Continua, Vol.86, No.3, 2026, DOI:10.32604/cmc.2025.072064 - 12 January 2026

    Abstract With the rapid development of digital culture, a large number of cultural texts are presented in digital, networked form. These texts have significant characteristics such as sparsity, real-time generation, and non-standard expression, which pose serious challenges to traditional classification methods. To address these problems, this paper proposes a new ASSC (ALBERT, SVD, Self-Attention and Cross-Entropy)-TextRCNN digital cultural text classification model. Within the TextRCNN framework, the ALBERT pre-trained language model is introduced to improve the depth and accuracy of semantic embedding. Combined with the dual attention mechanism, the…

  • Open Access

    ARTICLE

    PAL-BERT: An Improved Question Answering Model

    Wenfeng Zheng1, Siyu Lu1, Zhuohang Cai1, Ruiyang Wang1, Lei Wang2, Lirong Yin2,*

    CMES-Computer Modeling in Engineering & Sciences, Vol.139, No.3, pp. 2729-2745, 2024, DOI:10.32604/cmes.2023.046692 - 11 March 2024

    Abstract In the field of natural language processing (NLP), various pre-trained language models have emerged in recent years, with question answering systems gaining significant attention. However, as algorithms, data, and computing power advance, the issue of increasingly large models and a growing number of parameters has surfaced. Consequently, model training has become more costly and less efficient. To enhance the efficiency and accuracy of the training process while reducing model volume, this paper proposes PAL-BERT, a first-order pruning model based on the ALBERT model, designed around the characteristics of question-answering (QA) systems and language…

  • Open Access

    ARTICLE

    ALBERT with Knowledge Graph Encoder Utilizing Semantic Similarity for Commonsense Question Answering

    Byeongmin Choi1, YongHyun Lee1, Yeunwoong Kyung2, Eunchan Kim3,*

    Intelligent Automation & Soft Computing, Vol.36, No.1, pp. 71-82, 2023, DOI:10.32604/iasc.2023.032783 - 29 September 2022

    Abstract Recently, pre-trained language representation models such as bidirectional encoder representations from transformers (BERT) have been performing well in commonsense question answering (CSQA). However, these models do not directly use explicit information from external knowledge sources. To address this, additional methods such as the knowledge-aware graph network (KagNet) and the multi-hop graph relation network (MHGRN) have been proposed. In this study, we propose using the recent pre-trained language model a lite bidirectional encoder representations from transformers (ALBERT) with a knowledge graph information extraction technique. We also propose applying a novel method…
