Search Results (2)
  • Open Access

    ARTICLE

    PAL-BERT: An Improved Question Answering Model

    Wenfeng Zheng, Siyu Lu, Zhuohang Cai, Ruiyang Wang, Lei Wang, Lirong Yin*

    CMES-Computer Modeling in Engineering & Sciences, Vol. 139, No. 3, pp. 2729-2745, 2024, DOI: 10.32604/cmes.2023.046692, published 11 March 2024

    Abstract: In the field of natural language processing (NLP), various pre-trained language models have appeared in recent years, and question answering systems have gained significant attention. However, as algorithms, data, and computing power advance, models have grown ever larger with more and more parameters, making training more costly and less efficient. To enhance the efficiency and accuracy of training while reducing model volume, this paper proposes PAL-BERT, a first-order pruning model based on ALBERT, designed around the characteristics of question-answering (QA) systems and language … (see the first-order pruning sketch after this results list)

  • Open Access

    ARTICLE

    ALBERT with Knowledge Graph Encoder Utilizing Semantic Similarity for Commonsense Question Answering

    Byeongmin Choi, YongHyun Lee, Yeunwoong Kyung, Eunchan Kim*

    Intelligent Automation & Soft Computing, Vol. 36, No. 1, pp. 71-82, 2023, DOI: 10.32604/iasc.2023.032783, published 29 September 2022

    Abstract: Recently, pre-trained language representation models such as bidirectional encoder representations from transformers (BERT) have performed well in commonsense question answering (CSQA). However, these models do not directly use the explicit information held in external knowledge sources. To remedy this, methods such as the knowledge-aware graph network (KagNet) and the multi-hop graph relation network (MHGRN) have been proposed. In this study, we propose using the pre-trained language model a lite bidirectional encoder representations from transformers (ALBERT) with a knowledge graph information extraction technique. We also propose applying the novel method … (see the knowledge-retrieval sketch after this results list)
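
The first result centers on first-order pruning of ALBERT. Below is a minimal sketch of what first-order (Taylor-expansion) pruning can look like with the Hugging Face transformers library; the albert-base-v2 checkpoint, the toy QA example, and the 30% per-matrix sparsity target are illustrative assumptions, not details taken from PAL-BERT.

```python
# Hedged sketch: first-order (Taylor-expansion) pruning of ALBERT.
# Assumptions: albert-base-v2 checkpoint, a toy QA example, and a 30%
# per-matrix sparsity target. This is not the authors' PAL-BERT code.
import torch
from transformers import AlbertForQuestionAnswering, AlbertTokenizerFast

tokenizer = AlbertTokenizerFast.from_pretrained("albert-base-v2")
# Note: the QA head is freshly initialized here; PAL-BERT would be fine-tuned.
model = AlbertForQuestionAnswering.from_pretrained("albert-base-v2")

# One toy example; a real run would accumulate gradients over a QA dataset.
inputs = tokenizer("What model is pruned?", "The paper prunes ALBERT.",
                   return_tensors="pt")
loss = model(**inputs,
             start_positions=torch.tensor([5]),
             end_positions=torch.tensor([6])).loss
loss.backward()

# First-order importance |w * dL/dw|: a Taylor estimate of how much the loss
# changes if the weight is zeroed (unlike zeroth-order magnitude pruning).
for name, param in model.named_parameters():
    if param.grad is None or param.dim() != 2 or "attention" not in name:
        continue
    importance = (param * param.grad).abs()
    k = max(1, int(0.3 * importance.numel()))          # assumed 30% ratio
    threshold = importance.flatten().kthvalue(k).values
    with torch.no_grad():
        param.mul_((importance > threshold).float())   # zero least important
```

Scoring weights by |w · ∂L/∂w| estimates how much the loss would change if a weight were removed, which is what distinguishes first-order pruning from plain magnitude pruning.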

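The second result pairs ALBERT with knowledge-graph information selected by semantic similarity. The sketch below shows one common realization of that idea: embed the question and a handful of verbalized triples, pick the most similar triple by cosine similarity, and append it to the model input. The triples, the mean-pooled-embedding similarity, and the [SEP] concatenation are assumptions for illustration, not the paper's pipeline.

```python
# Hedged sketch: pick knowledge-graph triples by semantic similarity and feed
# them to ALBERT as extra context. The triples, mean-pooled-embedding
# similarity, and [SEP] concatenation are illustrative assumptions.
import torch
from transformers import AlbertModel, AlbertTokenizerFast

tokenizer = AlbertTokenizerFast.from_pretrained("albert-base-v2")
encoder = AlbertModel.from_pretrained("albert-base-v2")

def embed(texts):
    """Mean-pooled ALBERT hidden states as a stand-in sentence embedding."""
    batch = tokenizer(texts, padding=True, return_tensors="pt")
    with torch.no_grad():
        hidden = encoder(**batch).last_hidden_state      # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1).float() # (B, T, 1)
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)  # (B, H)

question = "Where would you put a dirty dish after dinner?"
# Hypothetical verbalized ConceptNet-style triples.
triples = [
    "dish is used for eating",
    "sink is used for washing dishes",
    "table is part of a dining room",
]

scores = torch.nn.functional.cosine_similarity(embed([question]), embed(triples))
best = triples[int(scores.argmax())]

# Append the top-scoring triple so the downstream CSQA model sees the
# retrieved commonsense fact alongside the question.
print(f"{question} [SEP] {best}")
```

Ranking triples by embedding similarity before concatenation keeps the input short while still injecting the most relevant external knowledge; the paper's actual encoder and scoring scheme may differ.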