Search Results (1)
Open Access

    ARTICLE

    DPAL-BERT: A Faster and Lighter Question Answering Model

    Lirong Yin1, Lei Wang1, Zhuohang Cai2, Siyu Lu2,*, Ruiyang Wang2, Ahmed AlSanad3, Salman A. AlQahtani3, Xiaobing Chen4, Zhengtong Yin5, Xiaolu Li6, Wenfeng Zheng2,3,*

    CMES-Computer Modeling in Engineering & Sciences, Vol.141, No.1, pp. 771-786, 2024, DOI:10.32604/cmes.2024.052622 - 20 August 2024

    Abstract Recent advancements in natural language processing have given rise to numerous pre-trained language models for question-answering systems. However, with the constant evolution of algorithms, data, and computing power, the increasing size and complexity of these models have driven up training costs and reduced efficiency. This study aims to minimize the inference time of such models while maintaining computational performance. It proposes DPAL-BERT, a distillation model for PAL-BERT that employs knowledge distillation, using the PAL-BERT model as the teacher to train two student models: DPAL-BERT-Bi and DPAL-BERT-C. This research enhances the dataset …
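    The teacher–student setup named in the abstract is the standard knowledge-distillation recipe: the student is trained to match the teacher's temperature-softened output distribution. The sketch below shows that core loss term only; the function names, temperature value, and toy logits are illustrative assumptions, not the DPAL-BERT implementation.

    ```python
    import math

    def softmax(logits, T=1.0):
        # Temperature-scaled softmax: higher T flattens the distribution,
        # exposing the teacher's "dark knowledge" about wrong classes.
        exps = [math.exp(z / T) for z in logits]
        total = sum(exps)
        return [e / total for e in exps]

    def distillation_loss(teacher_logits, student_logits, T=2.0):
        # KL divergence between softened teacher and student distributions,
        # scaled by T^2 so gradients keep a comparable magnitude across T.
        p = softmax(teacher_logits, T)
        q = softmax(student_logits, T)
        kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
        return T * T * kl

    # A student that exactly matches the teacher incurs zero loss.
    print(round(distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]), 6))  # 0.0
    ```

    In practice this distillation term is combined with the ordinary cross-entropy loss on the ground-truth labels; the abstract does not state the weighting DPAL-BERT uses between the two.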
