Chenxi Yu, Xin Li*
Intelligent Automation & Soft Computing, Vol.34, No.3, pp. 2023-2034, 2022, DOI:10.32604/iasc.2022.029447
- 25 May 2022
Abstract Machine reading comprehension (MRC) is a natural language understanding task: it assesses a machine's comprehension by having it read a text and answer questions about it. Traditional attention methods typically focus on either syntax or semantics, or integrate the two by hand, so the model cannot fully exploit syntactic and semantic information for MRC tasks. To better capture syntactic and semantic information and improve machine reading comprehension, our study uses syntactic and semantic attention to model the text for these tasks. Building on the BERT model's Transformer encoder, we separate a text…
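The abstract only outlines the approach (syntactic and semantic attention layered on a BERT-style Transformer encoder), so the following is a minimal sketch, not the authors' implementation: it assumes BERT token representations of hidden size 768 and shows one plausible way to fuse a syntax-oriented and a semantics-oriented attention stream with a learned gate instead of a manual combination. The class name, dimensions, and fusion strategy are illustrative assumptions.

```python
import torch
import torch.nn as nn


class SynSemAttentionFusion(nn.Module):
    """Hypothetical fusion of syntactic and semantic attention over BERT outputs."""

    def __init__(self, hidden_size: int = 768, num_heads: int = 8):
        super().__init__()
        # Two attention streams: in practice one could be guided by dependency
        # parses and the other by semantic roles; here both are plain
        # multi-head self-attentions for illustration.
        self.syntactic_attn = nn.MultiheadAttention(hidden_size, num_heads, batch_first=True)
        self.semantic_attn = nn.MultiheadAttention(hidden_size, num_heads, batch_first=True)
        # Learned gate combining the two views, replacing a hand-tuned mix.
        self.gate = nn.Linear(2 * hidden_size, hidden_size)

    def forward(self, encoder_states: torch.Tensor) -> torch.Tensor:
        # encoder_states: (batch, seq_len, hidden_size) from a BERT-style encoder
        syn_out, _ = self.syntactic_attn(encoder_states, encoder_states, encoder_states)
        sem_out, _ = self.semantic_attn(encoder_states, encoder_states, encoder_states)
        g = torch.sigmoid(self.gate(torch.cat([syn_out, sem_out], dim=-1)))
        return g * syn_out + (1 - g) * sem_out


if __name__ == "__main__":
    # Random tensors stand in for BERT encoder outputs.
    layer = SynSemAttentionFusion()
    states = torch.randn(2, 16, 768)   # (batch, seq_len, hidden)
    print(layer(states).shape)         # torch.Size([2, 16, 768])
```

The gated sum keeps the output in the same representation space as the encoder, so a span-prediction head for MRC could be attached on top without further changes.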