Byeongmin Choi1, YongHyun Lee1, Yeunwoong Kyung2, Eunchan Kim3,*
Intelligent Automation & Soft Computing, Vol. 36, No. 1, pp. 71-82, 2023. DOI: 10.32604/iasc.2023.032783
Published: 29 September 2022
Abstract: Pre-trained language representation models such as bidirectional encoder representations from transformers (BERT) have recently performed well in commonsense question answering (CSQA). However, these models do not directly use explicit information from external knowledge sources. To address this, additional methods such as the knowledge-aware graph network (KagNet) and the multi-hop graph relation network (MHGRN) have been proposed. In this study, we propose using a more recent pre-trained language model, a lite bidirectional encoder representations from transformers (ALBERT), together with a knowledge graph information extraction technique. We also propose applying a novel method, …
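For readers unfamiliar with the base setup the abstract describes, the sketch below shows how a pre-trained ALBERT model can score a multiple-choice CSQA item with the Hugging Face transformers library. This is a minimal illustration under stated assumptions, not the authors' pipeline: the knowledge-graph extraction step is omitted, and the checkpoint name `albert-base-v2` and the example question are assumptions made here for illustration.

```python
import torch
from transformers import AlbertTokenizer, AlbertForMultipleChoice

# Assumed checkpoint; the paper's exact ALBERT variant is not stated here.
tokenizer = AlbertTokenizer.from_pretrained("albert-base-v2")
model = AlbertForMultipleChoice.from_pretrained("albert-base-v2")
model.eval()

# A hypothetical CSQA-style item: one question, several answer choices.
question = "Where would you find a fox that is not real?"
choices = ["storybook", "woods", "zoo"]

# Pair the question with every choice; the model scores each pair jointly.
encoding = tokenizer(
    [question] * len(choices), choices,
    return_tensors="pt", padding=True,
)
# AlbertForMultipleChoice expects (batch, num_choices, seq_len) tensors.
inputs = {k: v.unsqueeze(0) for k, v in encoding.items()}

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_choices)

print("Predicted answer:", choices[logits.argmax(dim=-1).item()])
```

Without fine-tuning, the multiple-choice head is randomly initialized, so the prediction is not meaningful until the model is trained on CSQA data; the snippet only illustrates the input layout that knowledge-augmented variants such as KagNet and MHGRN build upon.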