ALBERT with Knowledge Graph Encoder Utilizing Semantic Similarity for Commonsense Question Answering
1 Department of Computer Science and Engineering, Seoul National University, Seoul, 08826, Korea
2 School of Computer Engineering, Hanshin University, Osan, 18101, Korea
3 Department of Intelligence and Information, Seoul National University, Seoul, 08826, Korea
* Corresponding Author: Eunchan Kim. Email:
Intelligent Automation & Soft Computing 2023, 36(1), 71-82. https://doi.org/10.32604/iasc.2023.032783
Received 29 May 2022; Accepted 29 June 2022; Issue published 29 September 2022
Abstract
Recently, pre-trained language representation models such as bidirectional encoder representations from transformers (BERT) have performed well in commonsense question answering (CSQA). However, these models do not directly use the explicit information available in external knowledge sources. To address this, methods such as the knowledge-aware graph network (KagNet) and the multi-hop graph relation network (MHGRN) have been proposed. In this study, we propose using a recent pre-trained language model, a lite bidirectional encoder representations from transformers (ALBERT), together with a knowledge graph information extraction technique. We also propose applying a novel method, schema graph expansion, to recent language models. We then analyze the effect of applying knowledge graph-based extraction techniques to recent pre-trained language models and confirm that schema graph expansion is effective to some extent. Furthermore, we show that our proposed model achieves better performance than the existing KagNet and MHGRN models on the CommonsenseQA dataset.
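The schema graph expansion idea mentioned in the abstract can be illustrated with a minimal sketch: starting from a concept graph retrieved from a knowledge source, new edges are added between concept pairs whose embedding cosine similarity exceeds a threshold. The function name, toy embeddings, and the 0.9 threshold below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def expand_schema_graph(edges, embeddings, threshold=0.9):
    """Hypothetical expansion rule: add an edge between any two
    concepts whose embedding similarity is at least `threshold`."""
    concepts = list(embeddings)
    expanded = set(edges)
    for i, a in enumerate(concepts):
        for b in concepts[i + 1:]:
            if (a, b) not in expanded and (b, a) not in expanded:
                if cosine(embeddings[a], embeddings[b]) >= threshold:
                    expanded.add((a, b))
    return expanded

# Toy concept embeddings (assumed, 2-dimensional for clarity).
embeddings = {
    "dog":   np.array([1.0, 0.1]),
    "puppy": np.array([0.9, 0.2]),
    "car":   np.array([0.0, 1.0]),
}
base_edges = {("dog", "car")}  # e.g. a retrieved knowledge-graph relation
graph = expand_schema_graph(base_edges, embeddings)
# "dog" and "puppy" are semantically close, so a new edge connects them;
# "puppy" and "car" are dissimilar, so no edge is added between them.
```

Here the expanded graph links semantically related concepts that the retrieved knowledge graph alone did not connect, which is the intuition behind augmenting a language model such as ALBERT with a similarity-aware graph encoder.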
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.