Open Access
ARTICLE
Joint Self-Attention Based Neural Networks for Semantic Relation Extraction
School of Computer and Information Engineering, Henan University, Kaifeng, 475000, China.
Corporate and Investment Bank Technology, J.P. Morgan Chase N.A., 25 Bank St, Canary Wharf, London, E14 5JP, United Kingdom.
*Corresponding Author: Yatian Shen. Email: .
Journal of Information Hiding and Privacy Protection 2019, 1(2), 69-75. https://doi.org/10.32604/jihpp.2019.06357
Abstract
Relation extraction is an important task in the NLP community. However, existing models often fail to capture long-distance semantic dependencies, and the interaction between the semantics of the two entities is ignored. In this paper, we propose a novel neural network model for semantic relation classification, a joint self-attention Bi-LSTM (SA-Bi-LSTM), which models the internal structure of a sentence to weight the importance of each word without relying on additional information and captures long-distance semantic dependencies. We conduct experiments on the SemEval-2010 Task 8 dataset. The results demonstrate that the proposed method is effective for relation classification, achieving state-of-the-art accuracy with minimal feature engineering.
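To make the described architecture concrete, the following is a minimal sketch (not the authors' code) of a self-attention Bi-LSTM relation classifier in PyTorch. All layer sizes, names, and the attention form are illustrative assumptions; the sketch only reflects the general design the abstract names: a Bi-LSTM encoder, a self-attention layer that weights each word's importance, and a classifier over the 19 SemEval-2010 Task 8 relation labels (9 directed relations plus "Other").

```python
# Illustrative sketch of a self-attention Bi-LSTM relation classifier.
# Assumes PyTorch; hyperparameters and names are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelfAttentionBiLSTM(nn.Module):
    def __init__(self, vocab_size, embed_dim=100, hidden_dim=128, num_classes=19):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        # Scores each time step so the model can weight words by importance
        # without relying on external features.
        self.attn = nn.Linear(2 * hidden_dim, 1)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len)
        h, _ = self.bilstm(self.embedding(token_ids))  # (batch, seq_len, 2*hidden)
        weights = F.softmax(self.attn(h).squeeze(-1), dim=-1)  # (batch, seq_len)
        # Attention-weighted sum over time steps -> sentence representation.
        sentence = torch.bmm(weights.unsqueeze(1), h).squeeze(1)
        return self.classifier(sentence)

# Usage: logits over the 19 SemEval-2010 Task 8 relation labels.
model = SelfAttentionBiLSTM(vocab_size=20000)
logits = model(torch.randint(0, 20000, (2, 15)))  # two sentences, 15 tokens
print(logits.shape)  # torch.Size([2, 19])
```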
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.