Search Results (1)
  • Open Access

    ARTICLE

    Joint Self-Attention Based Neural Networks for Semantic Relation Extraction

    Jun Sun1, Yan Li1, Yatian Shen1,*, Wenke Ding1, Xianjin Shi1, Lei Zhang1, Xiajiong Shen1, Jing He2

    Journal of Information Hiding and Privacy Protection, Vol.1, No.2, pp. 69-75, 2019, DOI:10.32604/jihpp.2019.06357

    Abstract: Relation extraction is an important task in the NLP community. However, existing models often fail to capture long-distance semantic dependencies, and the interaction between the semantics of the two entities is ignored. In this paper, we propose a novel neural network model for semantic relation classification, called joint self-attention Bi-LSTM (SA-Bi-LSTM), which models the internal structure of the sentence to obtain the importance of each word without relying on additional information, and captures long-distance semantic dependencies. We conduct experiments on the SemEval-2010 Task 8 dataset. Extensive experiments and their results demonstrate that the …
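    Only the abstract is reproduced here, so the model details are not available; as a rough, hypothetical illustration of the self-attention step the abstract refers to (not the authors' implementation), the sketch below applies scaled dot-product self-attention over a toy sequence of hidden states, such as Bi-LSTM outputs:

    ```python
    import math

    def softmax(xs):
        # Numerically stable softmax over a list of scores.
        m = max(xs)
        exps = [math.exp(x - m) for x in xs]
        s = sum(exps)
        return [e / s for e in exps]

    def self_attention(states):
        """Scaled dot-product self-attention over a sequence of hidden
        states (lists of floats). Each output position is a weighted
        sum of all positions, so every word can attend to distant words."""
        d = len(states[0])
        out = []
        for q in states:
            # Similarity of this position to every position, scaled by sqrt(d).
            scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                      for k in states]
            weights = softmax(scores)
            # Context vector: convex combination of all hidden states.
            ctx = [sum(w * v[j] for w, v in zip(weights, states))
                   for j in range(d)]
            out.append(ctx)
        return out

    # Toy 3-word sentence with 2-dimensional hidden states.
    h = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
    ctx = self_attention(h)
    print(len(ctx), len(ctx[0]))  # → 3 2
    ```

    Because the attention weights at each position sum to one, each context vector mixes information from the whole sentence, which is how self-attention sidesteps the fixed-window limits that make long-distance dependencies hard for recurrent models alone.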
