Jianquan Ouyang1,*, Jing Zhang1, Tianming Liu2
Intelligent Automation & Soft Computing, Vol. 34, No. 3, pp. 1707-1723, 2022. DOI: 10.32604/iasc.2022.028352
25 May 2022
Abstract: Joint entity and relation extraction (JERE) is an important foundation for unstructured knowledge extraction in natural language processing (NLP), so designing efficient algorithms for it has become a vital task. Although existing methods can extract entities and relations efficiently, their performance still leaves room for improvement. In this paper, we propose a novel model called Attention and Span-based Entity and Relation Transformer (ASpERT) for JERE. First, differing from the traditional approach that only uses the last hidden layer as the feature embedding, ASpERT concatenates the attention head information of each layer with the information of the last…
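The abstract is truncated here, so the exact ASpERT feature construction is not fully specified. As a rough illustration only, the sketch below shows one plausible way to combine per-layer attention-head information with the last hidden layer of a pretrained transformer; the pooling scheme, model name, and shapes are assumptions, not the authors' definition.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Hypothetical sketch (not the ASpERT implementation): build token features
# by concatenating an attention-derived summary from every layer with the
# last hidden layer, in the spirit of the abstract's description.
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModel.from_pretrained("bert-base-cased", output_attentions=True)

inputs = tokenizer("Marie Curie was born in Warsaw.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

last_hidden = outputs.last_hidden_state        # (1, seq_len, hidden_size)
attentions = outputs.attentions                # one (1, heads, seq_len, seq_len) tensor per layer

# For each layer, average over heads and sum over queries to get a scalar
# "attention received" score per token (an assumed pooling choice).
per_layer_received = [a.mean(dim=1).sum(dim=1) for a in attentions]   # each (1, seq_len)
attn_feature = torch.stack(per_layer_received, dim=-1)                # (1, seq_len, num_layers)

# Concatenate the attention summary with the last hidden layer to form
# the token representation that a span classifier could consume.
token_repr = torch.cat([last_hidden, attn_feature], dim=-1)           # (1, seq_len, hidden_size + num_layers)
print(token_repr.shape)
```

In a span-based JERE model such as SpERT-style architectures, representations like `token_repr` would then be pooled over candidate spans for entity classification and paired for relation classification; the details for ASpERT are given in the full paper.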