Open Access

ARTICLE


Lexicalized Dependency Paths Based Supervised Learning for Relation Extraction

by Huiyu Sun*, Ralph Grishman

New York University, New York, 10012, USA

* Corresponding Author: Huiyu Sun.

Computer Systems Science and Engineering 2022, 43(3), 861-870. https://doi.org/10.32604/csse.2022.030759

Abstract

Log-linear models and, more recently, neural network models used for supervised relation extraction require substantial amounts of training data and time, limiting their portability to new relations and domains. To this end, we propose a training representation based on the dependency paths between entities in a dependency tree, which we call lexicalized dependency paths (LDPs). We show that this representation is fast, efficient, and transparent. We further propose representations utilizing entity types and their subtypes to refine our model and alleviate the data sparsity problem. We apply lexicalized dependency paths to supervised learning using the ACE corpus and show that they can achieve a performance level similar to that of other state-of-the-art methods and even surpass them in several categories.
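For illustration, the following is a minimal sketch of what extracting a lexicalized dependency path could look like. It is not the authors' implementation: spaCy as the parser, the helper name, and the textual path format are all our assumptions. The idea is to take the dependency-tree path between two entity mentions and keep both the arc labels and the words along it.

```python
# Sketch only (assumptions: spaCy parser, illustrative path format);
# the paper's actual LDP representation may differ in detail.
import spacy

nlp = spacy.load("en_core_web_sm")

def dependency_path(e1, e2):
    """Words and arc labels on the dependency-tree path between tokens e1 and e2."""
    up1 = [e1] + list(e1.ancestors)   # chain from e1 up to the root
    up2 = [e2] + list(e2.ancestors)   # chain from e2 up to the root
    shared = set(up2)
    lca = next(t for t in up1 if t in shared)  # lowest common ancestor
    # Ascend from e1 to the LCA, recording each token and its arc label.
    path = [f"{t.text} <-{t.dep_}-" for t in up1[:up1.index(lca)]]
    path.append(lca.text)                      # the lexical pivot word
    # Descend from the LCA down to e2.
    path += [f"-{t.dep_}-> {t.text}" for t in reversed(up2[:up2.index(lca)])]
    return " ".join(path)

doc = nlp("John works for IBM in New York.")
print(dependency_path(doc[0], doc[3]))
# e.g. "John <-nsubj- works -prep-> for -pobj-> IBM"
```

Keeping the words ("lexicalizing" the path) rather than just the arc labels is what makes such a representation transparent: each learned path reads as a human-interpretable pattern.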

Keywords


Cite This Article

APA Style
Sun, H., & Grishman, R. (2022). Lexicalized dependency paths based supervised learning for relation extraction. Computer Systems Science and Engineering, 43(3), 861-870. https://doi.org/10.32604/csse.2022.030759
Vancouver Style
Sun H, Grishman R. Lexicalized dependency paths based supervised learning for relation extraction. Comput Syst Sci Eng. 2022;43(3):861-870. https://doi.org/10.32604/csse.2022.030759
IEEE Style
H. Sun and R. Grishman, “Lexicalized Dependency Paths Based Supervised Learning for Relation Extraction,” Comput. Syst. Sci. Eng., vol. 43, no. 3, pp. 861-870, 2022. https://doi.org/10.32604/csse.2022.030759



Copyright © 2022 The Author(s). Published by Tech Science Press.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.