Huiyu Sun*, Ralph Grishman
Computer Systems Science and Engineering, Vol.43, No.3, pp. 861-870, 2022, DOI:10.32604/csse.2022.030759
- 09 May 2022
Abstract Log-linear models and, more recently, neural network models used for supervised relation extraction require substantial amounts of training data and time, limiting their portability to new relations and domains. To this end, we propose a training representation based on the dependency paths between entities in a dependency tree, which we call lexicalized dependency paths (LDPs). We show that this representation is fast, efficient and transparent. We further propose representations utilizing entity types and their subtypes to refine our model and alleviate the data sparsity problem. We apply lexicalized dependency paths to supervised learning using the …
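To make the LDP idea concrete, here is a minimal sketch (not the paper's implementation) of extracting a lexicalized dependency path between two entity head tokens with spaCy: the shortest path through the dependency tree, interleaving dependency labels with the lemmas along the way. The path format, the example sentence, and the function name are illustrative assumptions.

```python
import spacy

nlp = spacy.load("en_core_web_sm")  # assumes the small English model is installed

def lexicalized_dependency_path(e1, e2):
    """Lexicalized dependency path between two entity head tokens:
    the shortest path through the dependency tree, interleaving
    dependency labels with the lemmas encountered along it."""
    up1 = [e1] + list(e1.ancestors)              # chain from e1 up to the root
    up2 = [e2] + list(e2.ancestors)              # chain from e2 up to the root
    ids2 = {t.i for t in up2}
    lca = next(t for t in up1 if t.i in ids2)    # lowest common ancestor

    left = []                                    # arcs climbing from e1 to the LCA
    for t in up1:
        if t.i == lca.i:
            break
        left += [t.lemma_, f"<-{t.dep_}<-"]
    right = []                                   # arcs from the LCA down to e2
    for t in up2:
        if t.i == lca.i:
            break
        right = [f"->{t.dep_}->", t.lemma_] + right
    return " ".join(left + [lca.lemma_] + right)

doc = nlp("Smith, the chairman of Acme, resigned yesterday.")
smith, acme = doc[0], doc[5]                     # entity head tokens
print(lexicalized_dependency_path(smith, acme))
# e.g. "smith ->appos-> chairman ->prep-> of ->pobj-> acme" (parse-dependent)
```

Because each path is a short, human-readable string, such features can be counted, compared and inspected directly, which is one way to read the paper's claim that the representation is fast, efficient and transparent.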