Zhen-Yu Chen1, Feng-Chi Liu2, Xin Wang3, Cheng-Hsiung Lee1, Ching-Sheng Lin1,*
CMC-Computers, Materials & Continua, Vol.82, No.3, pp. 4287-4300, 2025, DOI:10.32604/cmc.2025.061661
06 March 2025
Abstract: In the domain of knowledge graph embedding, conventional approaches typically transform entities and relations into continuous vector spaces. However, parameter efficiency becomes increasingly crucial when dealing with large-scale knowledge graphs that contain vast numbers of entities and relations. In particular, resource-intensive embeddings often increase computational costs and may limit scalability and adaptability in practical environments, such as low-resource settings or real-world applications. This paper explores an approach to knowledge graph representation learning that leverages small, reserved entity and relation sets for parameter-efficient embedding. We introduce a hierarchical attention network designed to refine …