Paper ID: 2408.05283
MUSE: Multi-Knowledge Passing on the Edges, Boosting Knowledge Graph Completion
Pengjie Liu
Knowledge Graph Completion (KGC) aims to predict missing information in (head entity)-[relation]-(tail entity) triplets. Deep Neural Networks have achieved significant progress in the relation prediction task. However, most existing KGC methods focus on a single type of feature (e.g., entity IDs) and sub-graph aggregation, which cannot fully exploit the features available in the Knowledge Graph (KG) and neglect the injection of external semantic knowledge. To address these problems, we propose MUSE, a knowledge-aware reasoning model that learns a tailored embedding space along three dimensions for missing relation prediction through a multi-knowledge representation learning mechanism. MUSE consists of three parallel components: 1) Prior Knowledge Learning, which enhances the semantic representation of triplets by fine-tuning BERT; 2) Context Message Passing, which enhances the contextual messages of the KG; and 3) Relational Path Aggregation, which enhances the representation of the path from the head entity to the tail entity. Our experimental results show that MUSE significantly outperforms other baselines on four public datasets, e.g., with over 5.50% improvement in H@1 and 4.20% improvement in MRR on the NELL995 dataset. The code and all datasets will be released via https://github.com/NxxTGT/MUSE.
Submitted: Aug 9, 2024
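
To make the three parallel components in the abstract concrete, the PyTorch sketch below shows one plausible way to wire a BERT-based prior-knowledge encoder, a KG context message-passing layer, and a relational-path aggregator into a relation classifier. All class names, dimensions, and the concatenation-based fusion are assumptions for illustration, not the authors' released implementation (see the GitHub link above).

```python
# A minimal sketch (not the released MUSE code) of the three parallel branches
# described in the abstract. All module names, shapes, and the fusion scheme
# are assumptions for illustration.
import torch
import torch.nn as nn


class PriorKnowledgeEncoder(nn.Module):
    """Encodes the serialized (head, relation, tail) text, e.g. with a fine-tuned BERT."""
    def __init__(self, text_encoder: nn.Module, hidden_dim: int, out_dim: int):
        super().__init__()
        self.text_encoder = text_encoder          # assumed: a HuggingFace-style BERT model
        self.proj = nn.Linear(hidden_dim, out_dim)

    def forward(self, input_ids, attention_mask):
        # Use the [CLS] representation of the triplet text as its semantic embedding.
        cls = self.text_encoder(input_ids=input_ids,
                                attention_mask=attention_mask).last_hidden_state[:, 0]
        return self.proj(cls)


class ContextMessagePassing(nn.Module):
    """One simplified round of neighbourhood aggregation over the KG."""
    def __init__(self, dim: int):
        super().__init__()
        self.linear = nn.Linear(2 * dim, dim)

    def forward(self, node_emb, neighbor_emb):
        # node_emb: [B, D]; neighbor_emb: [B, N, D] (padded neighbourhood embeddings)
        context = neighbor_emb.mean(dim=1)
        return torch.relu(self.linear(torch.cat([node_emb, context], dim=-1)))


class RelationalPathAggregator(nn.Module):
    """Aggregates the relation sequence on a head-to-tail path with a GRU."""
    def __init__(self, dim: int):
        super().__init__()
        self.gru = nn.GRU(dim, dim, batch_first=True)

    def forward(self, path_relation_emb):
        # path_relation_emb: [B, L, D]; keep the final hidden state as the path embedding.
        _, h_n = self.gru(path_relation_emb)
        return h_n.squeeze(0)


class MUSESketch(nn.Module):
    """Fuses the three branch representations and scores candidate relations."""
    def __init__(self, text_encoder, bert_dim, dim, num_relations):
        super().__init__()
        self.prior = PriorKnowledgeEncoder(text_encoder, bert_dim, dim)
        self.context = ContextMessagePassing(dim)
        self.path = RelationalPathAggregator(dim)
        self.classifier = nn.Linear(3 * dim, num_relations)

    def forward(self, input_ids, attention_mask, node_emb, neighbor_emb, path_emb):
        z = torch.cat([
            self.prior(input_ids, attention_mask),
            self.context(node_emb, neighbor_emb),
            self.path(path_emb),
        ], dim=-1)
        return self.classifier(z)   # logits over candidate relations for the triplet
```

This concatenate-then-classify fusion is only one simple choice; the paper's actual combination of the three representation spaces may differ.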