Knowledge Graph Completion
Knowledge graph completion (KGC) aims to infer missing relationships within knowledge graphs, improving their completeness and utility. Current research emphasizes integrating diverse knowledge sources, such as commonsense knowledge, external ontologies, and large language models (LLMs), into KGC models, often employing graph neural networks, transformer architectures, and embedding methods. These advances improve the accuracy and efficiency of KGC, benefiting applications such as question answering, recommendation systems, and risk assessment in cybersecurity. There is also a growing focus on the interpretability and trustworthiness of KGC models, addressing the "black box" nature of many existing approaches.
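As a minimal illustration of the embedding-based approach mentioned above, the sketch below scores candidate tail entities for a link-prediction query (head, relation, ?) using a TransE-style translation score. The toy triples, entity names, and embedding dimension are hypothetical and not taken from any of the papers listed here; in practice the embeddings would be trained, e.g., with a margin-based ranking loss.

```python
import numpy as np

# Toy knowledge graph: (head, relation, tail) triples. Names are illustrative only.
triples = [("paris", "capital_of", "france"),
           ("berlin", "capital_of", "germany"),
           ("france", "located_in", "europe")]

entities = sorted({e for h, _, t in triples for e in (h, t)})
relations = sorted({r for _, r, _ in triples})
ent_idx = {e: i for i, e in enumerate(entities)}
rel_idx = {r: i for i, r in enumerate(relations)}

dim = 16
rng = np.random.default_rng(0)
# Randomly initialised embeddings; a real KGC model would learn these from the graph.
E = rng.normal(size=(len(entities), dim))
R = rng.normal(size=(len(relations), dim))

def score(h, r, t):
    """TransE-style plausibility score: -||h + r - t|| (higher means more plausible)."""
    return -np.linalg.norm(E[ent_idx[h]] + R[rel_idx[r]] - E[ent_idx[t]])

def predict_tail(h, r, k=3):
    """Rank all entities as candidate tails for the query (h, r, ?) and return the top k."""
    ranked = sorted(entities, key=lambda t: score(h, r, t), reverse=True)
    return ranked[:k]

print(predict_tail("paris", "capital_of"))
```

Link prediction in this setting reduces to ranking every entity by the scoring function and evaluating with metrics such as mean reciprocal rank or Hits@k; GNN- and transformer-based models replace the simple translation score with learned encoders but keep the same ranking formulation.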
Papers
BESS: Balanced Entity Sampling and Sharing for Large-Scale Knowledge Graph Completion
Alberto Cattaneo, Daniel Justus, Harry Mellor, Douglas Orr, Jerome Maloberti, Zhenying Liu, Thorin Farnsworth, Andrew Fitzgibbon, Blazej Banaszewski, Carlo Luschi
Relation-dependent Contrastive Learning with Cluster Sampling for Inductive Relation Prediction
Jianfeng Wu, Sijie Mai, Haifeng Hu
Joint Multilingual Knowledge Graph Completion and Alignment
Vinh Tong, Dat Quoc Nguyen, Trung Thanh Huynh, Tam Thanh Nguyen, Quoc Viet Hung Nguyen, Mathias Niepert
MoSE: Modality Split and Ensemble for Multimodal Knowledge Graph Completion
Yu Zhao, Xiangrui Cai, Yike Wu, Haiwei Zhang, Ying Zhang, Guoqing Zhao, Ning Jiang