Temporal Knowledge Graph Completion

Temporal Knowledge Graph Completion (TKGC) aims to predict missing facts in knowledge graphs that evolve over time, improving the accuracy and completeness of dynamic knowledge representations. Current research focuses on developing sophisticated embedding models, often employing geometric operations in various spaces (Euclidean, hyperbolic, hypercomplex) or leveraging neural architectures such as graph neural networks and transformers, sometimes incorporating pre-trained language models for enhanced semantic understanding. These advances enable more accurate prediction of missing temporal links, benefiting applications ranging from event forecasting to historical analysis and question answering. The field is also actively exploring efficient continual learning methods to handle the ever-growing and dynamic nature of temporal knowledge graphs.
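To make the embedding-based approach concrete, the sketch below shows a minimal TTransE-style translational scorer, in which a timestamp embedding translates the subject and relation toward the object. This is an illustrative sketch only: the class name, dimensions, entity/relation/timestamp counts, and the PyTorch framing are assumptions, not a specific published implementation.

```python
import torch
import torch.nn as nn

class TTransEScorer(nn.Module):
    """Minimal TTransE-style scorer: a fact (s, r, o, t) is plausible when
    e_s + e_r + e_t lies close to e_o in Euclidean space."""

    def __init__(self, num_entities, num_relations, num_timestamps, dim=100):
        super().__init__()
        self.ent = nn.Embedding(num_entities, dim)
        self.rel = nn.Embedding(num_relations, dim)
        self.time = nn.Embedding(num_timestamps, dim)
        for emb in (self.ent, self.rel, self.time):
            nn.init.xavier_uniform_(emb.weight)

    def score(self, s, r, o, t):
        # Higher score (smaller distance) means a more plausible fact.
        translated = self.ent(s) + self.rel(r) + self.time(t)
        return -torch.norm(translated - self.ent(o), p=2, dim=-1)

# Example: rank all entities as the missing object of (s, r, ?, t).
model = TTransEScorer(num_entities=5000, num_relations=200, num_timestamps=365)
s, r, t = torch.tensor([42]), torch.tensor([7]), torch.tensor([100])
candidates = torch.arange(5000)
scores = model.score(s.expand(5000), r.expand(5000), candidates, t.expand(5000))
top5 = scores.topk(5).indices  # highest-scoring candidate objects
```

In practice such a scorer would be trained with a margin or cross-entropy loss over corrupted facts; the geometric and neural variants mentioned above replace this simple translation with operations in hyperbolic or hypercomplex spaces, or with learned message passing over the temporal graph.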

Papers