Relational Representation
Relational representation focuses on encoding knowledge by explicitly modeling the relationships between entities, with the aim of improving machine learning models' ability to reason and generalize. Current research emphasizes novel architectures, including graph neural networks and models built on large language models, that learn effective relational representations from diverse data sources such as text, images, and knowledge graphs. The area matters because stronger relational reasoning underpins advances in knowledge graph completion, question answering, program synthesis, and robotic planning, ultimately yielding more robust and capable AI systems.
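As a concrete illustration of one common form of relational representation used in knowledge graph completion, the sketch below implements a minimal TransE-style translational embedding: entities and relations share a vector space, and a triple (head, relation, tail) is scored by how closely head + relation approximates tail. All entity names, the dimensionality, and the simplified update loop are illustrative assumptions, not taken from any of the papers listed here.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8

# Randomly initialized embeddings for a toy knowledge graph.
entities = {name: rng.normal(size=dim)
            for name in ["paris", "france", "tokyo", "japan"]}
relations = {"capital_of": rng.normal(size=dim)}

def score(head, relation, tail):
    """Plausibility of (head, relation, tail); higher (less negative) is better."""
    return -np.linalg.norm(entities[head] + relations[relation] - entities[tail])

# Simplified training: nudge head and tail embeddings so that
# paris + capital_of ≈ france (real TransE also uses negative sampling
# and updates relation vectors; this is just the core idea).
lr = 0.1
for _ in range(200):
    diff = entities["paris"] + relations["capital_of"] - entities["france"]
    entities["paris"] -= lr * diff
    entities["france"] += lr * diff

# The observed triple should now outscore a corrupted one.
assert score("paris", "capital_of", "france") > score("paris", "capital_of", "japan")
```

Graph neural networks generalize this idea by learning entity representations from message passing over the graph's relational structure rather than from independent lookup tables.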
Papers
HIP Network: Historical Information Passing Network for Extrapolation Reasoning on Temporal Knowledge Graph
Yongquan He, Peng Zhang, Luchen Liu, Qi Liang, Wenyuan Zhang, Chuang Zhang
From Reals to Logic and Back: Inventing Symbolic Vocabularies, Actions, and Models for Planning from Raw Data
Naman Shah, Jayesh Nagpal, Pulkit Verma, Siddharth Srivastava