Relation Aware
Relation-aware methods aim to improve machine learning models by explicitly incorporating the relationships between data elements, addressing a limitation of approaches that treat data points in isolation. Current research focuses on novel architectures, such as graph neural networks and transformers augmented with relation encoders, that capture these relationships across diverse data types, including images, text, and knowledge graphs. Attending to relational context improves performance on tasks ranging from object detection and scene graph generation to knowledge graph completion and natural language processing, advancing the capabilities of AI systems overall.
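As a rough illustration of the "transformers augmented with relation encoders" idea, the sketch below shows a single-head self-attention layer in which a learned embedding for the relation type between each pair of elements biases the attention scores. This is a minimal, generic example; the class name, parameter names, and relation-type setup are assumptions for illustration, not the mechanism of any particular paper.

```python
import torch
import torch.nn as nn


class RelationAwareSelfAttention(nn.Module):
    """Single-head self-attention where a learned per-relation embedding
    biases the pairwise attention scores (hypothetical illustration)."""

    def __init__(self, dim: int, num_relations: int):
        super().__init__()
        self.q_proj = nn.Linear(dim, dim)
        self.k_proj = nn.Linear(dim, dim)
        self.v_proj = nn.Linear(dim, dim)
        # One embedding vector per relation type between elements i and j.
        self.rel_emb = nn.Embedding(num_relations, dim)
        self.scale = dim ** -0.5

    def forward(self, x: torch.Tensor, rel_ids: torch.Tensor) -> torch.Tensor:
        # x:       (batch, n, dim)  element features (tokens, region features, nodes)
        # rel_ids: (batch, n, n)    integer relation type between elements i and j
        q = self.q_proj(x)                       # (b, n, d)
        k = self.k_proj(x)                       # (b, n, d)
        v = self.v_proj(x)                       # (b, n, d)
        r = self.rel_emb(rel_ids)                # (b, n, n, d)

        # Content score q_i . k_j plus relation score q_i . r_ij.
        content = torch.einsum("bid,bjd->bij", q, k)
        relation = torch.einsum("bid,bijd->bij", q, r)
        attn = torch.softmax((content + relation) * self.scale, dim=-1)
        return torch.einsum("bij,bjd->bid", attn, v)


if __name__ == "__main__":
    # Toy usage: 4 elements, 3 assumed relation types (e.g. none / left-of / above).
    layer = RelationAwareSelfAttention(dim=16, num_relations=3)
    x = torch.randn(2, 4, 16)
    rel_ids = torch.randint(0, 3, (2, 4, 4))
    out = layer(x, rel_ids)
    print(out.shape)  # torch.Size([2, 4, 16])
```

The same pattern transfers to other settings: for scene graphs the relation types might encode spatial layouts between object regions, while for knowledge graphs they would correspond to edge labels.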