Language-Based Representation
Language-based representation research focuses on encoding and using natural language to improve AI tasks, with the primary aim of bridging the gap between human-understandable instructions and machine execution. Current efforts center on integrating language models with other modalities (e.g., vision, robotics), using techniques such as contrastive learning and the embedding of language features into 3D representations, often built on transformer architectures. This work advances embodied AI, improves the safety and explainability of autonomous systems, and enables more robust, context-aware natural language processing across diverse languages and applications.
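As a concrete illustration of the contrastive learning mentioned above, the sketch below aligns paired text and image embeddings with a symmetric InfoNCE-style objective, in the spirit of CLIP-like vision-language alignment. The function name, embedding dimension, and temperature value are illustrative assumptions rather than details drawn from any specific paper surveyed here.

```python
# Minimal sketch of contrastive alignment between language and image embeddings.
# All names, dimensions, and the temperature value are illustrative assumptions.
import torch
import torch.nn.functional as F


def contrastive_alignment_loss(text_emb: torch.Tensor,
                               image_emb: torch.Tensor,
                               temperature: float = 0.07) -> torch.Tensor:
    """Symmetric InfoNCE loss over a batch of paired (text, image) embeddings."""
    # Normalize so that dot products become cosine similarities.
    text_emb = F.normalize(text_emb, dim=-1)
    image_emb = F.normalize(image_emb, dim=-1)

    # Pairwise similarity matrix: logits[i, j] = sim(text_i, image_j).
    logits = text_emb @ image_emb.t() / temperature

    # Matching (text, image) pairs lie on the diagonal.
    targets = torch.arange(logits.size(0), device=logits.device)

    # Cross-entropy in both directions: text-to-image and image-to-text.
    loss_t2i = F.cross_entropy(logits, targets)
    loss_i2t = F.cross_entropy(logits.t(), targets)
    return 0.5 * (loss_t2i + loss_i2t)


if __name__ == "__main__":
    # Toy batch: 8 paired text/image embeddings of dimension 512.
    text = torch.randn(8, 512)
    image = torch.randn(8, 512)
    print(contrastive_alignment_loss(text, image).item())
```

In practice, the two embedding batches would come from a language encoder and a vision (or 3D-scene) encoder trained jointly, so that minimizing this loss pulls matching text-image pairs together and pushes mismatched pairs apart.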