Representational Geometry
Representational geometry investigates how neural networks and brains encode information, focusing on the geometric and topological structure of learned representations. Current research explores how network architecture (e.g., activation functions like ReLU and Tanh), learning objectives (e.g., contrastive loss), and data augmentation strategies shape this geometry, impacting downstream task performance and model interpretability. These analyses, often employing techniques like representational similarity analysis and topological data analysis, aim to bridge the gap between model behavior and cognitive processes, offering insights into both artificial and biological intelligence.
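A minimal sketch of representational similarity analysis (RSA), one of the techniques mentioned above: each model's activations for a shared stimulus set are reduced to a representational dissimilarity matrix (RDM), and two geometries are compared by rank-correlating their RDMs. The arrays, dimensions, and function names below are illustrative assumptions, not drawn from any specific paper.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

def rdm(activations: np.ndarray) -> np.ndarray:
    """Condensed representational dissimilarity matrix: pairwise
    correlation distance between the stimulus representations (rows)."""
    return pdist(activations, metric="correlation")

def rsa_score(act_a: np.ndarray, act_b: np.ndarray) -> float:
    """Compare two representational geometries by Spearman-correlating
    the upper triangles of their RDMs."""
    rho, _ = spearmanr(rdm(act_a), rdm(act_b))
    return rho

# Example with random stand-ins: 50 stimuli represented by two layers
# (or models) with different numbers of units.
rng = np.random.default_rng(0)
layer_a = rng.normal(size=(50, 128))   # 50 stimuli x 128 units
layer_b = rng.normal(size=(50, 256))   # same stimuli, 256 units
print(f"RSA (Spearman rho between RDMs): {rsa_score(layer_a, layer_b):.3f}")
```

Because the comparison happens at the level of RDMs rather than raw activations, the two representations need not share dimensionality or units, which is what makes RSA usable across architectures and between models and neural recordings.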