Representational Geometry

Representational geometry investigates how neural networks and brains encode information, focusing on the geometric and topological structure of learned representations. Current research examines how network architecture (e.g., activation functions such as ReLU and tanh), learning objectives (e.g., contrastive losses), and data augmentation strategies shape this geometry, and how that geometry in turn affects downstream task performance and model interpretability. These analyses, which often employ techniques such as representational similarity analysis and topological data analysis, aim to bridge the gap between model behavior and cognitive processes, offering insights into both artificial and biological intelligence.
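
To make the representational similarity analysis mentioned above concrete, the following is a minimal sketch (not taken from any specific paper): each model's activations over a common stimulus set are reduced to a representational dissimilarity matrix (RDM), and the two RDMs are compared with a rank correlation. The function names, array shapes, and the correlation-distance/Spearman choices are illustrative assumptions.

```python
# Minimal RSA sketch: compare two models' representational geometries.
# Assumes activation matrices of shape (n_stimuli, n_features); names are illustrative.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr


def rdm(activations: np.ndarray) -> np.ndarray:
    """Representational dissimilarity matrix: pairwise correlation distance
    between stimulus representations, returned in condensed (vector) form."""
    return pdist(activations, metric="correlation")


def rsa_score(acts_a: np.ndarray, acts_b: np.ndarray) -> float:
    """Second-order similarity: Spearman correlation between the two RDMs."""
    rho, _ = spearmanr(rdm(acts_a), rdm(acts_b))
    return float(rho)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical activations for 50 stimuli from two networks
    # (e.g., a ReLU model and a tanh model probed on the same inputs).
    acts_relu = rng.standard_normal((50, 128))
    acts_tanh = acts_relu @ rng.standard_normal((128, 64)) + 0.1 * rng.standard_normal((50, 64))
    print(f"RSA (Spearman rho between RDMs): {rsa_score(acts_relu, acts_tanh):.3f}")
```

Because the comparison operates on dissimilarity structure rather than raw activations, it can relate representations of different dimensionality, which is what makes such second-order analyses usable across architectures and between models and neural recordings.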

Papers