Symmetry Breaking
Symmetry breaking, the disruption of a system's inherent symmetries, is a central theme in fields ranging from physics to machine learning, where research aims both to understand its mechanisms and to leverage it for better model performance and efficiency. Current work investigates symmetry breaking in neural networks, particularly its impact on optimization, and employs techniques such as relaxed equivariance and learnable G-biases to control and exploit symmetry in architectures including transformers, MLP-Mixers, and graph neural networks. These advances promise improvements in model design, generalization, and training speed across applications such as image recognition, physical-system modeling, and constraint programming.
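To make the relaxed-equivariance idea concrete, the minimal PyTorch sketch below pairs an exactly shift-equivariant circular convolution with a small learnable, position-dependent bias term that deliberately breaks the symmetry. The class and parameter names (`RelaxedShiftEquivariantLayer`, `break_bias`, `gate`) and the blending scheme are illustrative assumptions, not the construction of any particular paper.

```python
import torch
import torch.nn as nn

class RelaxedShiftEquivariantLayer(nn.Module):
    """Sketch: a 1-D layer that is exactly equivariant to cyclic shifts
    (via circular convolution) plus a learnable, position-dependent bias
    that deliberately breaks that symmetry (a G-bias-style term)."""

    def __init__(self, n: int, kernel_size: int = 3):
        super().__init__()
        # Circular convolution commutes with cyclic shifts of the input,
        # so this part is exactly equivariant to the cyclic group C_n.
        self.conv = nn.Conv1d(1, 1, kernel_size, padding=kernel_size // 2,
                              padding_mode="circular", bias=False)
        # One bias value per site: shifting the input no longer commutes
        # with the layer once this term is nonzero.
        self.break_bias = nn.Parameter(torch.zeros(n))
        # Scalar gate controlling how strongly the symmetry is broken.
        self.gate = nn.Parameter(torch.tensor(0.0))

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (batch, n)
        y = self.conv(x.unsqueeze(1)).squeeze(1)  # equivariant part
        return y + self.gate * self.break_bias    # symmetry-breaking part


# Quick check: with gate == 0 the layer commutes with cyclic shifts.
layer = RelaxedShiftEquivariantLayer(n=8)
x = torch.randn(2, 8)
shifted = torch.roll(x, 1, dims=-1)
assert torch.allclose(layer(shifted), torch.roll(layer(x), 1, dims=-1),
                      atol=1e-6)
```

Initializing the gate at zero starts the layer fully equivariant; training can then grow the symmetry-breaking term only where the data rewards it, which is the basic intuition behind relaxed equivariance and G-bias-style approaches.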