Soft Equivariance

Soft equivariance in neural networks aims to retain the benefits of symmetry-based inductive biases while avoiding the drawbacks of strictly enforcing equivariance, which can hinder model flexibility and performance when the underlying symmetries are only approximate or broken. Current research focuses on learning the degree of equivariance from data, using regularizers, differentiable approximations, and architectural modifications such as relaxed group convolutions and residual pathway priors. These approaches yield more adaptable models that handle real-world data with imperfect or mixed symmetries, improving generalization and efficiency in applications including image classification, object detection, and dynamical systems modeling.
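
As a concrete illustration of the relaxed-group-convolution idea, the sketch below builds a lifting convolution over the cyclic group C4 (rotations by multiples of 90°): each rotated copy of a shared filter gets its own learnable scalar weight, and an L2 penalty on the spread of those weights serves as the kind of regularizer that lets the degree of equivariance be learned from data. This is a minimal sketch under stated assumptions, not the construction from any particular paper; the class name `RelaxedC4Conv`, the scalar per-rotation weights, and the penalty form are all illustrative choices.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RelaxedC4Conv(nn.Module):
    """Lifting convolution over C4 (0/90/180/270 degree rotations).

    A strictly equivariant layer applies the *same* filter at every
    rotation. Here each rotation g gets its own scalar weight w[g],
    so the layer can deviate from exact C4 equivariance; penalizing
    the spread of w pulls it back toward the strict case.
    """

    def __init__(self, in_ch, out_ch, k=3):
        super().__init__()
        self.base = nn.Parameter(torch.randn(out_ch, in_ch, k, k) * 0.1)
        # One relaxation weight per group element; all equal => strictly equivariant.
        self.w = nn.Parameter(torch.ones(4))

    def forward(self, x):
        outs = []
        for g in range(4):
            # Rotate the shared filter by g * 90 degrees in the spatial dims.
            filt = torch.rot90(self.base, g, dims=(2, 3))
            outs.append(self.w[g] * F.conv2d(x, filt, padding=1))
        return torch.stack(outs, dim=1)  # shape: (B, |C4|, out_ch, H, W)

    def equivariance_penalty(self):
        # Regularizer measuring how far the per-rotation weights drift apart;
        # zero exactly when the layer is strictly C4-equivariant.
        return ((self.w - self.w.mean()) ** 2).sum()

layer = RelaxedC4Conv(in_ch=3, out_ch=8)
y = layer(torch.randn(2, 3, 32, 32))          # (2, 4, 8, 32, 32)
reg = layer.equivariance_penalty()            # add lam * reg to the task loss
```

A large penalty coefficient recovers (approximately) strict equivariance, while a small one lets the per-rotation weights drift apart where the data's symmetry is broken, which is the trade-off the methods surveyed above aim to learn automatically.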

Papers