Paper ID: 2408.06039
Spacetime $E(n)$-Transformer: Equivariant Attention for Spatio-temporal Graphs
Sergio G. Charles
We introduce an $E(n)$-equivariant Transformer architecture for spatio-temporal graph data. By imposing rotation, translation, and permutation equivariance as inductive biases in both space and time, we show that the Spacetime $E(n)$-Transformer (SET) outperforms purely spatial or temporal models that lack symmetry-preserving properties. We benchmark SET against these models on the charged $N$-body problem, a simple physical system with complex dynamics. While existing spatio-temporal graph neural networks focus on sequential modeling, we empirically demonstrate that leveraging underlying domain symmetries yields considerable improvements for modeling dynamical systems on graphs.
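To make the equivariance inductive bias concrete, the following is a minimal sketch (in PyTorch) of what an $E(n)$-equivariant attention layer on a graph can look like, in the spirit of EGNN-style message passing: messages depend only on invariant quantities (node features and squared distances), and coordinates are updated along relative position vectors. The class name, layer sizes, and attention gating here are illustrative assumptions, not the paper's actual SET implementation.

# Sketch of an E(n)-equivariant graph attention layer (EGNN-style).
# Hypothetical illustration; not the authors' SET code.
import torch
import torch.nn as nn

class EquivariantAttentionLayer(nn.Module):
    """Updates invariant node features h and equivariant coordinates x.

    Messages are built from E(n)-invariant inputs (features, squared
    distances); coordinates move along relative position vectors, so the
    layer is equivariant to rotations/translations of x and permutations
    of the nodes.
    """

    def __init__(self, feat_dim: int, hidden_dim: int = 64):
        super().__init__()
        self.msg_mlp = nn.Sequential(
            nn.Linear(2 * feat_dim + 1, hidden_dim), nn.SiLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.SiLU(),
        )
        self.attn = nn.Sequential(nn.Linear(hidden_dim, 1), nn.Sigmoid())
        self.coord_mlp = nn.Linear(hidden_dim, 1, bias=False)
        self.node_mlp = nn.Sequential(
            nn.Linear(feat_dim + hidden_dim, hidden_dim), nn.SiLU(),
            nn.Linear(hidden_dim, feat_dim),
        )

    def forward(self, h, x, edge_index):
        # h: [N, F] invariant features, x: [N, n] coordinates,
        # edge_index: [2, E] source/target node indices.
        src, dst = edge_index
        rel = x[src] - x[dst]                     # relative positions (equivariant)
        dist2 = (rel ** 2).sum(-1, keepdim=True)  # squared distances (invariant)
        m = self.msg_mlp(torch.cat([h[src], h[dst], dist2], dim=-1))
        m = self.attn(m) * m                      # attention-weighted messages
        # Aggregate invariant messages per target node.
        agg = torch.zeros(h.size(0), m.size(-1), device=h.device)
        agg.index_add_(0, dst, m)
        # Coordinate update: weighted sum of relative position vectors.
        dx = torch.zeros_like(x)
        dx.index_add_(0, dst, rel * self.coord_mlp(m))
        return h + self.node_mlp(torch.cat([h, agg], dim=-1)), x + dx

# Example: 5 charged particles in 3-D with a fully connected graph.
N, F = 5, 8
h, x = torch.randn(N, F), torch.randn(N, 3)
edge_index = torch.combinations(torch.arange(N), r=2).T
edge_index = torch.cat([edge_index, edge_index.flip(0)], dim=1)  # both directions
h_new, x_new = EquivariantAttentionLayer(F)(h, x, edge_index)

A spatio-temporal model in this vein would stack such equivariant spatial layers with a temporal attention mechanism over the node trajectories; the abstract's claim is that preserving these symmetries end to end is what drives the improvement on the charged $N$-body benchmark.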
Submitted: Aug 12, 2024