Paper ID: 2201.09145
GLassoformer: A Query-Sparse Transformer for Post-Fault Power Grid Voltage Prediction
Yunling Zheng, Carson Hu, Guang Lin, Meng Yue, Bao Wang, Jack Xin
We propose GLassoformer, a novel and efficient transformer architecture that leverages group Lasso regularization to reduce the number of queries in the standard self-attention mechanism. Thanks to the sparsified queries, GLassoformer is more computationally efficient than standard transformers. On the power grid post-fault voltage prediction task, GLassoformer achieves remarkably better prediction accuracy and stability than many existing benchmark algorithms.
Submitted: Jan 22, 2022
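
The abstract does not spell out the regularizer, but a common way to sparsify whole query vectors is a group Lasso (L2,1) penalty in which each query row forms one group: penalizing per-query L2 norms drives entire queries to zero, and the surviving queries shrink the attention score matrix. Below is a minimal PyTorch sketch under that assumption; the names group_lasso_penalty and prune_queries, the weight lam, and the threshold tau are illustrative, not from the paper.

```python
import torch

def group_lasso_penalty(Q: torch.Tensor) -> torch.Tensor:
    # Q: (batch, num_queries, d_model). Each query vector is one group;
    # the penalty sums per-query L2 norms (an L2,1 norm), which pushes
    # entire query rows toward zero rather than individual entries.
    return Q.norm(p=2, dim=-1).sum(dim=-1).mean()

def prune_queries(Q: torch.Tensor, tau: float = 1e-3) -> torch.Tensor:
    # Queries whose norm fell below tau can be dropped, so the attention
    # score matrix becomes (kept_queries x num_keys) instead of
    # (num_queries x num_keys). Shown for batch size 1; how the paper
    # recovers outputs for dropped positions is beyond this sketch.
    keep = Q.norm(p=2, dim=-1) > tau      # (batch, num_queries) mask
    return Q[:, keep[0], :]

# Usage sketch: add the penalty to the task loss during training.
Q = torch.randn(1, 96, 64, requires_grad=True)
lam = 1e-2                                # hypothetical penalty weight
loss = lam * group_lasso_penalty(Q)       # + task_loss in a real loop
loss.backward()
```

Because the group structure aligns with whole queries, the sparsity pattern learned in training translates directly into fewer rows of attention computation at inference, which is the source of the efficiency gain claimed above.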