Permutation Invariance
Permutation invariance is the property that a model's output is unchanged when its inputs are reordered. It is a crucial concept in machine learning for improving model efficiency, robustness, and interpretability. Current research focuses on building permutation invariance into a range of architectures, including neural networks, tree ensembles, and graph neural networks, often via techniques such as input shuffling, symmetric pooling (e.g. averaging), or specialized message-passing schemes. This research is significant because it yields more efficient models, better generalization, and a clearer understanding of model behavior, with applications spanning diverse fields such as drug discovery, control systems, and time series forecasting.
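The symmetric-pooling idea mentioned above can be sketched in a few lines. The following is a minimal, hypothetical Deep Sets-style example (the function name and shapes are illustrative, not from any cited paper): a shared transform is applied to every set element, and averaging over the element axis makes the output independent of input order.

```python
import numpy as np

def permutation_invariant_model(x, w):
    """Hypothetical sketch: shared per-element transform, then mean pooling.

    x: (n_elements, d_in) set of inputs; w: (d_in, d_out) shared weights.
    Mean over the element axis is symmetric, so the output does not
    depend on the order of the rows of x.
    """
    h = np.tanh(x @ w)      # same transform applied to every element
    return h.mean(axis=0)   # symmetric pooling -> permutation invariance

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 3))
w = rng.normal(size=(3, 4))

out = permutation_invariant_model(x, w)
out_shuffled = permutation_invariant_model(x[rng.permutation(5)], w)
assert np.allclose(out, out_shuffled)  # reordering inputs leaves output unchanged
```

Any symmetric aggregator (sum, mean, max) in place of the mean would preserve the same invariance; the choice mainly affects expressivity and scaling with set size.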