Non-Monotonic Activation Functions
Non-monotonic activation functions, unlike their monotonic counterparts (e.g., ReLU), introduce non-linearity through curves that contain both increasing and decreasing regions, which can improve neural network performance. Current research focuses on designing novel non-monotonic functions, such as variants of Swish and GELU, and on understanding their impact on training dynamics and generalization, particularly within specific architectures such as Input-Convex Neural Networks (ICNNs). These investigations aim to improve model accuracy, efficiency, and explainability across applications including image classification and drug discovery by leveraging the distinctive properties of non-monotonic activations.
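As a concrete illustration of the non-monotonic shape described above, the following is a minimal sketch (assuming NumPy) of Swish and the common tanh approximation of GELU, together with a simple finite-difference check that their slopes change sign; it is not taken from any particular paper.

```python
import numpy as np

def swish(x, beta=1.0):
    # Swish: x * sigmoid(beta * x). For beta = 1 it dips below zero for
    # negative inputs and then rises back toward zero, so it is non-monotonic.
    return x / (1.0 + np.exp(-beta * x))

def gelu(x):
    # GELU (tanh approximation): roughly x * Phi(x), where Phi is the
    # standard normal CDF. It has a shallow negative trough for x < 0.
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

# Crude non-monotonicity check: do successive differences change sign?
xs = np.linspace(-5.0, 5.0, 1001)
for name, f in [("swish", swish), ("gelu", gelu)]:
    diffs = np.diff(f(xs))
    monotone = (diffs >= 0).all() or (diffs <= 0).all()
    print(name, "monotonic" if monotone else "non-monotonic")
```

Both functions report "non-monotonic": Swish (with beta = 1) has its minimum near x ≈ -1.28 and GELU near x ≈ -0.75, in contrast to ReLU, which never decreases.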