Non-Monotonic Activation Functions

Non-monotonic activation functions, unlike their monotonic counterparts (e.g., ReLU), are non-linearities whose output both decreases and increases over parts of the input domain, a property that can enhance neural network performance. Current research focuses on designing novel non-monotonic functions, such as variants of Swish and GELU, and on understanding their impact on training dynamics and generalization, particularly within specific architectures such as Input-Convex Neural Networks (ICNNs). These investigations aim to improve model accuracy, efficiency, and explainability across applications including image classification and drug discovery by exploiting the distinctive shape of non-monotonic activation functions.
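The non-monotonic behaviour is easy to see numerically: Swish and GELU dip below zero for moderately negative inputs before flattening out, whereas ReLU never decreases. The sketch below (a minimal illustration, not taken from any of the listed papers; the function names and the tanh approximation of GELU are standard but chosen here for convenience) evaluates the three functions on a grid of negative-to-positive inputs.

```python
import numpy as np

def relu(x):
    """ReLU: monotonic baseline, never decreases as x grows."""
    return np.maximum(0.0, x)

def swish(x, beta=1.0):
    """Swish: x * sigmoid(beta * x). Non-monotonic for beta > 0,
    with a single minimum on the negative axis (about -1.28 for beta = 1)."""
    return x / (1.0 + np.exp(-beta * x))

def gelu(x):
    """GELU (tanh approximation): x * Phi(x). Like Swish, it dips below
    zero for moderately negative inputs before approaching zero."""
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

# Evaluate on a grid spanning the non-monotonic region.
x = np.linspace(-4.0, 1.0, 9)
print("x     :", np.round(x, 2))
print("ReLU  :", np.round(relu(x), 3))   # flat at 0, then increasing
print("Swish :", np.round(swish(x), 3))  # decreases, reaches a minimum, then increases
print("GELU  :", np.round(gelu(x), 3))   # same qualitative shape as Swish
```

Printing the values shows the sign change in the slope of Swish and GELU around x ≈ -1, which is exactly the "decreasing region" that monotonic functions like ReLU lack.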

Papers