Activation Function
Activation functions are crucial components of neural networks, introducing nonlinearity to enable the learning of complex patterns from data. Current research focuses on developing novel activation functions, including those with learnable parameters, and exploring their impact within various architectures like Kolmogorov-Arnold Networks and transformers. These efforts aim to improve model performance, efficiency, and interpretability across diverse applications, from image classification and generation to solving partial differential equations and formal verification tasks. The ongoing search for optimal activation functions is driving significant advancements in the field of deep learning.
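To make the idea of an activation function with learnable parameters concrete, here is a minimal sketch (not taken from any of the surveyed papers): a Swish-style activation whose slope parameter is trained jointly with the network weights. The class name `LearnableSwish`, the parameter name `beta`, and the initial value of 1.0 are illustrative assumptions.

```python
# Illustrative sketch, assuming a PyTorch setup: an activation function
# f(x) = x * sigmoid(beta * x) where beta is a trainable parameter.
import torch
import torch.nn as nn


class LearnableSwish(nn.Module):
    """Swish-style activation with a learnable slope parameter (hypothetical example)."""

    def __init__(self, init_beta: float = 1.0):
        super().__init__()
        # Registering beta as nn.Parameter lets the optimizer update it
        # alongside the layer weights during backpropagation.
        self.beta = nn.Parameter(torch.tensor(init_beta))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Nonlinearity: the output is not a linear function of x,
        # which is what allows stacked layers to model complex patterns.
        return x * torch.sigmoid(self.beta * x)


# Usage: drop the learnable activation between linear layers.
model = nn.Sequential(
    nn.Linear(16, 32),
    LearnableSwish(),
    nn.Linear(32, 10),
)
out = model(torch.randn(4, 16))  # beta receives gradients like any other weight
```

Because `beta` is an `nn.Parameter`, it is optimized during training rather than fixed by hand, which is the basic mechanism behind the learnable activation functions mentioned above.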