Rectified Linear Unit
The Rectified Linear Unit (ReLU) is a fundamental activation function in artificial neural networks, key to efficient training and strong representational power. Current research focuses on understanding and improving ReLU's properties, including addressing its limitations in adversarial robustness and exploring its role in architectures such as deep survival models (SurvReLU) and efficient private inference networks (xMLP). This work matters because it deepens the theoretical understanding of neural network behavior while improving the accuracy, efficiency, and interpretability of applications ranging from medical prognosis to large-scale recommendation systems.
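For readers unfamiliar with the function itself, below is a minimal sketch of ReLU as it is commonly defined, f(x) = max(0, x), together with its subgradient; the helper names `relu` and `relu_grad` are illustrative choices, not from any of the papers collected here.

```python
import numpy as np

def relu(x: np.ndarray) -> np.ndarray:
    """Rectified Linear Unit: passes positive inputs through, zeroes out negatives."""
    return np.maximum(0.0, x)

def relu_grad(x: np.ndarray) -> np.ndarray:
    """Subgradient of ReLU: 1 for positive inputs, 0 otherwise (0 chosen at x = 0)."""
    return (x > 0).astype(x.dtype)

if __name__ == "__main__":
    x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
    print(relu(x))       # [0.  0.  0.  0.5 2. ]
    print(relu_grad(x))  # [0. 0. 0. 1. 1.]
```

The piecewise-linear form is what makes ReLU cheap to compute and gives it a non-saturating gradient on positive inputs, which is the usual explanation for the efficient training mentioned above.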
27 papers