ReLU Function
The rectified linear unit (ReLU), defined as ReLU(x) = max(0, x), is a simple yet widely used activation function in neural networks and remains a subject of active research into its properties and applications. Current work investigates ReLU's effect on network injectivity, its role in specific algorithms (e.g., blind image deblurring and stochastic bandits), and its optimization for efficient and secure inference, including methods that reduce the number of ReLU operations a network requires. These efforts aim both to deepen the theoretical understanding of ReLU networks and to improve their performance in applications ranging from image processing to broader machine learning.
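For quick reference, a minimal Python sketch of the function itself (using NumPy; the leaky variant shown alongside it is a common illustrative alternative, not something discussed above):

import numpy as np

def relu(x):
    # Element-wise rectified linear unit: max(0, x)
    return np.maximum(0, x)

def leaky_relu(x, alpha=0.01):
    # Common variant that keeps a small slope for negative inputs
    return np.where(x > 0, x, alpha * x)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))        # [0.   0.   0.   1.5  3. ]
print(leaky_relu(x))  # [-0.02  -0.005  0.     1.5    3.  ]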