ReLU Network
ReLU networks, neural networks built on the rectified linear unit activation function, are a central focus of deep learning research. Current work concentrates on understanding their theoretical properties, improving training efficiency, and enhancing their interpretability. Research explores their approximation capabilities, their generalization behavior (especially benign overfitting), and the impact of architectural choices such as depth, width, and sparsity on performance. These investigations advance both the theoretical foundations of deep learning and the development of more efficient and reliable machine learning applications across diverse fields.
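As a minimal illustration of the architecture described above, the sketch below (in Python with NumPy; the layer sizes, initialization, and function names are illustrative assumptions, not taken from any particular paper) shows a forward pass of a fully connected ReLU network: an alternation of affine maps with the elementwise activation ReLU(x) = max(0, x), with a linear output layer.

```python
import numpy as np

def relu(x):
    # Rectified linear unit, applied elementwise: max(0, x).
    return np.maximum(0.0, x)

def relu_network_forward(x, weights, biases):
    """Forward pass of a fully connected ReLU network.

    Hidden layers apply an affine map followed by ReLU;
    the final layer is left linear (a common choice for regression heads).
    """
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = relu(W @ h + b)
    return weights[-1] @ h + biases[-1]

# Example: depth 3 with hidden widths 16 and 8 (arbitrary choices).
rng = np.random.default_rng(0)
sizes = [4, 16, 8, 1]
weights = [rng.standard_normal((m, n)) / np.sqrt(n)
           for n, m in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(m) for m in sizes[1:]]
y = relu_network_forward(rng.standard_normal(4), weights, biases)
print(y.shape)  # (1,)
```

Depth, width, and sparsity in the research questions above correspond directly to the number of layers, the hidden-layer sizes, and the fraction of nonzero weights in such a model.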