ReLU Network
ReLU networks, neural networks built on the rectified linear unit activation function, are a central focus of deep learning research. Current efforts concentrate on understanding their theoretical properties, improving training efficiency, and enhancing their interpretability. Research explores their approximation capabilities, their generalization behavior (especially benign overfitting), and the impact of network architecture (depth, width, sparsity) on performance. These investigations advance both the theoretical foundations of deep learning and the development of more efficient and reliable machine learning applications across diverse fields.
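To make the definition concrete, here is a minimal sketch (not from any of the listed papers) of the ReLU activation and a forward pass through a fully connected ReLU network; the layer shapes and function names are illustrative assumptions.

```python
import numpy as np

def relu(x):
    # Rectified linear unit: max(0, x), applied elementwise.
    return np.maximum(0.0, x)

def relu_network_forward(x, weights, biases):
    # Fully connected network with ReLU after every hidden layer
    # and a linear output layer (a common convention, assumed here).
    for W, b in zip(weights[:-1], biases[:-1]):
        x = relu(x @ W + b)
    return x @ weights[-1] + biases[-1]

# Toy two-layer network, 3 -> 4 -> 1 (hypothetical shapes).
rng = np.random.default_rng(0)
weights = [rng.standard_normal((3, 4)), rng.standard_normal((4, 1))]
biases = [np.zeros(4), np.zeros(1)]
out = relu_network_forward(rng.standard_normal((5, 3)), weights, biases)
print(out.shape)  # (5, 1)
```

Because ReLU is piecewise linear, such a network computes a continuous piecewise-linear function of its input, which is what makes its approximation properties and the role of depth and width analytically tractable.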