Deep ReLU Networks
Deep ReLU networks, characterized by their use of the rectified linear unit (ReLU) activation function, are a central focus of deep learning research, with current efforts concentrating on understanding their approximation capabilities, generalization properties, and optimization dynamics. Researchers are exploring architectures such as deep operator networks and "nested" networks, and developing algorithms such as component-based sketching to improve training efficiency and generalization performance. These investigations aim to provide a stronger theoretical foundation for the remarkable empirical success of deep ReLU networks, ultimately leading to more robust and efficient models for diverse applications.
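To make the object of study concrete, here is a minimal sketch of a fully connected deep ReLU network's forward pass in NumPy. The layer sizes and random weights are purely illustrative assumptions, not taken from any paper discussed here; the point is only that ReLU is applied after each hidden layer while the output layer stays linear.

```python
import numpy as np

def relu(x):
    # Rectified linear unit: elementwise max(0, x)
    return np.maximum(0.0, x)

def deep_relu_forward(x, weights, biases):
    """Forward pass of a fully connected deep ReLU network.

    ReLU follows every layer except the last, which is left linear
    (the usual convention for regression-style outputs).
    """
    for W, b in zip(weights[:-1], biases[:-1]):
        x = relu(x @ W + b)
    return x @ weights[-1] + biases[-1]

# Hypothetical two-hidden-layer network: 4 -> 8 -> 8 -> 1.
rng = np.random.default_rng(0)
dims = [4, 8, 8, 1]
weights = [rng.standard_normal((m, n)) for m, n in zip(dims[:-1], dims[1:])]
biases = [np.zeros(n) for n in dims[1:]]

y = deep_relu_forward(rng.standard_normal((3, 4)), weights, biases)
print(y.shape)  # one scalar output per input row
```

Because ReLU is piecewise linear, the function such a network computes is itself continuous and piecewise linear, which is the starting point for much of the approximation-theory work mentioned above.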