Layered ReLU Networks
Layered ReLU networks, which apply the rectified linear unit activation ReLU(x) = max(0, x) at each layer, are a central focus of neural network research. Current efforts concentrate on understanding their approximation capabilities, optimization challenges, and generalization properties. Investigations cover a range of architectures, including deep and shallow networks, recurrent networks, and variants incorporating techniques such as Bayesian methods and quadratic formulations for specific tasks like combinatorial optimization. These studies aim both to deepen the theoretical understanding of ReLU networks and to improve their practical application in diverse fields, from solving partial differential equations to image classification and beyond.
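
As a concrete illustration, the following is a minimal NumPy sketch of the forward pass of a two-layer (one hidden layer) ReLU network. The function names, layer sizes, and random weights are illustrative assumptions, not taken from any particular paper.

    import numpy as np

    def relu(x):
        # Rectified linear unit: ReLU(x) = max(0, x), applied elementwise.
        return np.maximum(0.0, x)

    def two_layer_relu_forward(x, W1, b1, W2, b2):
        # One hidden ReLU layer followed by a linear output layer:
        # f(x) = W2 @ relu(W1 @ x + b1) + b2
        hidden = relu(W1 @ x + b1)
        return W2 @ hidden + b2

    # Illustrative usage with random weights (input dim 4, hidden dim 8, output dim 1).
    rng = np.random.default_rng(0)
    W1, b1 = rng.normal(size=(8, 4)), np.zeros(8)
    W2, b2 = rng.normal(size=(1, 8)), np.zeros(1)
    x = rng.normal(size=4)
    print(two_layer_relu_forward(x, W1, b1, W2, b2))

Stacking additional hidden layers of this form yields a deep ReLU network; the depth-versus-width trade-offs of such stacks underlie many of the approximation results studied in this area.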