ReLU Regression
ReLU regression studies neural networks built on the rectified linear unit (ReLU) activation for regression tasks, with the goals of improving efficiency, accuracy, and robustness. Current research explores variants such as higher-order ReLU networks and ReLU-based Kolmogorov-Arnold networks, analyzes their theoretical properties, and develops optimization algorithms that go beyond standard stochastic gradient descent. These advances deepen our understanding of neural network behavior and yield improved methods for solving partial differential equations and other regression problems in scientific and engineering applications.
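In its simplest form, ReLU regression fits a single rectified linear unit, y ≈ relu(w·x + b), by gradient descent on the squared loss. The sketch below is a minimal, hypothetical illustration (the function and parameter names are our own, not from any specific paper); it uses a correlation-based initialization, which for Gaussian inputs tends to start the weights in the well-behaved basin of the loss landscape.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def fit_relu_neuron(X, y, lr=0.1, epochs=500):
    """Fit y ~ relu(X @ w + b) by full-batch gradient descent on squared loss.

    Illustrative sketch: w starts from the input/label correlation,
    which aligns it with the planted direction for Gaussian inputs.
    """
    n, d = X.shape
    w = X.T @ y / n                        # correlation-based initialization
    b = 0.0
    for _ in range(epochs):
        z = X @ w + b
        residual = relu(z) - y
        grad_z = residual * (z > 0) / n    # subgradient through the ReLU
        w -= lr * (X.T @ grad_z)           # gradient step on the weights
        b -= lr * grad_z.sum()             # gradient step on the bias
    return w, b

# Usage: recover a planted single-neuron ReLU model from noiseless data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
w_true, b_true = np.array([1.0, -2.0, 0.5]), 0.3
y = relu(X @ w_true + b_true)
w, b = fit_relu_neuron(X, y)
mse = np.mean((relu(X @ w + b) - y) ** 2)
```

The only non-obvious step is the subgradient mask `(z > 0)`: inactive samples contribute no gradient, so the update is effectively a least-squares step on the currently active half-space.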
Papers
September 18, 2024
August 9, 2024
July 14, 2024
March 2, 2024
July 13, 2023
May 16, 2023
March 3, 2023
February 13, 2023
February 2, 2022
December 16, 2021