ReLU Regression

ReLU regression studies regression with neural networks built on the rectified linear unit (ReLU) activation, with the goals of improving efficiency, accuracy, and robustness. Current research explores variants such as higher-order ReLU networks and ReLU-based Kolmogorov-Arnold networks, investigates their theoretical properties, and develops optimization algorithms that go beyond standard stochastic gradient descent. These advances deepen our understanding of neural network behavior and yield improved solutions for regression problems in science and engineering, including the numerical solution of partial differential equations.
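As a concrete illustration of the basic setting (a sketch, not taken from any particular paper below), the following fits a one-hidden-layer ReLU network to a simple regression target by full-batch gradient descent on a mean-squared-error loss. The target y = |x| is chosen because a ReLU network can represent it exactly (|x| = relu(x) + relu(-x)); the layer width, learning rate, and step count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1-D regression data for a target a ReLU network represents exactly:
# y = |x| = relu(x) + relu(-x).
X = np.linspace(-2.0, 2.0, 200).reshape(-1, 1)
y = np.abs(X)

# One hidden layer with ReLU activation (sizes are illustrative).
n_hidden = 16
W1 = rng.normal(scale=0.5, size=(1, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=(n_hidden, 1))
b2 = np.zeros(1)

lr = 0.05
for step in range(5000):
    # Forward pass.
    z = X @ W1 + b1            # pre-activations
    h = np.maximum(z, 0.0)     # ReLU
    pred = h @ W2 + b2
    err = pred - y

    # Backward pass: gradients of the mean-squared-error loss.
    grad_pred = 2.0 * err / len(X)
    gW2 = h.T @ grad_pred
    gb2 = grad_pred.sum(axis=0)
    grad_h = grad_pred @ W2.T
    grad_z = grad_h * (z > 0)  # ReLU derivative is the 0/1 indicator
    gW1 = X.T @ grad_z
    gb1 = grad_z.sum(axis=0)

    # Plain full-batch gradient-descent update.
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

mse = float(np.mean((np.maximum(X @ W1 + b1, 0.0) @ W2 + b2 - y) ** 2))
print(f"final MSE: {mse:.6f}")
```

The non-smooth ReLU derivative handled in the backward pass (the 0/1 indicator on the pre-activations) is one reason this topic studies optimization algorithms beyond standard stochastic gradient descent.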

Papers