Deep Declarative Networks
Deep Declarative Networks (DDNs) embed mathematical optimization problems directly into neural network layers, enabling end-to-end learning for tasks that contain optimization subroutines. Current research focuses on efficient and stable backpropagation through optimization solvers, notably via implicit differentiation and bi-level optimization, and on exploiting problem structure for computational efficiency. The approach pays off in applications that need differentiable solutions to optimization problems, such as time series alignment and robust feature extraction, where it can improve model accuracy while reducing computational cost.
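The implicit-differentiation idea can be illustrated on a toy declarative node. The sketch below is a minimal, self-contained example (not the DDN library's API): it takes a node y*(x) = argmin_y f(x, y) with a hypothetical objective f(x, y) = 0.5(y - x)^2 + lam*y^2 chosen so the minimizer has a closed form, then recovers the gradient dy*/dx from the stationarity condition via the implicit function theorem rather than by unrolling the solver.

```python
import numpy as np

# Toy declarative node: y*(x) = argmin_y f(x, y)
# with f(x, y) = 0.5*(y - x)**2 + lam*y**2 (a hypothetical objective).
# Setting df/dy = 0 gives the closed-form minimizer y* = x / (1 + 2*lam).

def solve(x, lam):
    """Forward pass: return the minimizer of f over y (closed form here;
    in a real DDN this would be an iterative solver)."""
    return x / (1.0 + 2.0 * lam)

def implicit_grad(lam):
    """Backward pass via the implicit function theorem.
    Differentiating the stationarity condition f_y(x, y*(x)) = 0 gives
        dy*/dx = -f_yx / f_yy,
    which needs only second derivatives of f at the solution, not the
    solver's iteration history."""
    f_yy = 1.0 + 2.0 * lam   # d^2 f / dy^2
    f_yx = -1.0              # d^2 f / dy dx
    return -f_yx / f_yy

x, lam = 3.0, 0.5
y_star = solve(x, lam)
grad = implicit_grad(lam)
print(y_star, grad)  # 1.5 0.5
```

Because the gradient depends only on derivatives of the objective at the optimum, the same recipe applies when the forward pass is a black-box iterative solver, which is what makes declarative nodes practical inside larger networks.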