Functional Constrained Optimization
Functional constrained optimization focuses on finding optimal solutions while satisfying constraints that are themselves expressed as functions (for example, expected-loss or risk constraints) rather than simple bounds on the variables. Current research emphasizes efficient algorithms, such as variants of gradient descent and proximal methods, often applied within frameworks like the Alternating Direction Method of Multipliers (ADMM) or Stein Variational Gradient Descent (SVGD), and frequently combined with neural networks for function approximation. The field is central to problems across machine learning, including fair machine learning, trajectory optimization, and risk-averse resource allocation, where balancing competing objectives under constraints is paramount.
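To make the general setup concrete, below is a minimal sketch of one classical approach mentioned above in spirit (a dual/augmented-Lagrangian style method) for a toy functionally constrained problem: minimize f(x) subject to g(x) <= 0. The specific objective, constraint, and hyperparameters are illustrative assumptions, not drawn from any particular paper on this page.

```python
import numpy as np

# Toy problem (assumed for illustration):
#   minimize   f(x) = 0.5 * ||x - 2||^2
#   subject to g(x) = sum(x) - 1 <= 0

def f(x):
    return 0.5 * np.sum((x - 2.0) ** 2)

def grad_f(x):
    return x - 2.0

def g(x):
    return np.sum(x) - 1.0

def grad_g(x):
    return np.ones_like(x)

def augmented_lagrangian(x0, rho=10.0, outer_iters=20, inner_iters=500, lr=2e-2):
    """Sketch of an augmented-Lagrangian method for a single inequality constraint."""
    x, lam = x0.copy(), 0.0
    for _ in range(outer_iters):
        # Inner loop: gradient descent on the augmented Lagrangian
        #   L_rho(x, lam) = f(x) + (rho/2) * max(0, g(x) + lam/rho)^2 - lam^2/(2*rho)
        for _ in range(inner_iters):
            viol = max(0.0, g(x) + lam / rho)
            grad = grad_f(x) + rho * viol * grad_g(x)
            x -= lr * grad
        # Dual update: raise the multiplier in proportion to constraint violation.
        lam = max(0.0, lam + rho * g(x))
    return x, lam

x_opt, lam_opt = augmented_lagrangian(np.zeros(3))
print("x* =", x_opt, "g(x*) =", g(x_opt), "lambda =", lam_opt)
```

For this toy instance the constrained optimum is x = (1/3, 1/3, 1/3) with multiplier lambda = 5/3, so the printed values should be close to those; the same primal-descent/dual-ascent pattern underlies many of the more sophisticated methods (stochastic, proximal, or SVGD-based) studied in this area.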