Functional Constrained Optimization

Functional constrained optimization seeks optimal solutions subject to constraints that are themselves expressed as functions (e.g., requiring g(x) ≤ 0), or that act over function spaces, rather than as simple bounds on variables. Current research emphasizes efficient algorithms, such as variants of gradient descent and proximal methods, often applied within frameworks like the Alternating Direction Method of Multipliers (ADMM) or Stein Variational Gradient Descent (SVGD), and incorporating techniques like neural networks for function approximation. This field is crucial for diverse problems across machine learning, including fair machine learning, trajectory optimization, and risk-averse resource allocation, where balancing competing objectives under constraints is paramount.
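
The gradient-based approach mentioned above can be illustrated with a minimal sketch. The problem, objective, and quadratic-penalty scheme below are illustrative assumptions, not taken from any specific paper: we minimize f(x) = (x − 2)² subject to the functional constraint g(x) = x − 1 ≤ 0, whose constrained optimum is x* = 1, by running gradient descent on a penalized objective and progressively increasing the penalty weight.

```python
# Minimal sketch (illustrative, not from any cited paper):
# quadratic-penalty gradient descent for
#   minimize f(x) = (x - 2)^2   subject to g(x) = x - 1 <= 0
# The constraint is active at the solution, so x* = 1.

def f_grad(x):
    return 2.0 * (x - 2.0)

def g(x):
    return x - 1.0

def penalized_grad(x, rho):
    # Gradient of f(x) + (rho/2) * max(0, g(x))^2; here dg/dx = 1.
    return f_grad(x) + rho * max(0.0, g(x))

x = 0.0
rho = 1.0
for outer in range(8):                 # outer loop: tighten the penalty
    lr = 0.5 / (2.0 + rho)             # step size scaled to the curvature
    for _ in range(2000):              # inner loop: plain gradient descent
        x -= lr * penalized_grad(x, rho)
    rho *= 10.0                        # heavier penalty -> closer to feasible

print(round(x, 3))                     # close to the constrained optimum 1.0
```

Increasing `rho` geometrically is the standard penalty-method trick: each inner solve lands at the minimizer of the penalized objective, which approaches the constrained optimum as the penalty dominates; methods like ADMM and augmented Lagrangians refine this idea by adding dual variables so `rho` need not grow unboundedly.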

Papers