Regularization Path
Regularization paths trace a model's solutions as the regularization parameter varies, revealing how regularization strength, model complexity, and solution sparsity are intertwined. Current research focuses on computing these paths efficiently for a range of models, including regularized linear regression (LASSO, elastic net), deep neural networks, and optimal transport problems, often using algorithms such as LARS and block-coordinate descent. Understanding regularization paths provides insight into model behavior, simplifies hyperparameter tuning, and supports the development of more efficient and interpretable machine learning algorithms, with direct benefits for generalization, sparsity, and robustness in applied settings.
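As a concrete illustration, the sketch below traces a LASSO regularization path on synthetic data using scikit-learn's lasso_path (coordinate descent); lars_path is the LARS-based alternative mentioned above. The dataset and variable names are illustrative, not taken from any of the papers listed here.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import lasso_path

# Synthetic regression problem with a handful of truly informative features.
X, y, true_coef = make_regression(
    n_samples=200, n_features=20, n_informative=5, coef=True, random_state=0
)

# Trace the LASSO path: solutions are computed along a decreasing grid of
# regularization strengths alpha, from one that zeroes out every coefficient
# down toward (near) zero regularization.
alphas, coefs, _ = lasso_path(X, y, n_alphas=100)

# coefs has shape (n_features, n_alphas): each column is the solution at one alpha.
# Watching the number of nonzero coefficients grow as alpha shrinks shows how
# the path links regularization strength to solution sparsity.
for alpha, col in zip(alphas, coefs.T):
    n_active = np.sum(col != 0)
    print(f"alpha={alpha:8.4f}  active features={n_active}")
```

Plotting each row of coefs against alphas gives the familiar path diagram, where coefficients enter the active set one by one as regularization weakens.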