Sparse Penalty

Sparse penalty methods aim to identify the most relevant variables in high-dimensional datasets by shrinking less important coefficients to exactly zero, improving model interpretability and efficiency. Current research focuses on developing and refining algorithms such as the smoothing proximal gradient method and the alternating direction method of multipliers (ADMM), often within federated or decentralized learning frameworks, to handle non-convex penalties and large-scale datasets efficiently, particularly in quantile regression. These advances are crucial for analyzing data from sources such as the Internet of Things (IoT) and for improving the accuracy and speed of model fitting in fields such as genetics. Furthermore, research explores efficient feature-reduction techniques to mitigate the computational cost associated with these methods.
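
The "shrink to zero" behaviour comes from the proximal operator of the penalty: for the L1 (lasso) penalty this is soft-thresholding, which a proximal gradient loop applies after each gradient step. The sketch below is a minimal illustration of that idea under our own assumptions (the names `soft_threshold` and `ista_lasso` and the toy data are hypothetical, and it uses the convex L1 penalty and squared loss rather than the non-convex penalties or quantile loss discussed above).

```python
import numpy as np

def soft_threshold(z, tau):
    """Proximal operator of the L1 penalty: entries with |z| <= tau become exactly zero."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def ista_lasso(X, y, lam, step=None, n_iter=500):
    """Proximal gradient (ISTA) for min_b 0.5*||y - X b||^2 + lam*||b||_1."""
    n, p = X.shape
    if step is None:
        # 1 / Lipschitz constant of the gradient of the smooth part (||X||_2^2)
        step = 1.0 / np.linalg.norm(X, 2) ** 2
    b = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y)                          # gradient of the smooth loss
        b = soft_threshold(b - step * grad, step * lam)   # proximal (shrinkage) step
    return b

# Toy high-dimensional example: only the first 3 of 50 coefficients are truly nonzero.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 50))
true_b = np.zeros(50)
true_b[:3] = [3.0, -2.0, 1.5]
y = X @ true_b + 0.1 * rng.standard_normal(100)

b_hat = ista_lasso(X, y, lam=5.0)
print("indices of nonzero estimated coefficients:", np.flatnonzero(b_hat))
```

Replacing soft-thresholding with the proximal map of a non-convex penalty (e.g. SCAD or MCP), or the squared loss with a smoothed quantile loss, yields the kinds of variants the papers below study; ADMM-based solvers instead split the loss and penalty into separate subproblems coupled by a dual update.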

Papers