Subspace Approximation
Subspace approximation seeks a low-dimensional representation of high-dimensional data that minimizes the error incurred in representing the original points. Current research focuses heavily on efficient algorithms, particularly coreset constructions, often built on techniques such as leverage score sampling and stochastic gradient descent. A strong emphasis is on extending these methods from the commonly used $\ell_2$ norm to general $\ell_p$ norms. These advances improve the scalability and efficiency of many machine learning tasks, including regression and dimensionality reduction, and carry over to distributed computing environments.
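As a concrete illustration of the two ingredients named above, the sketch below (an assumption-laden minimal example, not any specific paper's method) computes the optimal rank-$k$ subspace under the $\ell_2$ norm via the SVD, by the Eckart-Young-Mirsky theorem, together with the rank-$k$ leverage scores that drive row-sampling coreset constructions. The function names `best_subspace_l2` and `rank_k_leverage_scores` are illustrative, not from the literature cited.

```python
import numpy as np

def best_subspace_l2(A, k):
    """Best rank-k subspace under the l2 (Frobenius) norm.

    By the Eckart-Young-Mirsky theorem, the optimal subspace is
    spanned by the top-k right singular vectors of A.
    """
    _, _, Vt = np.linalg.svd(A, full_matrices=False)
    V = Vt[:k].T                           # orthonormal basis, shape (d, k)
    err = np.linalg.norm(A - A @ V @ V.T)  # residual of projecting rows onto span(V)
    return V, err

def rank_k_leverage_scores(A, k):
    """Rank-k leverage score of row i: squared norm of row i of U_k,
    the top-k left singular vectors. The scores sum to exactly k and,
    normalized, give the sampling distribution used in leverage-score
    coreset constructions for subspace approximation."""
    U, _, _ = np.linalg.svd(A, full_matrices=False)
    return np.sum(U[:, :k] ** 2, axis=1)

rng = np.random.default_rng(0)
# Synthetic data of exact rank 10 embedded in 50 dimensions.
A = rng.standard_normal((200, 10)) @ rng.standard_normal((10, 50))

V, err = best_subspace_l2(A, 10)       # err is ~0: rank-10 data fits exactly
scores = rank_k_leverage_scores(A, 10)
probs = scores / scores.sum()          # row-sampling probabilities
```

Extending this beyond $\ell_2$ is exactly where the cited work comes in: for general $\ell_p$, no closed-form SVD-style solution exists, which motivates the sampling and iterative (e.g., stochastic-gradient) approaches.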