Hessian Eigenvectors
Hessian eigenvectors, which represent the principal directions of curvature in a function's loss landscape, are increasingly used to analyze and improve optimization algorithms, particularly in machine learning. Current research leverages Hessian eigenvector information for efficient distributed optimization in federated learning, where algorithms such as SHED and Q-SHED use curvature information to reduce communication overhead and improve convergence rates. This work also extends to manifold learning, where Hessian-based methods enhance smoothing spline algorithms and improve dimensionality reduction techniques. These advances have significant implications for training large-scale neural networks and for optimizing resource-constrained distributed systems.
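To make the idea concrete, the sketch below (not from any of the listed papers; the `hessian` helper and the test function are illustrative) computes a finite-difference Hessian of a simple quadratic and extracts its eigenvectors, which are exactly the curvature directions the paragraph describes:

```python
import numpy as np

def hessian(f, x, eps=1e-5):
    """Finite-difference Hessian of a scalar function f at point x (illustrative only)."""
    n = x.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            e_i = np.zeros(n); e_i[i] = eps
            e_j = np.zeros(n); e_j[j] = eps
            # Second-order central-style difference for the (i, j) entry.
            H[i, j] = (f(x + e_i + e_j) - f(x + e_i) - f(x + e_j) + f(x)) / eps**2
    return H

# Example: f(x) = 3*x0^2 + x1^2 has Hessian diag(6, 2).
f = lambda x: 3 * x[0]**2 + x[1]**2
H = hessian(f, np.array([0.0, 0.0]))

# Columns of eigvecs are the curvature directions; eigvals are the curvatures.
eigvals, eigvecs = np.linalg.eigh(H)
```

For a quadratic this difference scheme is exact up to floating-point error, so `eigvals` recovers the curvatures 2 and 6; in large-scale settings, methods like those above avoid forming the full Hessian and instead approximate its dominant eigenvectors via Hessian-vector products.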
Papers
November 1, 2023
May 18, 2023
February 10, 2023
February 11, 2022