Approximate Curvature
Approximate curvature methods aim to bring the benefits of second-order optimization, namely faster convergence and, in some settings, improved generalization, to large-scale machine learning problems where computing and inverting the full curvature matrix is computationally prohibitive. Current research focuses on efficient approximations, such as Kronecker-Factored Approximate Curvature (K-FAC) and sketching-based methods, often applied within algorithms like Adam, SGD, and various Newton-type methods, and across diverse architectures including neural networks (especially transformers and CNNs) and physics-informed neural networks. These advances make sophisticated optimization techniques applicable to previously intractable problems, improving training speed and model performance in areas such as computer vision, deep generative models, and scientific computing.
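To make the Kronecker-factored idea concrete, the following is a minimal NumPy sketch of a K-FAC-style preconditioner for a single fully connected layer. It is an illustration under stated assumptions, not any library's implementation: the layer sizes, batch size, damping value, and the names a, g, A, G are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for one fully connected layer.
n_in, n_out, batch = 64, 32, 256

# a: layer inputs; g: back-propagated gradients w.r.t. pre-activations.
# In practice these come from forward/backward passes; here they are random.
a = rng.standard_normal((batch, n_in))
g = rng.standard_normal((batch, n_out))

# K-FAC approximates the layer's Fisher block as a Kronecker product
# F ~ A (x) G, with A = E[a a^T] and G = E[g g^T].
A = a.T @ a / batch            # (n_in, n_in) input covariance
G = g.T @ g / batch            # (n_out, n_out) gradient covariance

# Tikhonov damping keeps the small factors well-conditioned.
damping = 1e-3
A_inv = np.linalg.inv(A + damping * np.eye(n_in))
G_inv = np.linalg.inv(G + damping * np.eye(n_out))

# Average minibatch gradient of the weight matrix, shape (n_out, n_in);
# each per-example gradient is the outer product g_i a_i^T.
grad_W = g.T @ a / batch

# Kronecker structure turns the full preconditioning step
# (A (x) G)^{-1} vec(grad_W) into vec(G^{-1} grad_W A^{-1}),
# i.e. two small inverses instead of one of size (n_in * n_out).
precond_grad = G_inv @ grad_W @ A_inv
print(precond_grad.shape)      # (32, 64)
```

The key design point is that the two factors have sides n_in and n_out rather than n_in times n_out, so storing and inverting them stays cheap even when the full curvature matrix would be far too large to form.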