Convergence Time

Convergence time, the time an algorithm needs to reach a solution of acceptable quality, is a critical performance metric across diverse machine learning applications, particularly in distributed settings such as federated learning. Current research seeks to minimize it by addressing system and statistical heterogeneity through adaptive client sampling, by optimizing communication protocols (e.g., one-shot methods and update quantization), and by leveraging tools such as extreme value theory for worst-case prediction. Faster convergence directly improves the efficiency and scalability of machine learning algorithms, with particular impact on real-time applications, resource-constrained environments, and large-scale data analysis.
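
To make the communication-side lever concrete, the sketch below shows uniform quantization of a client's model update before upload, one common way to shrink per-round payloads and thus wall-clock convergence time when the network is the bottleneck. This is a minimal illustrative example, not the method of any specific paper; the function names (`quantize`, `dequantize`) and the toy round are assumptions made for illustration.

```python
import numpy as np

def quantize(update, num_bits=8):
    """Uniformly quantize a model update to num_bits per coordinate.

    Returns integer codes plus the (scale, offset) the server needs to
    dequantize. Fewer bits per coordinate means fewer bytes per round,
    which can shorten wall-clock convergence in bandwidth-limited
    federated settings (at the cost of quantization error).
    """
    lo, hi = float(update.min()), float(update.max())
    levels = 2 ** num_bits - 1
    scale = (hi - lo) / levels if hi > lo else 1.0
    codes = np.round((update - lo) / scale)
    return codes.astype(np.uint8 if num_bits <= 8 else np.uint16), scale, lo

def dequantize(codes, scale, lo):
    """Reconstruct an approximate update from its quantized codes."""
    return codes.astype(np.float32) * scale + lo

# Toy communication round: a client compresses its local update before upload.
rng = np.random.default_rng(0)
update = rng.normal(size=10_000).astype(np.float32)  # hypothetical local update
codes, scale, lo = quantize(update, num_bits=8)
recovered = dequantize(codes, scale, lo)
print("bytes sent:", codes.nbytes, "vs float32:", update.nbytes)  # ~4x smaller
print("max abs error:", np.abs(update - recovered).max())
```

In practice, the bit width trades per-round communication time against per-round progress, so the setting that minimizes total convergence time depends on network bandwidth and model tolerance to quantization noise.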

Papers