Communication Acceleration

Communication acceleration research focuses on speeding up information exchange, primarily to reduce computational and communication costs in distributed systems and to improve user experience. Current efforts concentrate on novel algorithms, such as those based on federated learning and primal-dual methods, and on leveraging large language models (LLMs) to enhance predictive text entry and to improve the quality of experience in text-streaming services. These advances promise more efficient machine learning, faster and more accessible communication for individuals with motor impairments, and better real-time interactive applications.
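
To make the distributed-systems side concrete, the sketch below (not taken from any of the listed papers) shows one common way communication cost is reduced in federated learning: each client runs several local gradient steps between synchronizations, so fewer communication rounds are needed. All names and parameters (num_clients, local_steps, the toy least-squares data) are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch: local-update federated averaging on a toy
# least-squares problem. Clients compute locally for several steps and
# only synchronize once per round, reducing communication rounds,
# which are typically the dominant cost in distributed training.

rng = np.random.default_rng(0)
num_clients, dim = 8, 5
local_steps, rounds, lr = 10, 20, 0.05

# Each client holds its own (hypothetical) data A_i, b_i.
A = [rng.normal(size=(30, dim)) for _ in range(num_clients)]
b = [A_i @ rng.normal(size=dim) + 0.1 * rng.normal(size=30) for A_i in A]

def local_grad(x, A_i, b_i):
    """Gradient of the client's loss 0.5 * ||A_i x - b_i||^2."""
    return A_i.T @ (A_i @ x - b_i)

x_global = np.zeros(dim)
for r in range(rounds):
    client_models = []
    for A_i, b_i in zip(A, b):
        x = x_global.copy()
        for _ in range(local_steps):      # local computation, no communication
            x -= (lr / 30) * local_grad(x, A_i, b_i)
        client_models.append(x)
    # One communication round: clients send models, server averages them.
    x_global = np.mean(client_models, axis=0)

    avg_loss = sum(0.5 * np.linalg.norm(A_i @ x_global - b_i) ** 2
                   for A_i, b_i in zip(A, b)) / num_clients
    print(f"round {r + 1:2d}: avg loss {avg_loss:.4f}")
```

With local_steps > 1, each synchronization carries the benefit of many local updates, so the same loss is reached with fewer communication rounds than plain distributed gradient descent; the accelerated methods surveyed here pursue the same goal with stronger theoretical guarantees.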

Papers