GPU Energy
GPU energy consumption is a critical concern, particularly given the growing reliance on GPUs for computationally intensive workloads such as AI model training and inference. Research focuses on improving energy efficiency through strategies including dynamic frequency scaling, power capping, and input data manipulation, each aiming to reduce power draw without significantly degrading performance. These efforts are driven by the need to cut the environmental impact and operational costs of large-scale GPU deployments; recent work reports substantial energy savings (up to 75%) across applications ranging from large language model inference to deep learning training.