Paper ID: 2410.12032

MLPerf Power: Benchmarking the Energy Efficiency of Machine Learning Systems from μWatts to MWatts for Sustainable AI

Arya Tschand (1), Arun Tejusve Raghunath Rajan (2), Sachin Idgunji (3), Anirban Ghosh (3), Jeremy Holleman (4), Csaba Kiraly (5), Pawan Ambalkar (6), Ritika Borkar (3), Ramesh Chukka (7), Trevor Cockrell (6), Oliver Curtis (8), Grigori Fursin (9), Miro Hodak (10), Hiwot Kassa (11), Anton Lokhmotov (12), Dejan Miskovic (3), Yuechao Pan (13), Manu Prasad Manmathan (7), Liz Raymond (6), Tom St. John (14), Arjun Suresh (15), Rowan Taubitz (8), Sean Zhan (8), Scott Wasson (16), David Kanter (16), Vijay Janapa Reddi (1) ((1) Harvard University, (2) Self / Meta, (3) NVIDIA, (4) UNC Charlotte / Syntiant, (5) Codex, (6) Dell, (7) Intel, (8) SMC, (9) FlexAI / cTuning, (10) AMD, (11) Meta, (12) KRAI, (13) Google, (14) Decompute, (15) GATE Overflow, (16) MLCommons)

Rapid adoption of machine learning (ML) technologies has led to a surge in power consumption across diverse systems, from tiny IoT devices to massive datacenter clusters. Benchmarking the energy efficiency of these systems is crucial for optimization, but it presents novel challenges due to the variety of hardware platforms, workload characteristics, and system-level interactions. This paper introduces MLPerf Power, a comprehensive benchmarking methodology for evaluating the energy efficiency of ML systems at power levels ranging from microwatts to megawatts. Developed by a consortium of industry professionals from more than 20 organizations, MLPerf Power establishes rules and best practices to ensure comparability across diverse architectures. We use representative workloads from the MLPerf benchmark suite to collect 1,841 reproducible measurements from 60 systems across the entire range of ML deployment scales. Our analysis reveals trade-offs between performance, complexity, and energy efficiency across this wide range of systems, providing actionable insights for designing optimized ML solutions from the smallest edge devices to the largest cloud infrastructures. This work emphasizes the importance of energy efficiency as a key metric in the evaluation and comparison of ML systems, laying the foundation for future research in this critical area. We discuss the implications for developing sustainable AI solutions and for standardizing energy efficiency benchmarking of ML systems.

Submitted: Oct 15, 2024