Hidden Cost
Hidden Cost research focuses on optimizing the trade-off between performance and resource consumption in computational models and algorithms. Current efforts concentrate on developing cost-effective alternatives to expensive models such as GPT-4, on designing efficient architectures for specific applications (e.g., IoT security, automatic speech recognition), and on improving the efficiency of existing methods through techniques such as active learning, ensemble selection, and early stopping. This work addresses the critical need for resource-efficient solutions across diverse settings, from AI model training and deployment to resource-constrained IoT devices and automated machine learning.
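Of the efficiency techniques listed above, early stopping is the simplest to illustrate: training is halted once a validation metric stops improving, saving the compute that further epochs would have consumed. Below is a minimal, self-contained sketch (the `EarlyStopper` class, its parameters, and the simulated loss values are hypothetical, not drawn from any of the papers listed here):

```python
class EarlyStopper:
    """Stop training after `patience` epochs without meaningful improvement."""

    def __init__(self, patience=3, min_delta=0.0):
        self.patience = patience      # epochs of tolerance before stopping
        self.min_delta = min_delta    # minimum change that counts as improvement
        self.best = float("inf")
        self.bad_epochs = 0

    def should_stop(self, val_loss):
        if val_loss < self.best - self.min_delta:
            self.best = val_loss      # new best: reset the counter
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1      # no improvement this epoch
        return self.bad_epochs >= self.patience


# Simulated validation losses: improvement stalls after the fourth epoch.
losses = [1.0, 0.8, 0.7, 0.65, 0.66, 0.67, 0.68, 0.69]

stopper = EarlyStopper(patience=3)
stopped_at = None
for epoch, loss in enumerate(losses):
    if stopper.should_stop(loss):
        stopped_at = epoch
        break

print(stopped_at)  # stops at epoch 6, skipping the remaining epoch
```

The `min_delta` threshold guards against stopping on noise-level fluctuations; in practice the hidden cost saved is every epoch of training (and its energy/compute bill) that would otherwise run past the plateau.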
Papers
The Cost of Consistency: Submodular Maximization with Constant Recourse
Paul Dütting, Federico Fusco, Silvio Lattanzi, Ashkan Norouzi-Fard, Ola Svensson, Morteza Zadimoghaddam
The Problem of Social Cost in Multi-Agent General Reinforcement Learning: Survey and Synthesis
Kee Siong Ng, Samuel Yang-Zhao, Timothy Cadogan-Cowper
Leveraging Large Language Models to Democratize Access to Costly Financial Datasets for Academic Research
Julian Junyan Wang, Victor Xiaoqi Wang