High Efficiency
High efficiency across computational domains is a central research theme, aiming to minimize resource consumption (time, memory, energy) while maintaining or improving task performance. Current efforts focus on novel algorithms and architectures, such as optimized Thompson sampling for reinforcement learning, sparse attention mechanisms for transformers, and model compression techniques, applied across diverse areas including natural language processing, computer vision, and robotics. These advances are crucial for deploying complex AI models on resource-constrained devices and for accelerating scientific discovery in data-intensive fields.
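To make the Thompson sampling mention above concrete, here is a minimal Beta-Bernoulli bandit sketch. It is an illustrative example only, not code from any of the listed papers; the function name, arm probabilities, and round count are assumptions chosen for demonstration:

```python
import random

def thompson_sampling(true_probs, n_rounds=1000, seed=0):
    """Beta-Bernoulli Thompson sampling for a multi-armed bandit.

    Each arm keeps a Beta(successes, failures) posterior over its
    reward rate; each round we sample from every posterior and pull
    the arm with the highest sampled value.
    """
    rng = random.Random(seed)
    n_arms = len(true_probs)
    successes = [1] * n_arms  # Beta(1, 1) uniform priors
    failures = [1] * n_arms
    total_reward = 0
    for _ in range(n_rounds):
        # Posterior sample per arm; exploration falls off as posteriors sharpen
        samples = [rng.betavariate(successes[i], failures[i]) for i in range(n_arms)]
        arm = max(range(n_arms), key=lambda i: samples[i])
        # Simulated Bernoulli reward from the chosen arm
        reward = 1 if rng.random() < true_probs[arm] else 0
        successes[arm] += reward
        failures[arm] += 1 - reward
        total_reward += reward
    return successes, failures, total_reward
```

Over enough rounds the sampler concentrates its pulls on the best arm, which is the source of its sample efficiency relative to uniform exploration.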
Papers
Towards Resilient and Efficient LLMs: A Comparative Study of Efficiency, Performance, and Adversarial Robustness
Xiaojing Fan, Chunliang Tao
Differentially Private Data Release on Graphs: Inefficiencies and Unfairness
Ferdinando Fioretto, Diptangshu Sen, Juba Ziani
Efficient and Accurate Pneumonia Detection Using a Novel Multi-Scale Transformer Approach
Alireza Saber, Pouria Parhami, Alimohammad Siahkarzadeh, Amirreza Fateh
Trustworthy Image Semantic Communication with GenAI: Explainability, Controllability, and Efficiency
Xijun Wang, Dongshan Ye, Chenyuan Feng, Howard H. Yang, Xiang Chen, Tony Q. S. Quek
FacialPulse: An Efficient RNN-based Depression Detection via Temporal Facial Landmarks
Ruiqi Wang, Jinyang Huang, Jie Zhang, Xin Liu, Xiang Zhang, Zhi Liu, Peng Zhao, Sigui Chen, Xiao Sun
Optimizing Numerical Estimation and Operational Efficiency in the Legal Domain through Large Language Models
Jia-Hong Huang, Chao-Chun Yang, Yixian Shen, Alessio M. Pacces, Evangelos Kanoulas
Utilizing TTS Synthesized Data for Efficient Development of Keyword Spotting Model
Hyun Jin Park, Dhruuv Agarwal, Neng Chen, Rentao Sun, Kurt Partridge, Justin Chen, Harry Zhang, Pai Zhu, Jacob Bartel, Kyle Kastner, Gary Wang, Andrew Rosenberg, Quan Wang