High Efficiency
High efficiency in various computational domains is a central research theme, aiming to minimize resource consumption (time, memory, energy) while maintaining or improving performance. Current efforts focus on developing novel algorithms and architectures, such as optimized Thompson sampling for reinforcement learning, sparse attention mechanisms for transformers, and efficient model compression techniques, to achieve this goal across diverse applications including natural language processing, computer vision, and robotics. These advancements are crucial for deploying complex AI models on resource-constrained devices and for accelerating scientific discovery in data-intensive fields.
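Of the techniques named above, Thompson sampling is the easiest to illustrate concretely. The sketch below is a minimal Beta-Bernoulli bandit version, not the optimized variant from any listed paper; the three-arm setup, success rates, and function names are purely illustrative assumptions.

```python
import random

def thompson_step(alpha, beta, reward_fn, rng):
    """One round of Beta-Bernoulli Thompson sampling.

    alpha, beta: per-arm Beta posterior parameters (lists of floats).
    reward_fn(arm) -> 0 or 1: pulls the chosen arm and returns a reward.
    """
    # Sample a plausible success rate for each arm from its posterior.
    samples = [rng.betavariate(alpha[i], beta[i]) for i in range(len(alpha))]
    arm = samples.index(max(samples))   # act greedily on the sampled rates
    r = reward_fn(arm)                  # observe a Bernoulli reward
    alpha[arm] += r                     # conjugate posterior update
    beta[arm] += 1 - r
    return arm, r

if __name__ == "__main__":
    rng = random.Random(0)
    true_probs = [0.2, 0.5, 0.8]        # hypothetical arm success rates
    alpha, beta, pulls = [1.0] * 3, [1.0] * 3, [0] * 3
    for _ in range(2000):
        arm, _ = thompson_step(
            alpha, beta, lambda a: int(rng.random() < true_probs[a]), rng
        )
        pulls[arm] += 1
    # After enough rounds, the best arm should dominate the pull counts.
    print(pulls)
```

Efficiency here comes from how exploration is allocated: posterior sampling concentrates pulls on promising arms without a separate exploration schedule, which is one reason such methods suit resource-constrained settings.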
Papers
Neural Architecture Transfer 2: A Paradigm for Improving Efficiency in Multi-Objective Neural Architecture Search
Simone Sarti, Eugenio Lomurno, Matteo Matteucci
Efficient Visual Fault Detection for Freight Train Braking System via Heterogeneous Self Distillation in the Wild
Yang Zhang, Huilin Pan, Yang Zhou, Mingying Li, Guodong Sun
Safe, Efficient, Comfort, and Energy-saving Automated Driving through Roundabout Based on Deep Reinforcement Learning
Henan Yuan, Penghui Li, Bart van Arem, Liujiang Kang, Yongqi Dong
Exploring the Performance and Efficiency of Transformer Models for NLP on Mobile Devices
Ioannis Panopoulos, Sokratis Nikolaidis, Stylianos I. Venieris, Iakovos S. Venieris
Quality and Efficiency of Manual Annotation: Pre-annotation Bias
Marie Mikulová, Milan Straka, Jan Štěpánek, Barbora Štěpánková, Jan Hajič
Overcoming the Limitations of Localization Uncertainty: Efficient & Exact Non-Linear Post-Processing and Calibration
Moussa Kassem Sbeyti, Michelle Karg, Christian Wirth, Azarm Nowzad, Sahin Albayrak
Efficient Token-Guided Image-Text Retrieval with Consistent Multimodal Contrastive Training
Chong Liu, Yuqi Zhang, Hongsong Wang, Weihua Chen, Fan Wang, Yan Huang, Yi-Dong Shen, Liang Wang