Adapter Module
Adapter modules are lightweight, parameter-efficient components added to pre-trained models to adapt them to new tasks or domains without retraining the entire model. Current research focuses on closing the performance gap with full fine-tuning, exploring architectures such as low-rank adapters (LoRA) and mixtures of adapters (MoA), and integrating adapters into diverse model types, including transformers and neural cellular automata. Because only the adapter parameters are trained, this approach sharply reduces the compute and memory needed for tasks such as multilingual language processing, image recognition, and speech synthesis, making large-model adaptation more accessible and efficient.
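To make the low-rank adapter idea concrete, the sketch below wraps a frozen linear layer with a trainable rank-r update in the spirit of LoRA. This is a minimal illustration with assumed hyperparameters (rank and scaling), not the exact method of any paper listed here.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Frozen linear layer plus a trainable low-rank update (LoRA-style sketch).

    Effective weight: W + (alpha / r) * B @ A, where W stays frozen and only
    A (r x in_features) and B (out_features x r) are trained.
    """

    def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # freeze the pre-trained weights

        self.A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, r))  # zero init: no change at start
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Frozen path plus the low-rank adapter path.
        return self.base(x) + (x @ self.A.T @ self.B.T) * self.scaling


if __name__ == "__main__":
    layer = LoRALinear(nn.Linear(768, 768), r=8)
    out = layer(torch.randn(4, 768))
    trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
    print(out.shape, trainable)  # ~12k trainable params vs ~590k frozen
```

With rank 8 on a 768x768 layer, only about 12k of the roughly 590k parameters are trained, which is the source of the cost and memory savings described above.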
Papers
Asymmetry in Low-Rank Adapters of Foundation Models
Jiacheng Zhu, Kristjan Greenewald, Kimia Nadjahi, Haitz Sáez de Ocáriz Borde, Rickard Brüel Gabrielsson, Leshem Choshen, Marzyeh Ghassemi, Mikhail Yurochkin, Justin Solomon
Training Neural Networks from Scratch with Parallel Low-Rank Adapters
Minyoung Huh, Brian Cheung, Jeremy Bernstein, Phillip Isola, Pulkit Agrawal
Efficient Fine-tuning of Audio Spectrogram Transformers via Soft Mixture of Adapters
Umberto Cappellazzo, Daniele Falavigna, Alessio Brutti
AnimateLCM: Computation-Efficient Personalized Style Video Generation without Personalized Video Data
Fu-Yun Wang, Zhaoyang Huang, Weikang Bian, Xiaoyu Shi, Keqiang Sun, Guanglu Song, Yu Liu, Hongsheng Li