Adapter Module
Adapter modules are lightweight, parameter-efficient components added to pre-trained models to adapt them to new tasks or domains without retraining the entire model. Current research focuses on closing the performance gap with full fine-tuning, exploring architectures such as low-rank adapters (LoRA) and mixtures of adapters (MoA), and integrating adapters into diverse model types, including transformers and neural cellular automata. Because only the small adapter is trained while the backbone stays frozen, this approach sharply reduces the computational cost and memory footprint of adaptation for tasks such as multilingual language processing, image recognition, and speech synthesis, making large-model adaptation more accessible and efficient.
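To make the mechanism concrete, the sketch below wraps a frozen linear layer with a trainable low-rank update in the style of LoRA: the output becomes Wx + (alpha/r) * BAx, where W is frozen and only the small matrices A and B are trained. This is a minimal illustration under assumed hyperparameters (the LoRALinear class name, r=8, alpha=16, and the initialization scheme are illustrative choices, not the exact recipe of any paper listed below).

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """A frozen linear layer augmented with a trainable low-rank update.

    Forward pass: base(x) + (alpha / r) * x @ A^T @ B^T, where the base
    weights are frozen and only A (r x in_features) and
    B (out_features x r) receive gradients.
    """

    def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # freeze the pre-trained weights

        self.scaling = alpha / r
        # A starts with small random values, B with zeros, so the adapter
        # initially contributes nothing and the model behaves as pre-trained.
        self.lora_A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(base.out_features, r))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + self.scaling * (x @ self.lora_A.T) @ self.lora_B.T


# Example: wrap one projection layer and count trainable parameters.
layer = LoRALinear(nn.Linear(768, 768), r=8)
y = layer(torch.randn(2, 768))
trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
total = sum(p.numel() for p in layer.parameters())
print(f"trainable: {trainable} / {total}")  # ~12k of ~600k parameters
```

Initializing B to zero means the wrapped layer is numerically identical to the frozen original at the start of training, so adaptation begins from the pre-trained model's behavior; here only roughly 2% of the layer's parameters ever receive gradients, which is the source of the cost and memory savings described above.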
Papers
I2I: Initializing Adapters with Improvised Knowledge
Tejas Srinivasan, Furong Jia, Mohammad Rostami, Jesse Thomason
LLM-Adapters: An Adapter Family for Parameter-Efficient Fine-Tuning of Large Language Models
Zhiqiang Hu, Lei Wang, Yihuai Lan, Wanyu Xu, Ee-Peng Lim, Lidong Bing, Xing Xu, Soujanya Poria, Roy Ka-Wei Lee