Domain-Specific Adapters
Domain-specific adapters are lightweight modules added to pre-trained models to adapt them to new domains efficiently, without retraining the entire model. Current research focuses on methods for selecting, combining, and training these adapters, including low-rank adaptation (LoRA), mixture-of-experts routing, and weight averaging across multiple adapters. Because only the adapter parameters are updated, this approach greatly reduces compute and memory costs, enabling large language models, image generation models, and other deep learning architectures to be customized for specialized tasks with limited data.
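The core idea behind LoRA-style adapters can be sketched in a few lines: the pre-trained weight matrix stays frozen, and only a low-rank correction is trained. The sketch below is a minimal illustration, not any particular library's implementation; all names (`W`, `A`, `B`, `adapted_forward`) and the specific dimensions are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen pre-trained weight of a linear layer: y = W @ x
d_in, d_out, r = 64, 32, 8              # adapter rank r << min(d_in, d_out)
W = rng.standard_normal((d_out, d_in))  # frozen; never updated during adaptation

# LoRA-style adapter: a trainable low-rank update delta_W = B @ A
A = rng.standard_normal((r, d_in)) * 0.01  # trainable, small random init
B = np.zeros((d_out, r))   # zero init: the adapted model starts identical to the base
alpha = 16.0
scaling = alpha / r        # common scaling convention in LoRA-style methods

def adapted_forward(x):
    """Base layer output plus the scaled low-rank correction."""
    return W @ x + scaling * (B @ (A @ x))

x = rng.standard_normal(d_in)
# With B initialized to zero, the adapter is a no-op before training begins.
assert np.allclose(adapted_forward(x), W @ x)

# Only A and B are trained: a small fraction of the full weight matrix.
trainable = A.size + B.size   # 8*64 + 32*8 = 768 parameters
full = W.size                 # 64*32 = 2048 parameters

# Weight averaging across multiple adapters (here, two hypothetical adapters
# for different domains) simply averages their low-rank deltas:
A2, B2 = rng.standard_normal((r, d_in)) * 0.01, rng.standard_normal((d_out, r)) * 0.01
delta_avg = 0.5 * (B @ A) + 0.5 * (B2 @ A2)
W_merged = W + scaling * delta_avg  # a single merged weight, no extra inference cost
```

Because the update is a plain additive delta, multiple domain adapters can be merged into the base weights offline, so the adapted model runs at the same inference cost as the original.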