Orthogonal Parameterization
Orthogonal parameterization is a machine-learning technique that expresses model updates as orthogonal matrices, primarily to adapt pre-trained models to new tasks or datasets without extensive retraining. Because orthogonal transformations preserve norms and pairwise angles, they can rotate pre-trained features while leaving their learned structure largely intact. Current research emphasizes applications across language models, knowledge graph embeddings, and image generation, using constructions such as Householder reflections and butterfly factorizations to improve efficiency and scalability. The approach offers significant advantages for parameter-efficient fine-tuning, enabling large, computationally expensive models to be adapted to diverse downstream tasks while mitigating issues such as bias and privacy leakage.
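To make the Householder construction concrete, the sketch below is a minimal PyTorch illustration; the class name, initialization scheme, and hyperparameters are assumptions for exposition, not any specific paper's implementation. It builds an orthogonal update Q as a product of k reflections H_i = I - 2 v_i v_iᵀ / (v_iᵀ v_i) and applies it to a frozen pre-trained weight, so only k·d scalars are trained rather than a full d×d matrix.

```python
import torch
import torch.nn as nn

class HouseholderAdapter(nn.Module):
    """Sketch: parameterize an orthogonal update Q as a product of
    Householder reflections H_i = I - 2 v_i v_i^T / (v_i^T v_i) and
    apply it to a frozen pre-trained weight W, i.e. W' = Q @ W.
    Only num_reflections * d scalars are trained instead of d * d.
    (Illustrative; num_reflections is assumed to be even.)"""

    def __init__(self, weight: torch.Tensor, num_reflections: int = 8):
        super().__init__()
        d = weight.shape[0]  # output dimension of the frozen weight
        self.register_buffer("weight", weight)  # frozen, not trained
        # Identical adjacent pairs [v1, v1, v2, v2, ...] so that
        # H(v)H(v) = I and hence Q = I at initialization.
        base = torch.randn(num_reflections // 2, d)
        self.vs = nn.Parameter(base.repeat_interleave(2, dim=0))

    def adapted_weight(self) -> torch.Tensor:
        w = self.weight
        # Apply each reflection without materializing the d x d matrix:
        # H @ W = W - 2 v (v^T W) / (v^T v).
        for v in self.vs:
            w = w - (2.0 / (v @ v)) * torch.outer(v, v @ w)
        return w

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Behaves like a linear layer with the adapted weight Q @ W.
        return x @ self.adapted_weight().T

# Usage: adapt a frozen 64x64 linear weight with 8 reflections.
layer = nn.Linear(64, 64)
adapter = HouseholderAdapter(layer.weight.detach().clone(), num_reflections=8)
out = adapter(torch.randn(4, 64))
```

Initializing the reflection vectors in identical adjacent pairs makes Q the identity at the start of fine-tuning, so training begins from the unmodified pre-trained weights; the paired vectors are independent parameters and diverge as training proceeds.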