Zero-Shot Cross-Lingual Transfer
Zero-shot cross-lingual transfer aims to let language models trained on one language perform tasks in other languages without any additional training data in those languages. Current research focuses on improving this transfer by strengthening multilingual alignment within pre-trained models such as mBERT, XLM-R, and Whisper, using techniques like layer swapping, data augmentation (e.g., back-parsing), and parameter-efficient fine-tuning. These advances matter because they address the scarcity of labeled data in many languages, enabling practical multilingual NLP applications and deepening our understanding of how cross-lingual knowledge is represented inside large language models.
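To make the setup concrete, below is a minimal sketch of the standard zero-shot transfer recipe with Hugging Face transformers and datasets: fine-tune XLM-R on English examples only, then evaluate directly on another language. XNLI is used here purely as an illustrative benchmark; the dataset choice, data subset, and hyperparameters are assumptions for the sketch, not taken from any of the papers summarized on this page.

```python
# Sketch: zero-shot cross-lingual transfer with XLM-R.
# Train on English NLI data, evaluate on German with no German training examples.
import numpy as np
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model_name = "xlm-roberta-base"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=3)

def tokenize(batch):
    # XNLI pairs a premise with a hypothesis; labels are entailment/neutral/contradiction.
    return tokenizer(batch["premise"], batch["hypothesis"],
                     truncation=True, padding="max_length", max_length=128)

# Small English training subset (illustrative size) and the full German validation split.
train_en = load_dataset("xnli", "en", split="train[:2000]").map(tokenize, batched=True)
eval_de = load_dataset("xnli", "de", split="validation").map(tokenize, batched=True)

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {"accuracy": float((preds == labels).mean())}

args = TrainingArguments(
    output_dir="xlmr-xnli-zeroshot",   # hypothetical output directory
    num_train_epochs=1,
    per_device_train_batch_size=16,
)

trainer = Trainer(model=model, args=args,
                  train_dataset=train_en, eval_dataset=eval_de,
                  compute_metrics=compute_metrics)
trainer.train()
print(trainer.evaluate())  # zero-shot accuracy on German, no German fine-tuning
```

The same pattern underlies most of the work in this area: the transfer quality then depends on how well the pre-trained model aligns representations across languages, which is exactly what techniques like layer swapping, augmentation, and parameter-efficient fine-tuning try to improve.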