Cross-Lingual Transfer
Cross-lingual transfer aims to leverage knowledge learned from high-resource languages to improve performance on low-resource languages in natural language processing tasks. Current research focuses on adapting large language models (LLMs) for cross-lingual transfer, employing techniques like model merging, data augmentation (including synthetic data generation and transliteration), and innovative training strategies such as in-context learning and continual pre-training. This research is crucial for expanding the reach of NLP to a wider range of languages, enabling applications like multilingual question answering, sentiment analysis, and code generation to benefit diverse communities globally.
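The core idea can be illustrated with a minimal, self-contained sketch: train a classifier in a high-resource language and apply it zero-shot to a low-resource one through a shared embedding space. The hand-crafted vectors and the nearest-centroid classifier below are toy assumptions for illustration; real systems use multilingual LLM encoders and the adaptation techniques described above.

```python
# Toy sketch of zero-shot cross-lingual transfer via a shared embedding
# space (illustrative assumption; not any specific paper's method).
import numpy as np

# Hypothetical aligned word embeddings: translation pairs share a vector.
EMB = {
    "good":  np.array([1.0, 0.2]),  "bueno":    np.array([1.0, 0.2]),
    "great": np.array([0.9, 0.1]),  "genial":   np.array([0.9, 0.1]),
    "bad":   np.array([-1.0, 0.3]), "malo":     np.array([-1.0, 0.3]),
    "awful": np.array([-0.9, 0.4]), "horrible": np.array([-0.9, 0.4]),
    "movie": np.array([0.0, 1.0]),  "pelicula": np.array([0.0, 1.0]),
}

def encode(sentence):
    # Mean-pool the word vectors; out-of-vocabulary words are skipped.
    vecs = [EMB[w] for w in sentence.split() if w in EMB]
    return np.mean(vecs, axis=0)

# "Train" on English only: one centroid per sentiment label (1=pos, 0=neg).
train = [("good movie", 1), ("great movie", 1),
         ("bad movie", 0), ("awful movie", 0)]
centroids = {
    label: np.mean([encode(s) for s, y in train if y == label], axis=0)
    for label in (0, 1)
}

def predict(sentence):
    # Assign the label whose centroid is nearest in the shared space.
    v = encode(sentence)
    return min(centroids, key=lambda y: np.linalg.norm(v - centroids[y]))

# Zero-shot evaluation on Spanish, which was never seen during training.
print(predict("pelicula genial"))    # positive sentiment transfers: 1
print(predict("pelicula horrible"))  # negative sentiment transfers: 0
```

Because translation pairs occupy the same region of the embedding space, decision boundaries learned from English data carry over to Spanish inputs with no Spanish labels at all; the same principle underlies transfer with multilingual pretrained encoders.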