Cross-Lingual Transfer Learning
Cross-lingual transfer learning leverages knowledge from high-resource languages to improve natural language processing (NLP) and speech technologies in low-resource languages. Current research focuses on multilingual model architectures such as Transformers, together with techniques like prompt tuning and knowledge distillation, to improve transfer across diverse tasks including machine translation, speech recognition, and sentiment analysis. The field is crucial for bridging the digital divide: it broadens access to NLP and speech technologies and enables more inclusive applications in areas such as healthcare and education.
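To make the core idea concrete, below is a minimal sketch of zero-shot cross-lingual transfer using the Hugging Face transformers library and the multilingual XLM-RoBERTa checkpoint. A classifier head is fine-tuned on a handful of English (high-resource) sentiment examples and then applied, with no further training, to a sentence in another language; the shared multilingual encoder is what carries knowledge across languages. The toy data, label set, and Swahili test sentence are illustrative assumptions, not drawn from any specific paper.

```python
# Zero-shot cross-lingual transfer sketch: fine-tune on English, test on Swahili.
# Assumes `torch` and `transformers` are installed; data here is a toy stand-in.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_name = "xlm-roberta-base"  # multilingual encoder shared across ~100 languages
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Tiny labeled set in the high-resource source language (English).
train_texts = ["I loved this film.", "This movie was terrible."]
train_labels = torch.tensor([1, 0])  # 1 = positive, 0 = negative

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for _ in range(3):  # a few gradient steps, just to illustrate the training loop
    batch = tokenizer(train_texts, padding=True, return_tensors="pt")
    loss = model(**batch, labels=train_labels).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

# Zero-shot evaluation on a target-language sentence the model never saw
# labeled data for; transfer happens entirely through the shared encoder.
model.eval()
with torch.no_grad():
    batch = tokenizer(["Filamu hii ilikuwa nzuri sana."], return_tensors="pt")
    pred = model(**batch).logits.argmax(dim=-1).item()
print("predicted label:", pred)
```

The key design point is that nothing language-specific is trained for the target language: the subword vocabulary and encoder weights are shared, so supervision in the source language shapes a classifier that remains usable for typologically different, low-resource inputs.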