High Resource Language
High-resource language (HRL) research focuses on closing the substantial performance gap between natural language processing (NLP) models for dominant languages such as English and Chinese and those for low-resource languages (LRLs). Current work centers on developing and adapting large language models (LLMs), often using multilingual fine-tuning, transfer learning from HRLs, and data augmentation to improve LRL performance on tasks such as machine translation, question answering, and sentiment analysis. This research is important for promoting linguistic diversity and inclusivity in AI and for giving speakers of all languages equitable access to advanced language technologies.
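As a concrete illustration of one of these techniques, the sketch below fine-tunes a multilingual encoder on labelled high-resource (English) data and then evaluates it zero-shot on a handful of low-resource-language examples. The model name (xlm-roberta-base), the toy sentences, and the hyperparameters are illustrative assumptions, not the setup of any specific paper.

```python
# Minimal sketch of HRL -> LRL cross-lingual transfer with a multilingual encoder.
from datasets import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

MODEL_NAME = "xlm-roberta-base"  # multilingual encoder covering ~100 languages

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)

# Toy stand-ins: a larger labelled HRL (English) set and a tiny LRL evaluation set.
hrl_train = Dataset.from_dict({
    "text": ["the film was wonderful", "a dull, tedious plot"] * 50,
    "label": [1, 0] * 50,
})
lrl_eval = Dataset.from_dict({
    # Hypothetical low-resource-language sentences; replace with real labelled data.
    "text": ["filimu ilikuwa nzuri sana", "hadithi ilikuwa ya kuchosha"],
    "label": [1, 0],
})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=64)

hrl_train = hrl_train.map(tokenize, batched=True)
lrl_eval = lrl_eval.map(tokenize, batched=True)

# Fine-tune on the HRL data only, then evaluate zero-shot on the LRL examples.
trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="xlmr-transfer", num_train_epochs=1,
                           per_device_train_batch_size=8, logging_steps=10),
    train_dataset=hrl_train,
    eval_dataset=lrl_eval,
)
trainer.train()
print(trainer.evaluate())  # zero-shot cross-lingual performance on the LRL set
```

In practice the zero-shot step is often followed by further fine-tuning on whatever small amount of LRL data is available, or by augmenting the LRL training set (e.g., via translation from the HRL); the sketch only shows the transfer baseline.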