High-Resource Language
High-resource language (HRL) research addresses the significant performance gap between natural language processing (NLP) models trained on dominant languages (such as English and Chinese) and those trained on low-resource languages (LRLs). Current work emphasizes developing and adapting large language models (LLMs), often using multilingual fine-tuning, transfer learning from HRLs, and data augmentation to improve LRL performance on tasks such as machine translation, question answering, and sentiment analysis. This work is crucial for promoting linguistic diversity and inclusivity in AI, helping ensure equitable access to advanced language technologies for speakers of all languages.
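Of the techniques mentioned above, data augmentation is the simplest to illustrate. The sketch below shows one common word-level strategy for expanding a small LRL training set (random swap and random deletion, in the style of Easy Data Augmentation); the function name and parameters are illustrative, not drawn from any specific library.

```python
import random

def augment(sentence, n_aug=4, p_delete=0.1, seed=0):
    """Generate augmented variants of a sentence via word-level
    random swap and random deletion (hypothetical EDA-style helper)."""
    rng = random.Random(seed)  # seeded for reproducibility
    words = sentence.split()
    variants = []
    for _ in range(n_aug):
        w = words[:]
        # Random swap: exchange two word positions.
        if len(w) > 1:
            i, j = rng.sample(range(len(w)), 2)
            w[i], w[j] = w[j], w[i]
        # Random deletion: drop each word with probability p_delete;
        # fall back to the un-deleted copy if everything is removed.
        w = [t for t in w if rng.random() > p_delete] or w
        variants.append(" ".join(w))
    return variants

for v in augment("the model translates low resource languages well"):
    print(v)
```

In practice, such perturbed copies are added to the LRL training data before multilingual fine-tuning; more sophisticated variants (e.g., back-translation through an HRL) follow the same pattern of synthesizing extra examples from limited data.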