Low-Resource Languages
Low-resource language (LRL) research focuses on developing natural language processing (NLP) techniques for languages lacking extensive digital resources, aiming to bridge the technological gap between well-resourced and under-resourced languages. Current research emphasizes leveraging large language models (LLMs), often through techniques like prompt engineering, transfer learning from related high-resource languages, and the creation of smaller, more efficient models tailored to LRLs. This work is crucial for promoting linguistic diversity in AI, enabling access to technology for underrepresented communities, and advancing our understanding of multilingual NLP.
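As a concrete illustration of one of the techniques mentioned above, the sketch below assembles a few-shot translation prompt for an LLM from a small set of parallel sentence pairs, which is often all that exists for a low-resource language. The function name, the seed pairs, and the English–Yoruba example are illustrative assumptions, not drawn from any specific paper listed here.

```python
def build_few_shot_prompt(examples, query, src="English", tgt="Yoruba"):
    """Build a few-shot translation prompt for an LLM.

    `examples` is a small list of (source, target) pairs, e.g. drawn
    from whatever limited parallel corpus exists for the LRL.
    This is a minimal sketch of prompt engineering for LRL translation,
    not a method from a particular paper.
    """
    lines = [f"Translate from {src} to {tgt}."]
    for s, t in examples:
        lines.append(f"{src}: {s}\n{tgt}: {t}")
    # Leave the target side of the query empty for the model to complete.
    lines.append(f"{src}: {query}\n{tgt}:")
    return "\n\n".join(lines)


# Hypothetical seed pairs standing in for a tiny parallel corpus.
seed_pairs = [
    ("Good morning.", "E kaaro."),
    ("Thank you.", "E se."),
]
prompt = build_few_shot_prompt(seed_pairs, "Good night.")
```

The resulting string would be sent to an LLM as-is; with only a handful of in-context examples, this approach avoids any fine-tuning, which is one reason prompting is attractive when LRL training data is scarce.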