African Languages
Research on African languages is expanding rapidly, driven by the need for natural language processing (NLP) tools that serve the continent's diverse linguistic landscape. Current efforts focus on adapting and improving existing multilingual models (such as BERT for text and HuBERT for speech) and on building new, smaller, more efficient models tailored to low-resource African languages, often using self-supervised learning and data augmentation to overcome data scarcity. This work is crucial for bridging the digital divide, enabling access to technology and information in local languages, and fostering linguistic diversity within the broader NLP community.
Papers
AfriHG: News headline generation for African Languages
Toyib Ogunremi, Serah Akojenu, Anthony Soronnadi, Olubayo Adekanmbi, David Ifeoluwa Adelani
Comparative Analysis of Listwise Reranking with Large Language Models in Limited-Resource Language Contexts
Yanxin Shen, Lun Wang, Chuanqi Shi, Shaoshuai Du, Yiyi Tao, Yixian Shen, Hang Zhang