Arabic Language
Research on the Arabic language is advancing rapidly, driven by the need to adapt natural language processing (NLP) techniques to its unique linguistic features, such as rich morphology and dialectal variation. Current efforts focus on developing and improving Arabic language models, particularly transformer architectures such as BERT, for tasks including question answering, sentiment analysis, and hate speech detection. These advances are crucial for bridging the resource gap between Arabic and high-resource languages in NLP, enabling applications in fields such as education, healthcare, and social media analysis. The creation of large, high-quality datasets, such as ArabicaQA, is also a key focus, facilitating the training and evaluation of more robust and accurate models.