Cross-Lingual Understanding
Cross-lingual understanding (CLU) focuses on enabling artificial intelligence systems to comprehend and process information across multiple languages, overcoming the limitations of monolingual models. Current research emphasizes improving multilingual large language models (LLMs) and exploring techniques such as prompt tuning as more parameter-efficient alternatives to full fine-tuning for cross-lingual transfer. The field is crucial for bridging language barriers in applications such as multilingual information retrieval, hate speech detection, and education, and for advancing our understanding of how language models acquire and transfer knowledge across different linguistic structures.
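To make the prompt-tuning idea concrete, the sketch below shows the core mechanism in plain PyTorch: a small set of trainable "soft prompt" vectors is prepended to the input embeddings of a frozen backbone, so only the prompt is updated during cross-lingual adaptation. The model here is a toy stand-in, not any specific pretrained multilingual model; all class and parameter names are illustrative assumptions.

```python
import torch
import torch.nn as nn

class SoftPromptClassifier(nn.Module):
    """Toy sketch of prompt tuning: a frozen backbone (standing in for a
    pretrained multilingual encoder) plus trainable soft-prompt vectors
    prepended to the input embeddings. Names are illustrative."""

    def __init__(self, vocab_size=1000, dim=32, n_prompt=8, n_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)  # stands in for a pretrained embedding table
        self.encoder = nn.Linear(dim, dim)          # stands in for frozen transformer layers
        self.head = nn.Linear(dim, n_classes)
        # Freeze the "pretrained" parts; only the soft prompt is updated.
        for module in (self.embed, self.encoder, self.head):
            for p in module.parameters():
                p.requires_grad = False
        self.prompt = nn.Parameter(torch.randn(n_prompt, dim) * 0.02)

    def forward(self, token_ids):
        x = self.embed(token_ids)                           # (batch, seq, dim)
        prompt = self.prompt.unsqueeze(0).expand(x.size(0), -1, -1)
        x = torch.cat([prompt, x], dim=1)                   # prepend soft prompt
        h = torch.tanh(self.encoder(x)).mean(dim=1)         # pooled representation
        return self.head(h)

model = SoftPromptClassifier()
trainable = [n for n, p in model.named_parameters() if p.requires_grad]
# Only the soft prompt is trainable; the backbone stays frozen,
# which is what makes prompt tuning cheap for cross-lingual transfer.
print(trainable)  # ['prompt']
```

With a real multilingual model the same pattern applies: the prompt (typically tens of vectors, i.e. a few thousand parameters) is trained on a source language and reused at inference on target languages, while the billions of backbone parameters remain untouched.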