Cross-Lingual Natural Language Understanding
Cross-lingual natural language understanding (XNLU) aims to enable computers to understand and process text in multiple languages, bridging the gap between languages with abundant data and those with limited resources. Current research focuses on leveraging multilingual pre-trained language models (MPLMs), often combined with techniques like self-distillation, data augmentation (including pseudo-semantic augmentation), and prompt-based fine-tuning, to improve zero-shot and few-shot cross-lingual transfer performance across various NLU tasks. These advancements are crucial for expanding the reach of NLP applications globally, particularly benefiting low-resource languages and enabling more inclusive access to technology.
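To make the zero-shot transfer recipe concrete, the sketch below fine-tunes a multilingual encoder (XLM-RoBERTa, via the HuggingFace transformers library) on a couple of toy English NLI pairs and then classifies a Spanish pair it never saw during training. The model name, the three-way label scheme, and the hyperparameters are illustrative assumptions standing in for a real setup, not the method of any particular paper.

```python
# A minimal sketch of zero-shot cross-lingual transfer with an MPLM:
# fine-tune XLM-RoBERTa on English NLI pairs, then run inference on
# Spanish input with no Spanish supervision. Data and hyperparameters
# are illustrative placeholders.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "xlm-roberta-base"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(
    MODEL_NAME, num_labels=3  # entailment / neutral / contradiction
)

# Toy English pairs standing in for a real NLI corpus such as MNLI.
train_pairs = [
    ("A man is playing a guitar.", "A person makes music.", 0),  # entailment
    ("A man is playing a guitar.", "A woman is swimming.", 2),   # contradiction
]

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for premise, hypothesis, label in train_pairs:
    batch = tokenizer(premise, hypothesis, return_tensors="pt",
                      truncation=True, padding=True)
    loss = model(**batch, labels=torch.tensor([label])).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

# Zero-shot step: the model has seen only English supervision, but the
# shared multilingual representation lets it score a Spanish pair.
model.eval()
with torch.no_grad():
    batch = tokenizer("Un hombre toca la guitarra.",
                      "Una persona hace música.", return_tensors="pt")
    pred = model(**batch).logits.argmax(dim=-1).item()
print(f"Predicted label id: {pred}")
```

In a realistic pipeline, the English training set would be a full corpus such as MNLI and evaluation would use a cross-lingual benchmark such as XNLI; the techniques mentioned above (self-distillation, pseudo-semantic data augmentation, prompt-based fine-tuning) are typically layered on top of this basic fine-tune-then-transfer recipe.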