Cross-Lingual Retrieval

Cross-lingual retrieval (CLR) aims to find relevant information in one language given a query in another, bridging the language barrier in information access. Current research focuses on improving multilingual language models (such as mBERT and XLM-R) for CLR, often employing techniques like contrastive learning, self-supervised training, and multi-stage retrieval architectures to enhance accuracy, especially for low-resource languages. These advances matter because they enable more effective cross-lingual search and information sharing, with applications in fact-checking, multilingual question answering, and cross-cultural understanding.
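The core idea behind dense CLR can be sketched in a few lines: a multilingual encoder maps queries and documents from different languages into one shared embedding space, and retrieval reduces to nearest-neighbor search by cosine similarity. The sketch below uses hand-written toy vectors in place of real encoder outputs; the vector values, document names, and the claim that `doc_es` is the relevant one are illustrative assumptions, not from any particular model.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy vectors standing in for embeddings from a multilingual encoder
# (e.g. XLM-R). In a real system, query and documents would be encoded
# by the same model so that semantically similar texts in different
# languages land close together in the shared space.
query_vec = [0.9, 0.1, 0.3]          # e.g. an English query (assumed)
doc_vecs = {
    "doc_es": [0.8, 0.2, 0.25],      # e.g. a relevant Spanish document (assumed)
    "doc_de": [0.1, 0.9, 0.4],       # e.g. an off-topic German document (assumed)
}

# Rank documents by similarity to the query, highest first.
ranked = sorted(doc_vecs, key=lambda d: cosine(query_vec, doc_vecs[d]),
                reverse=True)
print(ranked)  # the Spanish document ranks first
```

In practice the sorted scan is replaced by an approximate nearest-neighbor index over millions of document vectors, and multi-stage systems rerank the top candidates with a cross-encoder.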

Papers