Cross-Lingual
Cross-lingual research focuses on bridging language barriers in natural language processing by building models that understand and process text across multiple languages. Current efforts concentrate on improving multilingual large language models (LLMs) through techniques such as continual pre-training, adapter modules, and contrastive learning, often targeting the challenges of low-resource languages and semantic alignment. The field is crucial for expanding global access to NLP technologies and for enabling cross-cultural communication and information exchange in applications such as machine translation, sentiment analysis, and cross-lingual information retrieval.
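To make the contrastive-learning approach mentioned above concrete, the sketch below shows a minimal InfoNCE-style alignment loss that pulls embeddings of parallel (translation-pair) sentences together while pushing apart non-pairs. This is an illustrative example, not the method of any paper listed here; the function name, the temperature value, and the use of randomly generated embeddings are all assumptions for demonstration.

```python
# Minimal sketch of contrastive alignment between two languages' sentence
# embeddings. All names (contrastive_alignment_loss, temperature) are
# illustrative, not taken from any specific paper below.
import torch
import torch.nn.functional as F

def contrastive_alignment_loss(src_emb: torch.Tensor,
                               tgt_emb: torch.Tensor,
                               temperature: float = 0.05) -> torch.Tensor:
    """InfoNCE-style loss: row i of src_emb and row i of tgt_emb are
    embeddings of a translation pair; all other rows serve as negatives."""
    src = F.normalize(src_emb, dim=-1)          # unit-normalize so the dot
    tgt = F.normalize(tgt_emb, dim=-1)          # product is cosine similarity
    logits = src @ tgt.T / temperature          # (batch, batch) similarities
    labels = torch.arange(src.size(0), device=src.device)
    # Symmetric cross-entropy: each source sentence should match its own
    # target translation, and vice versa.
    return 0.5 * (F.cross_entropy(logits, labels) +
                  F.cross_entropy(logits.T, labels))

# Usage with placeholder embeddings (in practice these would come from a
# multilingual encoder applied to a parallel corpus):
src = torch.randn(8, 768, requires_grad=True)
tgt = torch.randn(8, 768, requires_grad=True)
loss = contrastive_alignment_loss(src, tgt)
loss.backward()
```

The symmetric form of the loss is a common design choice in this setting: it treats both languages as anchors, so neither embedding space is privileged during alignment.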
Papers
Quantized Wasserstein Procrustes Alignment of Word Embedding Spaces
Prince O Aboagye, Yan Zheng, Michael Yeh, Junpeng Wang, Zhongfang Zhuang, Huiyuan Chen, Liang Wang, Wei Zhang, Jeff Phillips
Video Games as a Corpus: Sentiment Analysis using Fallout New Vegas Dialog
Mika Hämäläinen, Khalid Alnajjar, Thierry Poibeau
Cross-lingual Similarity of Multilingual Representations Revisited
Maksym Del, Mark Fishel
Languages You Know Influence Those You Learn: Impact of Language Characteristics on Multi-Lingual Text-to-Text Transfer
Benjamin Muller, Deepanshu Gupta, Siddharth Patwardhan, Jean-Philippe Fauconnier, David Vandyke, Sachin Agarwal