Cross-Lingual
Cross-lingual research focuses on bridging language barriers in natural language processing, aiming to build models that understand and process text across multiple languages. Current efforts concentrate on improving multilingual large language models (LLMs) through techniques like continual pre-training, adapter modules, and contrastive learning, often addressing challenges related to low-resource languages and semantic alignment. This field is crucial for expanding access to NLP technologies globally and enabling cross-cultural communication and information exchange in diverse applications, such as machine translation, sentiment analysis, and cross-lingual information retrieval.
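To make the contrastive-learning approach to semantic alignment concrete, here is a minimal, illustrative sketch (not taken from any of the papers listed below): a symmetric InfoNCE loss that pulls the embeddings of translation pairs together while pushing apart non-paired sentences in the batch. The encoder is omitted; in practice the embeddings would come from a multilingual LLM or sentence encoder, and the function and variable names here are hypothetical.

```python
# Illustrative sketch of cross-lingual contrastive alignment (InfoNCE).
# Assumes src_emb[i] and tgt_emb[i] embed a translation pair; all other
# in-batch targets serve as negatives. Names are hypothetical.
import torch
import torch.nn.functional as F

def infonce_alignment_loss(src_emb: torch.Tensor,
                           tgt_emb: torch.Tensor,
                           temperature: float = 0.05) -> torch.Tensor:
    src = F.normalize(src_emb, dim=-1)   # work in cosine-similarity space
    tgt = F.normalize(tgt_emb, dim=-1)
    logits = src @ tgt.T / temperature   # (batch, batch) similarity matrix
    labels = torch.arange(src.size(0))   # diagonal entries are the positives
    # Symmetric loss: align source-to-target and target-to-source.
    return (F.cross_entropy(logits, labels) +
            F.cross_entropy(logits.T, labels)) / 2

# Toy usage with random stand-in embeddings for 8 translation pairs.
if __name__ == "__main__":
    torch.manual_seed(0)
    src_emb = torch.randn(8, 256, requires_grad=True)
    tgt_emb = torch.randn(8, 256, requires_grad=True)
    loss = infonce_alignment_loss(src_emb, tgt_emb)
    loss.backward()
    print(f"alignment loss: {loss.item():.4f}")
```

The in-batch-negatives design is what makes this scale to large multilingual corpora: no explicit negative mining is needed, since every other sentence in the batch serves as a negative.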
Papers
Data-adaptive Transfer Learning for Translation: A Case Study in Haitian and Jamaican
Nathaniel R. Robinson, Cameron J. Hogan, Nancy Fulda, David R. Mortensen
Multi-stage Distillation Framework for Cross-Lingual Semantic Similarity Matching
Kunbo Ding, Weijie Liu, Yuejian Fang, Zhe Zhao, Qi Ju, Xuefeng Yang
Cross-lingual Approaches for the Detection of Adverse Drug Reactions in German from a Patient's Perspective
Lisa Raithel, Philippe Thomas, Roland Roller, Oliver Sapina, Sebastian Möller, Pierre Zweigenbaum
Cross-Lingual Knowledge Transfer for Clinical Phenotyping
Jens-Michalis Papaioannou, Paul Grundmann, Betty van Aken, Athanasios Samaras, Ilias Kyparissidis, George Giannakoulas, Felix Gers, Alexander Löser