Cross-Lingual
Cross-lingual research focuses on bridging language barriers in natural language processing, aiming to build models that understand and process text across multiple languages. Current efforts concentrate on improving multilingual large language models (LLMs) through techniques like continual pre-training, adapter modules, and contrastive learning, often addressing challenges related to low-resource languages and semantic alignment. This field is crucial for expanding access to NLP technologies globally and enabling cross-cultural communication and information exchange in diverse applications, such as machine translation, sentiment analysis, and cross-lingual information retrieval.
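To make the contrastive-learning technique mentioned above concrete, here is a minimal sketch of an InfoNCE-style alignment loss that pulls embeddings of translated sentence pairs together while pushing apart non-pairs. The embeddings and function name are hypothetical; a real system would obtain `src_emb`/`tgt_emb` from a multilingual encoder rather than random vectors.

```python
import numpy as np

def contrastive_alignment_loss(src_emb, tgt_emb, temperature=0.1):
    """InfoNCE-style loss for cross-lingual semantic alignment.

    src_emb, tgt_emb: (n, d) arrays of sentence embeddings, where row i
    of each array is assumed to be a translation pair (hypothetical
    setup; in practice these come from a multilingual encoder).
    """
    # L2-normalise so dot products are cosine similarities
    src = src_emb / np.linalg.norm(src_emb, axis=1, keepdims=True)
    tgt = tgt_emb / np.linalg.norm(tgt_emb, axis=1, keepdims=True)
    logits = src @ tgt.T / temperature  # (n, n) similarity matrix
    # Cross-entropy with the diagonal (true translation pairs) as targets
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

# Toy check: perfectly aligned pairs yield a lower loss than shuffled ones
rng = np.random.default_rng(0)
src = rng.normal(size=(4, 8))
aligned_loss = contrastive_alignment_loss(src, src)
shuffled_loss = contrastive_alignment_loss(src, src[::-1])
```

Minimising this loss over parallel sentence pairs is one common way to align the embedding spaces of high- and low-resource languages.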
Papers
Seamless: Multilingual Expressive and Streaming Speech Translation
Seamless Communication, Loïc Barrault, Yu-An Chung, Mariano Coria Meglioli, David Dale, Ning Dong, Mark Duppenthaler, Paul-Ambroise Duquenne, Brian Ellis, Hady Elsahar, Justin Haaheim, John Hoffman, Min-Jae Hwang, Hirofumi Inaguma, Christopher Klaiber, Ilia Kulikov, Pengwei Li, Daniel Licht, Jean Maillard, Ruslan Mavlyutov, Alice Rakotoarison, Kaushik Ram Sadagopan, Abinesh Ramakrishnan, Tuan Tran, Guillaume Wenzek, Yilin Yang, Ethan Ye, Ivan Evtimov, Pierre Fernandez, Cynthia Gao, Prangthip Hansanti, Elahe Kalbassi, Amanda Kallet, Artyom Kozhevnikov, Gabriel Mejia Gonzalez, Robin San Roman, Christophe Touret, Corinne Wong, Carleigh Wood, Bokai Yu, Pierre Andrews, Can Balioglu, Peng-Jen Chen, Marta R. Costa-jussà, Maha Elbayad, Hongyu Gong, Francisco Guzmán, Kevin Heffernan, Somya Jain, Justine Kao, Ann Lee, Xutai Ma, Alex Mourachko, Benjamin Peloquin, Juan Pino, Sravya Popuri, Christophe Ropers, Safiyyah Saleem, Holger Schwenk, Anna Sun, Paden Tomasello, Changhan Wang, Jeff Wang, Skyler Wang, Mary Williamson
FREDSum: A Dialogue Summarization Corpus for French Political Debates
Virgile Rennard, Guokan Shang, Damien Grari, Julie Hunter, Michalis Vazirgiannis