Multilingual Knowledge

Multilingual knowledge research aims to enable AI models to understand and use information across many languages, overcoming the limitations of English-centric models. Current efforts focus on improving multilingual large language models (LLMs) through techniques such as mixture-of-experts architectures, knowledge distillation, and targeted neuron manipulation for knowledge editing and cross-lingual transfer. This work helps bridge the language gap in information access and processing, with applications in machine translation, question answering, and cross-lingual knowledge graph completion, ultimately making information more inclusive and accessible worldwide.
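As a rough illustration of the "targeted neuron manipulation" idea mentioned above, the toy sketch below locates the hidden unit that contributes most to a target output of a tiny random MLP and then strengthens that single weight to shift the prediction. This is a minimal, hypothetical example (the network, the attribution-by-contribution heuristic, and the edit magnitude are all assumptions for illustration), not the procedure of any specific knowledge-editing paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-layer MLP: hidden activations play the role of candidate
# "knowledge neurons" (dimensions chosen arbitrarily for the demo).
W_in = rng.normal(size=(8, 4))    # input dim 4 -> 8 hidden units
W_out = rng.normal(size=(3, 8))   # 8 hidden units -> 3 output "facts"

def forward(x):
    h = np.maximum(0.0, W_in @ x)  # ReLU hidden activations
    return W_out @ h, h

x = rng.normal(size=4)             # a toy "prompt" encoding
logits, h = forward(x)
target = 2                         # output unit we want to edit toward

# Attribution heuristic: each hidden neuron's contribution to the
# target logit is its activation times its outgoing weight.
contrib = W_out[target] * h
knowledge_neuron = int(np.argmax(np.abs(contrib)))

# Edit: boost that single neuron's weight into the target output.
# Since ReLU activations are non-negative, this cannot lower the logit.
W_out[target, knowledge_neuron] += 5.0

new_logits, _ = forward(x)
assert new_logits[target] >= logits[target]
```

Real methods in this line of work identify such neurons in pretrained transformers via gradient-based attribution over many prompts, and cross-lingual editing asks whether an edit made through one language's prompts transfers to the same fact queried in other languages.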

Papers