Cross-Linguistic

Cross-linguistic research focuses on understanding and leveraging linguistic diversity in computational linguistics, with the goal of developing language technologies that work effectively across many languages. Current work relies heavily on large language models (LLMs) and explores techniques such as multilingual fine-tuning, parameter-efficient methods (e.g., adapters, LoRA), and cross-lingual transfer learning to address the challenges posed by low-resource languages and diverse linguistic structures. This research is crucial for broadening global access to NLP tools and for improving applications such as machine translation, information retrieval, and social media analysis across diverse cultural contexts. Investigations into how linguistic typology and cultural factors affect model performance are also gaining prominence.
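To make the parameter-efficient cross-lingual transfer mentioned above concrete, the following is a minimal sketch of a common setup: a multilingual encoder is wrapped with LoRA adapters, fine-tuned on labeled data in a high-resource language, and then evaluated zero-shot on a target language. It assumes the Hugging Face transformers and peft libraries; the model name (xlm-roberta-base), target modules, and hyperparameters are illustrative choices, not a prescription drawn from the papers listed below.

```python
# Minimal sketch: LoRA-based parameter-efficient fine-tuning of a
# multilingual encoder for cross-lingual transfer.
# Assumes: pip install torch transformers peft
from transformers import AutoModelForSequenceClassification, AutoTokenizer
from peft import LoraConfig, TaskType, get_peft_model

model_name = "xlm-roberta-base"  # illustrative multilingual encoder
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(
    model_name, num_labels=3  # e.g., a 3-way classification task
)

# Attach low-rank adapters to the attention projections; only these
# adapter weights (a small fraction of the model) are trained.
lora_config = LoraConfig(
    task_type=TaskType.SEQ_CLS,
    r=8,                 # rank of the low-rank update matrices
    lora_alpha=16,       # scaling factor for the update
    lora_dropout=0.1,
    target_modules=["query", "value"],  # XLM-R attention linear layers
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # typically well under 1% of all parameters

# Typical cross-lingual transfer recipe:
#   1. fine-tune the adapters on labeled data in a high-resource
#      language (often English),
#   2. evaluate zero-shot on the target language(s), relying on the
#      shared multilingual representations of the frozen base model.
batch = tokenizer(["Ein kurzes Beispiel."], return_tensors="pt")
logits = model(**batch).logits  # zero-shot prediction on non-English input
```

The design choice illustrated here is that only the adapter parameters are updated, which keeps per-language storage and training cost low and makes it practical to maintain separate adapters for many low-resource languages.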

Papers