Multilingual Detection

Multilingual detection focuses on developing computational methods to identify and classify information across multiple languages, addressing challenges such as disinformation, hallucination in large language models (LLMs), and the detection of specific information types (e.g., temporal expressions, employment status). Current research relies heavily on transformer-based architectures, often combined with techniques such as adapter fusion and active learning to improve efficiency and accuracy across diverse linguistic contexts. The field is crucial for combating misinformation, improving the reliability of LLMs, and enabling cross-lingual information extraction for applications ranging from social science research to labor market analysis.
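
As a concrete illustration of the transformer-based approach mentioned above, the sketch below scores texts in several languages with a single multilingual encoder and a shared classification head. It is a minimal example under stated assumptions, not a method from any surveyed paper: the model name (`xlm-roberta-base`), the binary label set, and the example sentences are all illustrative, and the classification head would need fine-tuning on labeled multilingual data before its scores mean anything.

```python
# Minimal sketch of multilingual detection with a pretrained multilingual
# transformer encoder (assumption: XLM-RoBERTa) and a binary classifier head.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "xlm-roberta-base"  # multilingual encoder covering ~100 languages

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(
    MODEL_NAME,
    num_labels=2,  # hypothetical labels: 0 = reliable, 1 = disinformation
)
# NOTE: the classification head is randomly initialized here; in practice it
# must be fine-tuned on labeled multilingual data before use.
model.eval()

texts = [
    "The new vaccine was approved after large clinical trials.",    # English
    "Le nouveau vaccin a été approuvé après de vastes essais.",     # French
    "Die Regierung verheimlicht angeblich alle Studienergebnisse.", # German
]

# Tokenize a mixed-language batch; the shared subword vocabulary lets one
# model handle all languages without language-specific preprocessing.
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    logits = model(**batch).logits
    probs = torch.softmax(logits, dim=-1)

for text, p in zip(texts, probs):
    print(f"{p[1].item():.2f}  {text}")
```

The same pattern extends to the other detection tasks named above (e.g., temporal expressions or employment status) by swapping the label set and, for token-level tasks, the classification head.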

Papers