Language Understanding
Language understanding research aims to enable computers to comprehend and process human language as effectively as humans do, spanning both natural language understanding (NLU) and natural language generation (NLG). Current work emphasizes making models robust to noise, ambiguity, and bias, typically building on transformer architectures and drawing on techniques such as grammar induction, retrieval-augmented generation, and mixture-of-experts routing to improve performance across diverse tasks. These advances matter for a range of applications, including better chatbots, more accurate machine translation, and improved accessibility for people with communication challenges.
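To make the mixture-of-experts idea concrete, here is a minimal, illustrative sketch of top-k expert routing for a single token vector. It is not drawn from any paper listed below; the names (`moe_forward`, `gate_w`), the toy linear experts, and the tiny dimensions are all hypothetical choices for the example.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Hypothetical toy setup: 4 "experts", each a simple linear map.
rng = np.random.default_rng(0)
d_model, n_experts, top_k = 8, 4, 2
experts = [rng.normal(size=(d_model, d_model)) for _ in range(n_experts)]
gate_w = rng.normal(size=(n_experts, d_model))

def moe_forward(x):
    """Route a token vector to its top-k experts and mix their outputs."""
    scores = softmax(gate_w @ x)               # gating distribution over experts
    top = np.argsort(scores)[-top_k:]          # indices of the k highest-scoring experts
    weights = scores[top] / scores[top].sum()  # renormalize over the chosen experts
    return sum(w * (experts[i] @ x) for i, w in zip(top, weights))

token = rng.normal(size=d_model)
print(moe_forward(token).shape)  # (8,)
```

Sparse routing of this kind is what lets mixture-of-experts models grow their parameter count without a proportional increase in per-token compute: only the top-k experts run for each token.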
Papers
On-the-fly Denoising for Data Augmentation in Natural Language Understanding
Tianqing Fang, Wenxuan Zhou, Fangyu Liu, Hongming Zhang, Yangqiu Song, Muhao Chen
MULTI3NLU++: A Multilingual, Multi-Intent, Multi-Domain Dataset for Natural Language Understanding in Task-Oriented Dialogue
Nikita Moghe, Evgeniia Razumovskaia, Liane Guillou, Ivan Vulić, Anna Korhonen, Alexandra Birch
Quirk or Palmer: A Comparative Study of Modal Verb Frameworks with Annotated Datasets
Risako Owan, Maria Gini, Dongyeop Kang
PLUE: Language Understanding Evaluation Benchmark for Privacy Policies in English
Jianfeng Chi, Wasi Uddin Ahmad, Yuan Tian, Kai-Wei Chang