Language Understanding
Language understanding research aims to enable computers to comprehend and process human language as effectively as humans do, spanning both natural language understanding (NLU) and natural language generation (NLG). Current research emphasizes improving model robustness to noise, ambiguity, and bias, often employing transformer-based architectures, grammar induction techniques, and methods such as retrieval-augmented generation and mixture-of-experts to enhance performance across diverse tasks. These advances have significant implications for applications including better chatbots, more effective machine translation, and improved accessibility for individuals with communication challenges.
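Since retrieval-augmented generation is named above only in passing, the following minimal Python sketch illustrates the general pattern: retrieve the passages most relevant to a query, then prepend them as context to the prompt given to a language model. The corpus, the word-overlap scoring, and the prompt template are illustrative assumptions for this sketch, not taken from any of the papers listed below.

from collections import Counter

# Toy document store; a real system would use a large indexed corpus. (Hypothetical.)
CORPUS = [
    "Transformers process tokens in parallel using self-attention.",
    "Mixture-of-experts layers route each token to a few expert networks.",
    "Retrieval-augmented generation grounds model outputs in fetched text.",
]

def score(query: str, passage: str) -> int:
    """Toy relevance score: how many query words appear in the passage."""
    query_words = Counter(query.lower().split())
    passage_words = set(passage.lower().split())
    return sum(count for word, count in query_words.items() if word in passage_words)

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k highest-scoring passages for the query."""
    return sorted(CORPUS, key=lambda p: score(query, p), reverse=True)[:k]

def build_prompt(query: str) -> str:
    """Assemble a prompt that grounds the model's answer in retrieved context."""
    context = "\n".join(retrieve(query))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

if __name__ == "__main__":
    # In a real pipeline this prompt would be passed to a language model;
    # printing it here just makes the retrieve-then-generate pattern visible.
    print(build_prompt("How does retrieval-augmented generation work?"))

Production systems replace the word-overlap score with dense vector similarity over learned embeddings, but the control flow (retrieve, then condition generation on the retrieved text) is the same.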
Papers
Rethinking the Evaluating Framework for Natural Language Understanding in AI Systems: Language Acquisition as a Core for Future Metrics
Patricio Vera, Pedro Moya, Lisa Barraza
Improving Language Model-Based Zero-Shot Text-to-Speech Synthesis with Multi-Scale Acoustic Prompts
Shun Lei, Yixuan Zhou, Liyang Chen, Dan Luo, Zhiyong Wu, Xixin Wu, Shiyin Kang, Tao Jiang, Yahui Zhou, Yuxing Han, Helen Meng
LLM4Jobs: Unsupervised occupation extraction and standardization leveraging Large Language Models
Nan Li, Bo Kang, Tijl De Bie
Proposition from the Perspective of Chinese Language: A Chinese Proposition Classification Evaluation Benchmark
Conghui Niu, Mengyang Hu, Lin Bo, Xiaoli He, Dong Yu, Pengyuan Liu