Natural Language
Natural language processing (NLP) focuses on enabling computers to understand, interpret, and generate human language. Current research relies heavily on large language models (LLMs), such as BERT, to tackle diverse tasks, including text-to-SQL translation, semantic analysis of images, and controlling robots through natural language commands. The field's impact spans many sectors: improving search engines and e-commerce platforms, advancing healthcare diagnostics, and enabling more efficient scientific research through automated literature analysis and data extraction.
Papers
A Comparative Study of Transformer-based Neural Text Representation Techniques on Bug Triaging
Atish Kumar Dipongkor, Kevin Moran
Let Models Speak Ciphers: Multiagent Debate through Embeddings
Chau Pham, Boyi Liu, Yingxiang Yang, Zhengyu Chen, Tianyi Liu, Jianbo Yuan, Bryan A. Plummer, Zhaoran Wang, Hongxia Yang
Evolution of Natural Language Processing Technology: Not Just Language Processing Towards General Purpose AI
Masahiro Yamamoto
AXNav: Replaying Accessibility Tests from Natural Language
Maryam Taeb, Amanda Swearngin, Eldon Schoop, Ruijia Cheng, Yue Jiang, Jeffrey Nichols
Can Large Language Models Transform Natural Language Intent into Formal Method Postconditions?
Madeline Endres, Sarah Fakhoury, Saikat Chakraborty, Shuvendu K. Lahiri