Human Language
Human language research aims to understand how humans process, produce, and learn language, focusing on both its cognitive and computational aspects. Current research relies heavily on large language models (LLMs) and vision-language models (VLMs), applying them to tasks such as word sense disambiguation, cross-modal reasoning, and the analysis of language in diverse contexts, from online communities to medical images. These advances are improving machine translation, text-to-speech synthesis, and other applications, while also providing new tools for investigating fundamental questions about human cognition and language acquisition.
Papers
Perceptions of Linguistic Uncertainty by Language Models and Humans
Catarina G Belem, Markelle Kelly, Mark Steyvers, Sameer Singh, Padhraic Smyth
Text-to-Battery Recipe: A language modeling-based protocol for automatic battery recipe extraction and retrieval
Daeun Lee, Jaewoong Choi, Hiroshi Mizuseki, Byungju Lee