Language Processing
Language processing research aims to understand how humans and machines process language, focusing on the computational mechanisms underlying comprehension and production. Current work uses transformer-based models such as BERT and GPT, along with recurrent neural networks such as LSTMs, to analyze syntactic structure, semantic relationships, and pragmatic inference. Model behavior is often compared against human brain activity (e.g., EEG, fMRI) to assess cognitive plausibility. These investigations advance both artificial intelligence and our understanding of human cognition, potentially leading to improved language technologies and deeper insight into the neural basis of language.
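One standard way such model-to-brain comparisons are carried out is with a cross-validated encoding model: sentence representations are extracted from a pretrained language model and used to predict recorded brain responses, with held-out prediction accuracy taken as a measure of fit. The sketch below illustrates the idea, assuming synthetic voxel data and a hypothetical stimulus list; a real study would use actual fMRI or EEG responses aligned to the same sentences.

```python
# Minimal sketch of a model-to-brain encoding analysis.
# Assumptions: the stimulus sentences are hypothetical, and Y is
# synthetic stand-in data where recorded voxel responses would go.
import numpy as np
import torch
from transformers import AutoTokenizer, AutoModel
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

# Hypothetical stimulus sentences (placeholders for a real stimulus set).
sentences = [
    "The cat sat on the mat.",
    "She handed the letter to her brother.",
    "Rain fell steadily through the night.",
    "The committee postponed the final vote.",
    "He forgot where he parked the car.",
    "The bridge was closed for repairs.",
    "Children played outside until dusk.",
    "The recipe calls for two eggs.",
]

def sentence_embedding(sentence: str) -> np.ndarray:
    """Mean-pool the final-layer token states into one sentence vector."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    return outputs.last_hidden_state.mean(dim=1).squeeze(0).numpy()

X = np.stack([sentence_embedding(s) for s in sentences])  # (n_sents, 768)

# Synthetic "brain" data: in practice, voxel responses to the same stimuli.
rng = np.random.default_rng(0)
Y = rng.standard_normal((len(sentences), 200))  # (n_sents, n_voxels)

# Cross-validated ridge regression: predict each voxel from embeddings,
# then score held-out predictions by Pearson correlation per voxel.
preds = np.zeros_like(Y)
for train, test in KFold(n_splits=4, shuffle=True, random_state=0).split(X):
    preds[test] = Ridge(alpha=1.0).fit(X[train], Y[train]).predict(X[test])

scores = np.array(
    [np.corrcoef(preds[:, v], Y[:, v])[0, 1] for v in range(Y.shape[1])]
)
print(f"mean held-out voxel correlation: {scores.mean():.3f}")
```

With random Y the mean correlation hovers near zero, as it should; with real recordings, reliably above-chance held-out correlations in language-selective regions are what studies in this area report as evidence of shared representations.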