Language Processing
Language processing research aims to understand how humans and machines process language, focusing on the computational mechanisms underlying comprehension and production. Current work applies transformer-based models such as BERT and GPT, alongside recurrent architectures like LSTMs, to analyze syntactic structure, semantic relationships, and pragmatic inference, often comparing model behavior to human brain activity (e.g., EEG, fMRI) to assess cognitive plausibility. These investigations advance both artificial intelligence and our understanding of human cognition, potentially yielding improved language technologies and deeper insights into the neural basis of language.
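As a minimal sketch of the model-to-brain comparison described above: the snippet below extracts hidden states from a pretrained transformer and correlates them with a brain-activity vector. It assumes the HuggingFace transformers library and the bert-base-uncased checkpoint; the fMRI response here is synthetic stand-in data, not a real recording, and real studies typically fit a learned encoding model rather than a raw correlation.

```python
# Sketch: compare a transformer's sentence representation to (hypothetical)
# brain-activity data. Requires: pip install transformers torch numpy
import numpy as np
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

sentence = "The cat sat on the mat."
inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs, output_hidden_states=True)

# Mean-pool the final layer's token embeddings into one sentence vector.
sentence_embedding = outputs.last_hidden_state.mean(dim=1).squeeze().numpy()

# Hypothetical fMRI response for the same sentence (synthetic placeholder);
# in practice this would be voxel activations aligned to the stimulus.
rng = np.random.default_rng(0)
fmri_response = rng.standard_normal(sentence_embedding.shape[0])

# Pearson correlation as a simple stand-in for encoding-model evaluation.
r = np.corrcoef(sentence_embedding, fmri_response)[0, 1]
print(f"model-brain correlation: {r:.3f}")
```

In published encoding-model work, a regularized linear regression is usually fit from model features to measured responses on held-out stimuli; the raw correlation above only illustrates the shape of the comparison.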