Language Processing

Language processing research aims to understand how humans and machines process language, focusing on the computational mechanisms underlying comprehension and production. Current work applies transformer-based models such as BERT and GPT, alongside recurrent architectures such as LSTMs, to analyze syntactic structure, semantic relationships, and pragmatic inference. Model behavior is often compared against human brain activity (e.g., EEG, fMRI) to assess cognitive plausibility. These investigations advance both artificial intelligence and the scientific understanding of human cognition, with the potential to improve language technologies and deepen insight into the neural basis of language.
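
One concrete form such model-brain comparisons take is an encoding model: representations extracted from a pretrained transformer are regressed onto per-stimulus brain responses, and predictive accuracy on held-out stimuli indicates how well the model's features explain the neural signal. The sketch below illustrates this pattern in Python; the model choice (bert-base-uncased), mean-pooling, the ridge penalty, and the synthetic stand-in for fMRI data are all illustrative assumptions rather than specifics from this overview.

```python
"""
Sketch of a ridge-regression encoding model relating transformer sentence
embeddings to (here, synthetic) per-sentence fMRI responses. Model name,
pooling strategy, and data are illustrative assumptions.
"""
import numpy as np
import torch
from sklearn.linear_model import Ridge
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

# Illustrative stimulus sentences (a real study would use hundreds).
sentences = [
    "The cat sat on the mat.",
    "She handed the letter to her brother.",
    "Rain fell steadily through the night.",
    "The committee approved the new proposal.",
    "He forgot where he parked the car.",
    "Bright lights flickered across the harbor.",
    "The child stacked the blocks carefully.",
    "They argued about the election results.",
]

def embed(sentence: str) -> np.ndarray:
    """Mean-pool the final hidden layer into one vector per sentence."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, n_tokens, 768)
    return hidden.mean(dim=1).squeeze(0).numpy()

X = np.stack([embed(s) for s in sentences])  # (n_sentences, 768)

# Placeholder "brain data": random voxel responses stand in for real fMRI
# measurements, which would share this (n_sentences, n_voxels) shape.
rng = np.random.default_rng(0)
Y = rng.standard_normal((len(sentences), 50))

# Fit on the first six sentences; evaluate on the held-out two. The R^2
# here will be poor by construction, since Y is pure noise.
ridge = Ridge(alpha=1.0).fit(X[:6], Y[:6])
print(f"held-out R^2: {ridge.score(X[6:], Y[6:]):.3f}")
```

In practice the same pipeline is run per voxel or sensor with cross-validated regularization, and the choice of which transformer layer to read out from is itself a research question, since different layers track different linguistic properties.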

Papers