Linguistic Information

Linguistic information research examines how the structural and semantic properties of language are represented and processed, both by humans and by computational models. Current work focuses on how transformer-based models such as BERT and GPT encode linguistic information in their embeddings, using techniques such as probing and sparsification to analyze these representations and to improve model interpretability and performance on tasks like question answering and machine translation. This line of research is central to advancing natural language processing, improving the accuracy and efficiency of language technologies, and deepening our understanding of the cognitive mechanisms underlying human language comprehension and production.
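The probing technique mentioned above can be illustrated with a small sketch: a simple linear classifier is trained on frozen embeddings to predict a linguistic property, and high probe accuracy suggests that the property is linearly decodable from the representation. The data here is synthetic and stands in for real model embeddings (in practice they would come from a pretrained model, e.g. BERT hidden states); the property, dimensionality, and training setup are illustrative assumptions, not a specific published method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for frozen 768-dim token embeddings, with a binary
# linguistic property (e.g. noun vs. verb) encoded linearly in the
# first few dimensions. Real probes would extract X from a pretrained
# transformer and y from annotated corpora.
n, d = 1000, 768
X = rng.normal(size=(n, d))
w_true = np.zeros(d)
w_true[:5] = 2.0
y = (X @ w_true > 0).astype(float)

# Train a logistic-regression probe with plain gradient descent,
# keeping the "embeddings" fixed and learning only the probe weights.
w = np.zeros(d)
b = 0.0
lr = 0.1
for _ in range(500):
    z = np.clip(X @ w + b, -30, 30)   # clip to avoid overflow in exp
    p = 1.0 / (1.0 + np.exp(-z))      # predicted probabilities
    w -= lr * (X.T @ (p - y)) / n     # gradient step on the probe
    b -= lr * float(np.mean(p - y))

# Accuracy of the probe: high accuracy indicates the property is
# linearly recoverable from the representation. (A careful probing
# study would evaluate on held-out data and compare against baselines.)
acc = float(np.mean(((X @ w + b) > 0) == (y == 1)))
print(f"probe accuracy: {acc:.2f}")
```

A common caveat in the probing literature is that a powerful probe can learn the property itself rather than read it off the representation, which is why simple linear probes and control baselines are typically preferred.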

Papers