Language Acquisition
Language acquisition research investigates how humans learn language, focusing on the underlying cognitive mechanisms and computational principles. Current work employs neural language models, such as transformers and recurrent neural networks, often trained on child-directed speech or in simulated learning environments, to explore aspects of language development including syntactic and semantic bootstrapping, critical periods, and the role of multimodal input (vision and speech). These studies aim to bridge the gap between human and artificial language learning, informing both our understanding of human cognition and the development of more robust, human-like AI systems.
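To make the typical setup concrete, the sketch below trains a tiny recurrent language model on a few child-directed-style utterances with next-word prediction, the basic objective used in much of this line of work. Everything in it is illustrative: the example utterances, model size, and hyperparameters are placeholders and are not drawn from any paper listed on this page; real studies train larger models on corpora such as CHILDES-style transcripts.

```python
# Minimal sketch: a tiny LSTM language model trained on toy
# child-directed-style utterances (illustrative placeholders only).
import torch
import torch.nn as nn

utterances = [
    "look at the doggy",
    "where is the ball",
    "do you want more juice",
    "the doggy is sleeping",
]

# Word-level vocabulary with padding and end-of-utterance tokens.
words = sorted({w for u in utterances for w in u.split()})
vocab = {"<pad>": 0, "<eos>": 1, **{w: i + 2 for i, w in enumerate(words)}}

def encode(utt):
    return [vocab[w] for w in utt.split()] + [vocab["<eos>"]]

# Pad all utterances to a common length so they can be batched.
encoded = [encode(u) for u in utterances]
max_len = max(len(e) for e in encoded)
batch = torch.tensor([e + [0] * (max_len - len(e)) for e in encoded])

class TinyLM(nn.Module):
    """Embedding -> LSTM -> next-word logits."""
    def __init__(self, vocab_size, dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim, padding_idx=0)
        self.rnn = nn.LSTM(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, vocab_size)

    def forward(self, x):
        h, _ = self.rnn(self.embed(x))
        return self.out(h)

model = TinyLM(len(vocab))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss(ignore_index=0)  # skip padding positions

# Next-word prediction: the model sees tokens 0..t-1 and predicts token t.
inputs, targets = batch[:, :-1], batch[:, 1:]
for step in range(200):
    logits = model(inputs)
    loss = loss_fn(logits.reshape(-1, len(vocab)), targets.reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()

print(f"final training loss: {loss.item():.3f}")
```

Studies in this area then probe such models for the phenomena mentioned above, for example by comparing learning trajectories or generalization behavior against child data, rather than only reporting training loss.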