Language Acquisition
Language acquisition research investigates how humans learn language, seeking to uncover the cognitive mechanisms and computational principles that underlie it. Current work employs neural language models, such as transformers and recurrent neural networks, often trained on child-directed speech or simulated learning environments, to explore key aspects of language development, including syntactic and semantic bootstrapping, critical periods, and the role of multimodal input (vision and speech). These studies aim to bridge the gap between human and artificial language learning, informing both our understanding of human cognition and the development of more robust, human-like AI systems.
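To make the modeling setup concrete, the following is a minimal sketch, in PyTorch, of the kind of experiment described above: a small recurrent language model trained on child-directed utterances with a next-word-prediction objective. The toy utterances, the `TinyLM` class, and all hyperparameters are illustrative assumptions, not taken from any particular paper; real studies would train on a corpus such as CHILDES and use a larger transformer or LSTM.

```python
import torch
import torch.nn as nn

# Tiny stand-in for a child-directed speech corpus (assumed example data).
utterances = [
    "look at the doggy",
    "where is the ball",
    "do you want more juice",
    "the doggy is sleeping",
]

# Word-level vocabulary with special tokens.
words = sorted({w for u in utterances for w in u.split()})
vocab = ["<pad>", "<bos>", "<eos>"] + words
stoi = {w: i for i, w in enumerate(vocab)}

def encode(utterance):
    # <bos> w1 w2 ... <eos>
    return [stoi["<bos>"]] + [stoi[w] for w in utterance.split()] + [stoi["<eos>"]]

class TinyLM(nn.Module):
    """A small LSTM language model of the kind trained on child-directed speech."""
    def __init__(self, vocab_size, dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.lstm = nn.LSTM(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, vocab_size)

    def forward(self, tokens):
        hidden, _ = self.lstm(self.embed(tokens))
        return self.out(hidden)

model = TinyLM(len(vocab))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss(ignore_index=stoi["<pad>"])

# Next-word prediction: at each position the model predicts the following token.
for epoch in range(100):
    for u in utterances:
        ids = torch.tensor(encode(u)).unsqueeze(0)   # shape (1, T)
        logits = model(ids[:, :-1])                  # predictions for tokens 1..T-1
        loss = loss_fn(logits.reshape(-1, len(vocab)), ids[:, 1:].reshape(-1))
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

Studies of this kind typically go beyond training itself, comparing the model's learning trajectory, surprisal profile, or generalization behavior against developmental data from children to test how far a given learning mechanism can explain human acquisition.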