Language Acquisition

Language acquisition research investigates how humans learn language, with the goal of uncovering the cognitive mechanisms and computational principles that underlie it. Current work trains neural language models, such as transformers and recurrent neural networks, on child-directed speech or in simulated learning environments to probe aspects of language development, including syntactic and semantic bootstrapping, critical periods, and the role of multimodal input (vision and speech). These studies aim to bridge the gap between human and artificial language learning, informing both our understanding of human cognition and the development of more robust, human-like AI systems.
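
To make the modelling setup concrete, here is a minimal sketch of the general idea: fit a distributional model to (toy) child-directed speech and measure per-token surprisal, a common linking quantity between model predictions and human processing. This uses a simple add-alpha-smoothed bigram model rather than a transformer, and the tiny corpus is a made-up stand-in for real data such as CHILDES; it is an illustration of the paradigm, not any particular paper's method.

```python
from collections import Counter, defaultdict
import math

# Toy stand-in for child-directed speech (real studies use corpora such as CHILDES).
utterances = [
    "look at the doggy",
    "the doggy is big",
    "look at the ball",
    "the ball is red",
]

# Count bigrams, with sentence-boundary markers.
bigrams = defaultdict(Counter)
vocab = set()
for utt in utterances:
    tokens = ["<s>"] + utt.split() + ["</s>"]
    vocab.update(tokens)
    for prev, nxt in zip(tokens, tokens[1:]):
        bigrams[prev][nxt] += 1

def prob(prev, nxt, alpha=1.0):
    """Add-alpha smoothed bigram probability P(nxt | prev)."""
    counts = bigrams[prev]
    return (counts[nxt] + alpha) / (sum(counts.values()) + alpha * len(vocab))

def surprisal(utterance):
    """Per-token surprisal in bits: -log2 P(token | previous token)."""
    tokens = ["<s>"] + utterance.split() + ["</s>"]
    return [-math.log2(prob(p, n)) for p, n in zip(tokens, tokens[1:])]

# An attested continuation is less surprising than a novel one,
# the kind of contrast acquisition studies track over training.
seen = sum(surprisal("look at the doggy"))
novel = sum(surprisal("look at the kitty"))
```

Swapping the bigram counts for a neural model trained incrementally on the same input lets one track how surprisal profiles change over "development", which is how many of the papers below compare model learning trajectories to children's.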

Papers