Generative Pre-trained Transformer
Generative Pre-trained Transformers (GPTs) are large language models designed to generate human-like text and, increasingly, other data modalities. Current research focuses on improving GPT performance across diverse applications, including code generation, scientific modeling (e.g., seismic velocity modeling, cellular automata simulation), and healthcare (e.g., EHR generation, medical image analysis), often employing techniques such as retrieval-augmented generation and in-context learning. The ability of GPTs to process and generate varied data types makes them a powerful tool with significant implications across many fields, driving advances in both scientific understanding and practical applications.
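To make the retrieval-augmented generation pattern concrete, here is a minimal, self-contained sketch. Everything in it is illustrative: the `DOCUMENTS` store, the bag-of-words `retrieve` function, and the `generate` stub are hypothetical stand-ins (a real system would use an embedding-based vector store and an actual GPT API), but the overall flow of retrieve-then-prompt is the core of the technique.

```python
# Minimal sketch of retrieval-augmented generation (RAG).
# `generate` is a hypothetical stand-in for any GPT-style completion API;
# everything else runs as-is.

from collections import Counter
import math

# Toy document store standing in for a real vector database.
DOCUMENTS = [
    "Seismic velocity models estimate how fast waves travel through rock layers.",
    "Electronic health records (EHRs) capture patient histories in structured form.",
    "Cellular automata evolve grids of cells according to local update rules.",
]

def bow_cosine(a: str, b: str) -> float:
    """Cosine similarity over bag-of-words counts (a stand-in for embeddings)."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm = (math.sqrt(sum(c * c for c in va.values()))
            * math.sqrt(sum(c * c for c in vb.values())))
    return dot / norm if norm else 0.0

def retrieve(query: str, k: int = 1) -> list[str]:
    """Return the k documents most similar to the query."""
    return sorted(DOCUMENTS, key=lambda d: bow_cosine(query, d), reverse=True)[:k]

def generate(prompt: str) -> str:
    """Hypothetical GPT call; replace with a real model client."""
    return f"[model completion for a prompt of {len(prompt)} characters]"

def rag_answer(question: str) -> str:
    """Augment the prompt with retrieved context before generation."""
    context = "\n".join(retrieve(question, k=1))
    prompt = f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    return generate(prompt)

print(rag_answer("How are seismic velocities modeled?"))
```

In-context learning follows the same prompt-construction idea: instead of retrieved documents, a handful of worked input-output examples are placed in the prompt so the model infers the task without any weight updates.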