Generative Pre-trained Transformer
Generative Pre-trained Transformers (GPTs) are large language models designed to generate human-like text and, increasingly, other data modalities. Current research focuses on improving GPT performance across diverse applications, including code generation, scientific modeling (e.g., seismic velocity modeling, cellular automata simulation), and healthcare (e.g., EHR generation, medical image analysis), often employing techniques such as retrieval-augmented generation and in-context learning, sketched below. The ability of GPTs to process and generate varied data types makes them a powerful tool with significant implications across many fields, driving advances in both scientific understanding and practical applications.
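The two techniques named above can be illustrated without tying the example to any particular model API. The sketch below is a minimal, model-agnostic illustration: a toy retriever scores documents by word overlap (a stand-in for the dense-embedding search a real retrieval-augmented generation system would use), and the retrieved passage plus a few worked question-answer pairs are assembled into a prompt for in-context learning. The `generate` function, the sample documents, and the few-shot examples are all hypothetical placeholders, not taken from any of the papers surveyed here.

```python
# Minimal, model-agnostic sketch of retrieval-augmented generation (RAG)
# combined with few-shot in-context learning. The retriever scores
# documents by word overlap; a production system would use dense
# embeddings. `generate` is a hypothetical stand-in for a GPT API call.

from collections import Counter

DOCUMENTS = [  # toy corpus standing in for a real knowledge base
    "Seismic velocity models map how fast waves travel through rock layers.",
    "Electronic health records (EHRs) store patient histories and lab results.",
    "Cellular automata evolve grids of cells via local update rules.",
]

FEW_SHOT_EXAMPLES = [  # in-context examples shown to the model verbatim
    ("What do EHRs contain?", "Patient histories and lab results."),
]

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Return the k documents sharing the most words with the query."""
    q = Counter(query.lower().split())
    scored = sorted(
        docs,
        key=lambda d: -sum((q & Counter(d.lower().split())).values()),
    )
    return scored[:k]

def build_prompt(query: str) -> str:
    """Assemble retrieved context + few-shot examples + the new question."""
    context = "\n".join(retrieve(query, DOCUMENTS))
    shots = "\n".join(f"Q: {q}\nA: {a}" for q, a in FEW_SHOT_EXAMPLES)
    return f"Context:\n{context}\n\n{shots}\nQ: {query}\nA:"

def generate(prompt: str) -> str:
    """Hypothetical placeholder for a call to an actual GPT model."""
    return "<model completion would appear here>"

if __name__ == "__main__":
    prompt = build_prompt("How do cellular automata work?")
    print(prompt)
    print(generate(prompt))
```

The prompt places the retrieved passage first and the demonstrations second, so the model can both ground its answer in external knowledge (the RAG component) and infer the expected answer format from the examples (the in-context learning component).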