Generative Pre-trained Transformer
Generative Pre-trained Transformers (GPTs) are large language models designed to generate human-like text and, increasingly, other data modalities. Current research focuses on improving GPT performance across diverse applications, including code generation, scientific modeling (e.g., seismic velocity modeling, cellular automata simulation), and healthcare (e.g., EHR generation, medical image analysis), often employing techniques such as retrieval-augmented generation and in-context learning to enhance capabilities. Because GPTs can process and generate many kinds of data, they serve as a powerful general-purpose tool, driving advances in both scientific understanding and practical applications across numerous fields.
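To make the retrieval-augmented generation and in-context learning patterns mentioned above concrete, here is a minimal sketch: it retrieves the passages most similar to a query using a simple bag-of-words cosine similarity, then assembles a prompt that combines the retrieved context with one in-context (few-shot) example. The `generate` function is a stub standing in for whatever GPT backend is used, and the passages, prompt format, and scoring scheme are illustrative assumptions, not taken from any specific paper.

```python
from collections import Counter
import math

# Toy corpus standing in for an external knowledge source (illustrative only).
PASSAGES = [
    "GPT models autoregressively predict the next token given prior context.",
    "Retrieval-augmented generation injects retrieved documents into the prompt.",
    "In-context learning conditions the model on worked examples at inference time.",
]

def bow(text: str) -> Counter:
    """Lowercased bag-of-words vector."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k passages most similar to the query."""
    q = bow(query)
    return sorted(PASSAGES, key=lambda p: cosine(q, bow(p)), reverse=True)[:k]

def build_prompt(query: str) -> str:
    """Assemble a RAG prompt with one in-context example (hypothetical format)."""
    context = "\n".join(f"- {p}" for p in retrieve(query))
    return (
        "Answer using only the context below.\n"
        f"Context:\n{context}\n\n"
        "Example:\nQ: What does GPT stand for?\nA: Generative Pre-trained Transformer.\n\n"
        f"Q: {query}\nA:"
    )

def generate(prompt: str) -> str:
    """Stub for an actual GPT call (hosted API or local model)."""
    return f"<model completion for prompt of {len(prompt)} chars>"

if __name__ == "__main__":
    print(generate(build_prompt("How does retrieval-augmented generation work?")))
```

In a real system, the bag-of-words scorer would typically be replaced by learned dense embeddings and an approximate nearest-neighbor index, but the overall flow (retrieve, assemble prompt with examples, generate) is the same.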