Foundation Model
Foundation models are large, pre-trained AI models designed to generalize across diverse tasks and datasets, offering an alternative to training task-specific models from scratch. Current research emphasizes adapting them to particular domains, including healthcare (e.g., medical image analysis, EEG interpretation), scientific applications (e.g., genomics, weather forecasting), and robotics, often using transformer architectures and mixtures of experts, where a learned gating function routes each input to a small subset of specialized sub-networks. By reusing the knowledge embedded in a single pre-trained model, this approach promises greater efficiency and accuracy across fields, streamlining data analysis and enabling applications previously hindered by data scarcity or computational limits.
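To make the mixture-of-experts routing mentioned above concrete, here is a minimal sketch of a top-k gated MoE layer in PyTorch. All names (SimpleMoE, n_experts, top_k) are illustrative, not drawn from any paper listed below; production systems typically add load-balancing losses and more efficient batched dispatch.

```python
# Minimal top-k gated mixture-of-experts layer (illustrative sketch).
import torch
import torch.nn as nn


class SimpleMoE(nn.Module):
    def __init__(self, d_model: int, n_experts: int = 4, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # Each expert is an independent feed-forward block.
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, 4 * d_model),
                          nn.GELU(),
                          nn.Linear(4 * d_model, d_model))
            for _ in range(n_experts)
        ])
        # The gating function scores every expert for every token.
        self.gate = nn.Linear(d_model, n_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model)
        scores = self.gate(x)                           # (B, S, n_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)  # keep top-k experts per token
        weights = torch.softmax(weights, dim=-1)        # normalize over the chosen k
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[..., k] == e                 # tokens whose k-th choice is expert e
                if mask.any():
                    out[mask] += weights[..., k][mask].unsqueeze(-1) * expert(x[mask])
        return out


if __name__ == "__main__":
    moe = SimpleMoE(d_model=32)
    tokens = torch.randn(2, 8, 32)
    print(moe(tokens).shape)  # torch.Size([2, 8, 32])
```

The key design point the sketch shows is sparsity: only top_k of the n_experts run for any given token, so parameter count grows with the number of experts while per-token compute stays roughly constant.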
Papers
Towards Training A Chinese Large Language Model for Anesthesiology
Zhonghai Wang, Jie Jiang, Yibing Zhan, Bohao Zhou, Yanhong Li, Chong Zhang, Liang Ding, Hua Jin, Jun Peng, Xu Lin, Weifeng Liu
Caduceus: Bi-Directional Equivariant Long-Range DNA Sequence Modeling
Yair Schiff, Chia-Hsiang Kao, Aaron Gokaslan, Tri Dao, Albert Gu, Volodymyr Kuleshov
Birbal: An efficient 7B instruct-model fine-tuned with curated datasets
Ashvini Kumar Jindal, Pawan Kumar Rajpoot, Ankur Parikh
An Improved Traditional Chinese Evaluation Suite for Foundation Model
Zhi-Rui Tam, Ya-Ting Pai, Yen-Wei Lee, Sega Cheng, Hong-Han Shuai
Differentially Private Synthetic Data via Foundation Model APIs 2: Text
Chulin Xie, Zinan Lin, Arturs Backurs, Sivakanth Gopi, Da Yu, Huseyin A Inan, Harsha Nori, Haotian Jiang, Huishuai Zhang, Yin Tat Lee, Bo Li, Sergey Yekhanin
Position Paper: Agent AI Towards a Holistic Intelligence
Qiuyuan Huang, Naoki Wake, Bidipta Sarkar, Zane Durante, Ran Gong, Rohan Taori, Yusuke Noda, Demetri Terzopoulos, Noboru Kuno, Ade Famoti, Ashley Llorens, John Langford, Hoi Vo, Li Fei-Fei, Katsu Ikeuchi, Jianfeng Gao
Learning to Deliver: a Foundation Model for the Montreal Capacitated Vehicle Routing Problem
Samuel J. K. Chin, Matthias Winkenbach, Akash Srivastava
Self-Supervised Learning in Electron Microscopy: Towards a Foundation Model for Advanced Image Analysis
Bashir Kazimi, Karina Ruzaeva, Stefan Sandfeld
OpenMEDLab: An Open-source Platform for Multi-modality Foundation Models in Medicine
Xiaosong Wang, Xiaofan Zhang, Guotai Wang, Junjun He, Zhongyu Li, Wentao Zhu, Yi Guo, Qi Dou, Xiaoxiao Li, Dequan Wang, Liang Hong, Qicheng Lao, Tong Ruan, Yukun Zhou, Yixue Li, Jie Zhao, Kang Li, Xin Sun, Lifeng Zhu, Shaoting Zhang
On the Societal Impact of Open Foundation Models
Sayash Kapoor, Rishi Bommasani, Kevin Klyman, Shayne Longpre, Ashwin Ramaswami, Peter Cihon, Aspen Hopkins, Kevin Bankston, Stella Biderman, Miranda Bogen, Rumman Chowdhury, Alex Engler, Peter Henderson, Yacine Jernite, Seth Lazar, Stefano Maffulli, Alondra Nelson, Joelle Pineau, Aviya Skowron, Dawn Song, Victor Storchan, Daniel Zhang, Daniel E. Ho, Percy Liang, Arvind Narayanan
Securing Reliability: A Brief Overview on Enhancing In-Context Learning for Foundation Models
Yunpeng Huang, Yaonan Gu, Jingwei Xu, Zhihong Zhu, Zhaorun Chen, Xiaoxing Ma
Data-Efficient Learning via Clustering-Based Sensitivity Sampling: Foundation Models and Beyond
Kyriakos Axiotis, Vincent Cohen-Addad, Monika Henzinger, Sammy Jerome, Vahab Mirrokni, David Saulpic, David Woodruff, Michael Wunder
Feature Re-Embedding: Towards Foundation Model-Level Performance in Computational Pathology
Wenhao Tang, Fengtao Zhou, Sheng Huang, Xiang Zhu, Yi Zhang, Bo Liu
Asymmetry in Low-Rank Adapters of Foundation Models
Jiacheng Zhu, Kristjan Greenewald, Kimia Nadjahi, Haitz Sáez de Ocáriz Borde, Rickard Brüel Gabrielsson, Leshem Choshen, Marzyeh Ghassemi, Mikhail Yurochkin, Justin Solomon
An Integrated Data Processing Framework for Pretraining Foundation Models
Yiding Sun, Feng Wang, Yutao Zhu, Wayne Xin Zhao, Jiaxin Mao