Foundation Model
Foundation models are large, pre-trained AI models designed to generalize across diverse tasks and datasets, offering a powerful alternative to task-specific models. Current research emphasizes adapting them to domains such as healthcare (e.g., medical image analysis, EEG interpretation), scientific applications (e.g., genomics, weather forecasting), and robotics, often using architectures such as transformers and mixtures of experts with novel gating functions. By leveraging the knowledge embedded in these models, this line of work promises greater efficiency and accuracy across many fields, streamlining data analysis and enabling applications that were previously hindered by data scarcity or computational limits.
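The mixture-of-experts designs mentioned above route each input through a small subset of expert subnetworks chosen by a learned gating function. As a rough illustration only, and not drawn from any paper listed below, a minimal top-k gating step might look like the sketch that follows; the function name, dimensions, and routing scheme are all assumptions made for this example.

import torch
import torch.nn.functional as F

def top_k_gate(x, gate_weights, k=2):
    """Illustrative top-k gating: route each token to its k highest-scoring experts.

    x:            (batch, dim) input activations
    gate_weights: (dim, num_experts) learned gating matrix
    Returns per-token expert indices and normalized routing weights.
    """
    logits = x @ gate_weights                       # (batch, num_experts) gating scores
    topk_logits, topk_idx = logits.topk(k, dim=-1)  # keep only the k best experts per token
    weights = F.softmax(topk_logits, dim=-1)        # normalize weights over the chosen experts
    return topk_idx, weights

# Tiny usage example with random data (hypothetical sizes).
x = torch.randn(4, 16)     # 4 tokens, hidden size 16
gate = torch.randn(16, 8)  # gating matrix for 8 experts
idx, w = top_k_gate(x, gate)
print(idx.shape, w.shape)  # torch.Size([4, 2]) torch.Size([4, 2])

In such a design only the selected experts are evaluated for each token, which is what lets mixture-of-experts models grow in parameter count without a proportional increase in compute per input.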
Papers
Integrating kNN with Foundation Models for Adaptable and Privacy-Aware Image Classification
Sebastian Doerrich, Tobias Archut, Francesco Di Salvo, Christian Ledig
ChartX & ChartVLM: A Versatile Benchmark and Foundation Model for Complicated Chart Reasoning
Renqiu Xia, Bo Zhang, Hancheng Ye, Xiangchao Yan, Qi Liu, Hongbin Zhou, Zijun Chen, Min Dou, Botian Shi, Junchi Yan, Yu Qiao
Rewards-in-Context: Multi-objective Alignment of Foundation Models with Dynamic Preference Adjustment
Rui Yang, Xiaoman Pan, Feng Luo, Shuang Qiu, Han Zhong, Dong Yu, Jianshu Chen
BrainWave: A Brain Signal Foundation Model for Clinical Applications
Zhizhang Yuan, Fanqi Shen, Meng Li, Yuguo Yu, Chenhao Tan, Yang Yang
Embracing the black box: Heading towards foundation models for causal discovery from time series data
Gideon Stein, Maha Shadaydeh, Joachim Denzler
Learning Interpretable Concepts: Unifying Causal Representation Learning and Foundation Models
Goutham Rajendran, Simon Buchholz, Bryon Aragam, Bernhard Schölkopf, Pradeep Ravikumar
Advancing Human Action Recognition with Foundation Models trained on Unlabeled Public Videos
Yang Qian, Yinan Sun, Ali Kargarandehkordi, Onur Cezmi Mutlu, Saimourya Surabhi, Pingyi Chen, Zain Jabbar, Dennis Paul Wall, Peter Washington
DNABERT-S: Pioneering Species Differentiation with Species-Aware DNA Embeddings
Zhihan Zhou, Weimin Wu, Harrison Ho, Jiayi Wang, Lizhen Shi, Ramana V Davuluri, Zhong Wang, Han Liu
Online Foundation Model Selection in Robotics
Po-han Li, Oyku Selin Toprak, Aditya Narayanan, Ufuk Topcu, Sandeep Chinchali
Efficient and Scalable Fine-Tune of Language Models for Genome Understanding
Huixin Zhan, Ying Nian Wu, Zijun Zhang
Towards a Foundation Model for Brain Age Prediction using coVariance Neural Networks
Saurabh Sihag, Gonzalo Mateos, Alejandro Ribeiro
Only the Curve Shape Matters: Training Foundation Models for Zero-Shot Multivariate Time Series Forecasting through Next Curve Shape Prediction
Cheng Feng, Long Huang, Denis Krompass
An Interactive Agent Foundation Model
Zane Durante, Bidipta Sarkar, Ran Gong, Rohan Taori, Yusuke Noda, Paul Tang, Ehsan Adeli, Shrinidhi Kowshika Lakshmikanth, Kevin Schulman, Arnold Milstein, Demetri Terzopoulos, Ade Famoti, Noboru Kuno, Ashley Llorens, Hoi Vo, Katsu Ikeuchi, Li Fei-Fei, Jianfeng Gao, Naoki Wake, Qiuyuan Huang
Real-World Robot Applications of Foundation Models: A Review
Kento Kawaharazuka, Tatsuya Matsushima, Andrew Gambardella, Jiaxian Guo, Chris Paxton, Andy Zeng
Buffer Overflow in Mixture of Experts
Jamie Hayes, Ilia Shumailov, Itay Yona