Efficient Hybrid Approaches
Hybrid approaches across scientific fields aim to combine the strengths of different methods, leveraging their complementary advantages to overcome individual limitations. Current research focuses on integrating deep learning models with classical techniques (e.g., physics-based models, hidden Markov models), exploring novel architectures such as hybrid transformers, and employing ensemble methods to improve robustness and accuracy. These hybrid strategies are proving valuable across diverse applications, from accelerating large language model training and enhancing medical image analysis to improving autonomous robot navigation and enabling more efficient and accurate predictions in complex systems.
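One common hybrid pattern mentioned above is combining a physics-based model with a learned component. A minimal sketch of this idea, under illustrative assumptions (a toy linear "physics" baseline and a polynomial fit standing in for a neural network), is residual learning: the data-driven part corrects only what the physics model misses.

```python
import numpy as np

# Hypothetical sketch of a hybrid physics-ML predictor: a simplified
# physics-based baseline is corrected by a data-driven residual model.
# The toy "physics" model and the polynomial residual learner are
# illustrative stand-ins, not any specific system from the papers below.

rng = np.random.default_rng(0)

def physics_model(x):
    # Toy first-principles baseline: captures the dominant linear trend
    # but misses a smaller nonlinear effect present in the data.
    return 2.0 * x

# Synthetic observations: linear trend plus a quadratic term that the
# physics baseline does not represent.
x = rng.uniform(-1.0, 1.0, size=200)
y = 2.0 * x + 0.5 * x**2 + rng.normal(0.0, 0.01, size=200)

# Learn the residual (y - physics) with a simple polynomial fit,
# standing in for a neural network in real hybrid systems.
residual = y - physics_model(x)
coeffs = np.polyfit(x, residual, deg=2)

def hybrid_model(x):
    # Final prediction = physics baseline + learned correction.
    return physics_model(x) + np.polyval(coeffs, x)

# The hybrid correction should reduce error versus physics alone.
err_physics = np.mean((y - physics_model(x)) ** 2)
err_hybrid = np.mean((y - hybrid_model(x)) ** 2)
print(f"physics MSE: {err_physics:.4f}, hybrid MSE: {err_hybrid:.4f}")
```

Because the physics baseline constrains the overall shape of the prediction, the learned component has a much simpler target, which is one reason hybrid schemes can be both more data-efficient and more robust than purely data-driven models.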
Papers
Causal Discovery from Time Series with Hybrids of Constraint-Based and Noise-Based Algorithms
Daria Bystrova, Charles K. Assaad, Julyan Arbel, Emilie Devijver, Eric Gaussier, Wilfried Thuiller
ClimSim: A large multi-scale dataset for hybrid physics-ML climate emulation
Sungduk Yu, Walter Hannah, Liran Peng, Jerry Lin, Mohamed Aziz Bhouri, Ritwik Gupta, Björn Lütjens, Justus Christopher Will, Gunnar Behrens, Julius Busecke, Nora Loose, Charles I Stern, Tom Beucler, Bryce Harrop, Benjamin R Hillman, Andrea Jenney, Savannah Ferretti, Nana Liu, Anima Anandkumar, Noah D Brenowitz, Veronika Eyring, Nicholas Geneva, Pierre Gentine, Stephan Mandt, Jaideep Pathak, Akshay Subramaniam, Carl Vondrick, Rose Yu, Laure Zanna, Tian Zheng, Ryan Abernathey, Fiaz Ahmed, David C Bader, Pierre Baldi, Elizabeth Barnes, Christopher Bretherton, Peter Caldwell, Wayne Chuang, Yilun Han, Yu Huang, Fernando Iglesias-Suarez, Sanket Jantre, Karthik Kashinath, Marat Khairoutdinov, Thorsten Kurth, Nicholas Lutsko, Po-Lun Ma, Griffin Mooers, J. David Neelin, David Randall, Sara Shamekh, Mark A Taylor, Nathan Urban, Janni Yuval, Guang Zhang, Michael Pritchard