Adaptive Boosting
Adaptive boosting (AdaBoost) is an ensemble learning method that combines many weak classifiers into a single strong, accurate predictor by iteratively reweighting training samples according to how difficult they are to classify. Current research focuses on enhancing AdaBoost's performance and addressing its limitations, including developing successors in the broader boosting family such as the gradient-boosting libraries CatBoost and XGBoost, improving fairness in classification, and adapting boosting to dynamic data streams and large language model integration. These advances are significant for applications ranging from poverty prediction and healthcare diagnostics to text classification and improving the interpretability of complex models.
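To make the reweighting loop concrete, here is a minimal sketch of discrete AdaBoost (the classic Freund–Schapire scheme) using scikit-learn decision stumps as the weak learners. The helper names adaboost_fit and adaboost_predict are illustrative, not taken from any library or paper discussed here.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=50):
    """Discrete AdaBoost with decision stumps; labels y must be in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)              # start with uniform sample weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w) # weak learner sees current weights
        pred = stump.predict(X)
        err = np.sum(w * (pred != y)) / np.sum(w)   # weighted error rate
        if err >= 0.5:                   # no better than chance: stop early
            break
        alpha = 0.5 * np.log((1 - err) / (err + 1e-10))  # learner's vote weight
        w *= np.exp(-alpha * y * pred)   # up-weight misclassified samples
        w /= w.sum()                     # renormalize to a distribution
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(X, stumps, alphas):
    """Sign of the alpha-weighted vote of all stumps."""
    agg = sum(a * s.predict(X) for s, a in zip(stumps, alphas))
    return np.sign(agg)
```

Each round up-weights the samples the current stump misclassifies, so later stumps concentrate on the hard cases; the final prediction is the sign of the alpha-weighted vote, which is the "strong classifier" the summary above refers to.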