Gradient Boosted Decision Tree
Gradient Boosted Decision Trees (GBDTs) are ensemble learning methods that combine many decision trees to achieve high predictive accuracy, particularly on tabular data. Current research focuses on improving GBDT performance through enhanced algorithms such as XGBoost, CatBoost, and LightGBM; on addressing challenges such as label noise; and on developing robust, efficient, and privacy-preserving variants for settings such as federated learning. GBDTs' strong performance on tabular data, coupled with their relatively low computational cost and interpretability, makes them a powerful tool across diverse fields, from medical diagnosis and market value prediction to uplift modeling and time series analysis.
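To make the boosting idea concrete, here is a minimal, self-contained sketch of gradient boosting for squared-error regression: each round fits a depth-1 tree ("stump") to the current residuals (the negative gradient of the squared loss) and adds it to the ensemble with a small learning rate. All function names and the toy data below are illustrative, not from any particular library; production systems like XGBoost or LightGBM add regularization, histogram-based splitting, and many other refinements on top of this core loop.

```python
# Minimal gradient-boosting sketch for squared-error regression (illustrative only).
# Each round fits a depth-1 "stump" to the residuals and adds it with a learning rate.

def fit_stump(xs, residuals):
    """Find the single threshold split on xs that best fits residuals (squared error)."""
    best = None
    for threshold in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= threshold]
        right = [r for x, r in zip(xs, residuals) if x > threshold]
        if not left or not right:
            continue  # skip degenerate splits with an empty side
        lmean = sum(left) / len(left)
        rmean = sum(right) / len(right)
        sse = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, threshold, lmean, rmean)
    _, t, lv, rv = best
    return lambda x: lv if x <= t else rv

def fit_gbdt(xs, ys, n_rounds=50, lr=0.1):
    base = sum(ys) / len(ys)  # initial prediction: the target mean
    stumps = []
    preds = [base] * len(xs)
    for _ in range(n_rounds):
        # Residuals are the negative gradient of the squared loss.
        residuals = [y - p for y, p in zip(ys, preds)]
        stump = fit_stump(xs, residuals)
        stumps.append(stump)
        preds = [p + lr * stump(x) for p, x in zip(preds, xs)]
    return lambda x: base + lr * sum(s(x) for s in stumps)

# Toy 1-D data: a step function the ensemble should recover.
xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]
model = fit_gbdt(xs, ys)
```

With each round shrinking the residual by a factor of (1 - lr), the ensemble's predictions converge toward the targets; after 50 rounds the fit on this toy data is close to the true step function.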