Gradient Boosted Decision Tree

Gradient Boosted Decision Trees (GBDTs) are ensemble learning methods that combine multiple decision trees to achieve high predictive accuracy, particularly on tabular data. Trees are added sequentially, with each new tree fit to the negative gradient (pseudo-residuals) of a loss function evaluated at the ensemble's current predictions. Current research focuses on improving GBDT performance through enhanced algorithms (such as XGBoost, CatBoost, and LightGBM), on addressing challenges such as label noise, and on developing robust, efficient, and privacy-preserving variants for various applications, including federated learning. GBDTs' strong performance on tabular data, coupled with their relatively low computational cost and interpretability, makes them a powerful tool across diverse fields, from medical diagnosis and market value prediction to uplift modeling and time series analysis.
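
The basic workflow can be illustrated with a minimal sketch using scikit-learn's GradientBoostingClassifier on a synthetic tabular dataset; the dataset sizes and hyperparameters below are illustrative assumptions, not values drawn from any of the papers listed here.

    # Minimal GBDT sketch on synthetic tabular data (illustrative values only).
    from sklearn.datasets import make_classification
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    # Assumed synthetic tabular classification problem.
    X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0
    )

    # Each tree is fit to the negative gradient of the loss at the current
    # predictions; learning_rate shrinks each tree's contribution and
    # n_estimators sets how many trees are added.
    model = GradientBoostingClassifier(
        n_estimators=200, learning_rate=0.1, max_depth=3, random_state=0
    )
    model.fit(X_train, y_train)

    print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))

Libraries such as XGBoost, CatBoost, and LightGBM expose a similar fit/predict interface but add optimizations (histogram-based splitting, native categorical handling, GPU training) that matter at larger scales.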

Papers