Paper ID: 2411.05937
The effect of different feature selection methods on models created with XGBoost
Jorge Neyra, Vishal B. Siramshetty, Huthaifa I. Ashqar
This study examines the effect of different feature selection methods on models built with XGBoost, a popular machine learning algorithm with strong built-in regularization. It shows that three different methods for reducing the dimensionality of the feature set produce no statistically significant change in the model's prediction accuracy. This suggests that the traditional practice of removing noisy features to keep models from overfitting may not apply to XGBoost, although feature selection may still be worthwhile for reducing computational cost.
Submitted: Nov 8, 2024
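The abstract does not name the three feature selection methods used, so the following is only a minimal sketch of the kind of comparison described: train a boosted-tree model with and without dimensionality reduction and compare cross-validated accuracy. A variance filter and univariate selection are illustrative stand-ins for the paper's methods, and scikit-learn's GradientBoostingClassifier stands in for XGBoost so the snippet runs without the xgboost package (xgboost.XGBClassifier exposes the same fit/predict API and could be swapped in directly).

```python
# Illustrative sketch only: the selectors and dataset are assumptions,
# not the paper's actual setup.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.feature_selection import SelectKBest, VarianceThreshold, f_classif
from sklearn.model_selection import cross_val_score

# Synthetic data with some uninformative/redundant features.
X, y = make_classification(n_samples=500, n_features=40, n_informative=8,
                           n_redundant=4, random_state=0)

# Baseline (no selection) vs. two example dimensionality-reduction methods.
selectors = {
    "none": None,
    "variance_filter": VarianceThreshold(threshold=0.9),
    "select_k_best": SelectKBest(f_classif, k=15),
}

scores = {}
for name, sel in selectors.items():
    Xs = X if sel is None else sel.fit_transform(X, y)
    model = GradientBoostingClassifier(random_state=0)  # stand-in for XGBoost
    scores[name] = cross_val_score(model, Xs, y, cv=3).mean()

for name, s in scores.items():
    print(f"{name}: {s:.3f}")
```

Under the paper's finding, one would expect the three accuracy scores to be statistically indistinguishable, with the selected-feature variants simply training on fewer columns.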