Feature Selection
Feature selection aims to identify the most relevant subset of features in a dataset, improving model performance, interpretability, and efficiency. Current research emphasizes novel algorithms based on neural networks (e.g., RelChaNet), genetic algorithms, and large language models (LLMs), often incorporating techniques such as causal inference and uncertainty quantification. These advances matter in applications such as medical diagnosis, financial prediction, and recommender systems, where reducing dimensionality and improving explainability are paramount. The field is also exploring new evaluation metrics and addressing challenges such as fairness and privacy in feature selection.
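As a concrete illustration of the wrapper-style, genetic-algorithm approach mentioned above, the sketch below evolves binary feature masks and scores each mask by the cross-validated accuracy of a model trained on the selected columns. This is a minimal sketch only: the synthetic data, the scikit-learn LogisticRegression used as the wrapped model, and the population size, mutation rate, and generation count are illustrative assumptions, not the method of any paper listed below.

```python
# Minimal sketch: genetic-algorithm wrapper feature selection.
# Assumptions (not from the papers below): synthetic data, logistic regression
# as the wrapped model, and illustrative GA hyperparameters.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=30, n_informative=8,
                           random_state=0)
n_features = X.shape[1]

def fitness(mask):
    """Cross-validated accuracy on the feature subset selected by `mask`."""
    if not mask.any():
        return 0.0
    model = LogisticRegression(max_iter=1000)
    return cross_val_score(model, X[:, mask], y, cv=3).mean()

# Population of random binary masks (True = keep the feature).
pop_size, n_generations, mutation_rate = 20, 15, 0.05
population = rng.random((pop_size, n_features)) < 0.5

for _ in range(n_generations):
    scores = np.array([fitness(ind) for ind in population])

    # Tournament selection: keep the better of two randomly drawn individuals.
    parent_idx = [max(rng.choice(pop_size, 2, replace=False),
                      key=lambda i: scores[i])
                  for _ in range(pop_size)]
    parents = population[parent_idx]

    # Single-point crossover between consecutive parents.
    children = parents.copy()
    for i in range(0, pop_size - 1, 2):
        point = rng.integers(1, n_features)
        children[i, point:], children[i + 1, point:] = (
            parents[i + 1, point:].copy(), parents[i, point:].copy())

    # Bit-flip mutation.
    flips = rng.random(children.shape) < mutation_rate
    population = np.logical_xor(children, flips)

best = population[np.argmax([fitness(ind) for ind in population])]
print("Selected features:", np.flatnonzero(best))
```

The same loop works with any estimator that exposes fit/predict (for example, an XGBoost classifier), and the fitness function can additionally penalise subset size to favour smaller, more interpretable feature sets.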
Papers
Modified Genetic Algorithm for Feature Selection and Hyper Parameter Optimization: Case of XGBoost in Spam Prediction
Nazeeh Ghatasheh, Ismail Altaharwa, Khaled Aldebei
Modeling the Telemarketing Process using Genetic Algorithms and Extreme Boosting: Feature Selection and Cost-Sensitive Analytical Approach
Nazeeh Ghatasheh, Ismail Altaharwa, Khaled Aldebei
Feature Selection and Hyperparameter Fine-tuning in Artificial Neural Networks for Wood Quality Classification
Mateus Roder, Leandro Aparecido Passos, João Paulo Papa, André Luis Debiaso Rossi
An Exploratory Study on Simulated Annealing for Feature Selection in Learning-to-Rank
Mohd. Sayemul Haque, Md. Fahim, Muhammad Ibrahim