Diverse Ensemble
Diverse ensemble methods combine predictions from multiple models to improve accuracy, robustness, and uncertainty quantification in various machine learning tasks. Current research focuses on developing efficient ensemble techniques, including those based on AdaBoost, weight averaging, and contrastive learning, and applying them to diverse areas such as natural language processing, image classification, and time series forecasting. This approach is proving valuable for addressing challenges like limited training data, adversarial attacks, and concept drift, leading to improved performance and reliability in real-world applications.
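To make the prediction-combination idea concrete, here is a minimal sketch (not drawn from any of the papers listed below) of a diverse soft-voting ensemble built from heterogeneous scikit-learn models; the specific base learners (AdaBoost, random forest, logistic regression) and the synthetic dataset are illustrative assumptions, not a method from the source.

```python
# Sketch: prediction-level ensembling by averaging class probabilities
# from base models of different families (the source of "diversity" here).
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic data stands in for a real task (text, image, or time-series features).
X, y = make_classification(n_samples=2000, n_features=20, n_informative=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Diversity comes from mixing model families, not just reseeding one learner.
base_models = [
    ("ada", AdaBoostClassifier(n_estimators=100, random_state=0)),
    ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
    ("lr", LogisticRegression(max_iter=1000)),
]

# Soft voting averages the per-class probabilities of all base models.
ensemble = VotingClassifier(estimators=base_models, voting="soft")
ensemble.fit(X_train, y_train)
print("ensemble accuracy:", accuracy_score(y_test, ensemble.predict(X_test)))

# Individual base-model accuracies, for comparison against the ensemble.
for name, model in base_models:
    model.fit(X_train, y_train)
    print(f"{name} accuracy:", accuracy_score(y_test, model.predict(X_test)))
```

In practice the same averaging idea extends to weight averaging (combining model parameters rather than predictions) and to learned combination weights, but probability averaging over diverse base learners is the simplest starting point.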
Papers
Blending Ensemble for Classification with Genetic-algorithm generated Alpha factors and Sentiments (GAS)
Quechen Yang
Graph-DPEP: Decomposed Plug and Ensemble Play for Few-Shot Document Relation Extraction with Graph-of-Thoughts Reasoning
Tao Zhang, Ning Yan, Masood Mortazavi, Hoang H. Nguyen, Zhongfen Deng, Philip S. Yu