Low-Cost Ensemble
Low-cost ensemble methods aim to deliver the accuracy gains of traditional ensemble learning while drastically reducing the computational cost of training multiple independent neural networks. Current research focuses on deriving diverse ensemble members from a single parent network through techniques such as multi-branch transformations, network fission (adding multiple exits), subnetwork sampling and pruning, and subsequent fine-tuning. Compared to training several independent models, these approaches offer significant savings in training time and memory, and sometimes even accuracy improvements, making ensemble methods practical for resource-constrained applications and broader use.
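The subnetwork-sampling idea can be illustrated concretely. Below is a minimal sketch, assuming PyTorch: several ensemble members are derived from one trained parent by applying different random pruning masks, then their predictions are averaged. The toy architecture, the pruning rate, the member count, and the helper name sample_subnetwork are all illustrative assumptions, not taken from any specific paper; in practice each member would also be briefly fine-tuned after pruning.

import copy
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Toy parent network standing in for a trained model (hypothetical sizes).
parent = nn.Sequential(
    nn.Linear(32, 64), nn.ReLU(),
    nn.Linear(64, 10),
)

def sample_subnetwork(model: nn.Module, amount: float, seed: int) -> nn.Module:
    """Clone the parent and randomly prune a fraction of its weights,
    yielding one diverse ensemble member without training from scratch."""
    torch.manual_seed(seed)
    member = copy.deepcopy(model)
    for module in member.modules():
        if isinstance(module, nn.Linear):
            prune.random_unstructured(module, name="weight", amount=amount)
            prune.remove(module, "weight")  # bake the mask into the weights
    return member

# Three cheap members sampled from the same parent; different seeds give
# different masks and hence the diversity the ensemble relies on.
members = [sample_subnetwork(parent, amount=0.3, seed=s) for s in range(3)]

# Ensemble prediction: average the members' softmax outputs.
x = torch.randn(8, 32)
with torch.no_grad():
    probs = torch.stack([m(x).softmax(dim=-1) for m in members]).mean(dim=0)
print(probs.shape)  # torch.Size([8, 10])

The averaging step is the standard ensemble combination rule; the cost saving comes entirely from the fact that every member shares the parent's training, with only the masking (and optionally a short fine-tune) done per member.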