Paper ID: 2403.08793
Neural Loss Function Evolution for Large-Scale Image Classifier Convolutional Neural Networks
Brandon Morgan, Dean Hougen
For classification, neural networks typically learn by minimizing cross-entropy, but are evaluated and compared using accuracy. This disparity suggests neural loss function search (NLFS), the search for a drop-in replacement for cross-entropy as the loss function of neural networks. We apply NLFS to image classifier convolutional neural networks. We propose a new search space for NLFS that encourages the exploration of more diverse loss functions, and a surrogate function that accurately transfers to large-scale convolutional neural networks. We search the space using regularized evolution, a mutation-only aging genetic algorithm. After evolution and a proposed loss function elimination protocol, we transferred the final loss functions across multiple architectures, datasets, and image augmentation techniques to assess generalization. In the end, we discovered three new loss functions, called NeuroLoss1, NeuroLoss2, and NeuroLoss3, that outperformed cross-entropy as simple drop-in replacements, achieving higher mean test accuracy across the majority of experiments.
Submitted: Jan 30, 2024
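
The abstract's search method, regularized evolution, is a mutation-only aging genetic algorithm. Below is a minimal sketch of how such a search loop could be organized for loss-function candidates; the `random_loss`, `mutate_loss`, and `evaluate_on_surrogate` callables are hypothetical stand-ins for the paper's search-space sampler, mutation operator, and surrogate evaluation, not the authors' actual implementation.

```python
import random
from collections import deque

def regularized_evolution(cycles, population_size, sample_size,
                          random_loss, mutate_loss, evaluate_on_surrogate):
    """Aging (regularized) evolution: mutation-only search with FIFO removal.

    The three callables are assumptions for illustration only:
      random_loss()            -> a random loss-function candidate from the search space
      mutate_loss(candidate)   -> a mutated copy of a candidate
      evaluate_on_surrogate(c) -> fitness of a candidate (e.g., proxy-model accuracy)
    """
    population = deque()   # FIFO queue: the oldest member is removed first (aging)
    history = []

    # Seed the population with randomly sampled loss-function candidates.
    while len(population) < population_size:
        candidate = random_loss()
        fitness = evaluate_on_surrogate(candidate)
        population.append((candidate, fitness))
        history.append((candidate, fitness))

    # Main loop: sample a tournament, mutate the winner, age out the oldest.
    for _ in range(cycles):
        tournament = random.sample(list(population), sample_size)
        parent, _ = max(tournament, key=lambda pair: pair[1])
        child = mutate_loss(parent)              # mutation is the only variation operator
        child_fitness = evaluate_on_surrogate(child)
        population.append((child, child_fitness))
        population.popleft()                     # remove the oldest individual
        history.append((child, child_fitness))

    # Return the best candidate ever evaluated during the search.
    return max(history, key=lambda pair: pair[1])
```

In this style of search, the surrogate evaluation would be the expensive step, which is why the abstract emphasizes a surrogate that transfers accurately to large-scale convolutional networks.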