Paper ID: 2305.13472
A comprehensive theoretical framework for the optimization of neural networks classification performance with respect to weighted metrics
Francesco Marchetti, Sabrina Guastavino, Cristina Campi, Federico Benvenuto, Michele Piana
In many contexts, customized and weighted classification scores are designed to evaluate the quality of the predictions made by neural networks. However, there is a discrepancy between the maximization of such scores and the minimization of the loss function during training. In this paper, we provide a complete theoretical setting that formalizes weighted classification metrics and then allows the construction of losses that drive the model to optimize these metrics of interest. After a detailed theoretical analysis, we show that our framework includes as particular instances well-established approaches such as classical cost-sensitive learning, weighted cross-entropy loss functions, and value-weighted skill scores.
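To illustrate the kind of metric-aligned loss the abstract refers to, the sketch below shows a standard class-weighted cross-entropy in PyTorch. This is not the paper's framework; the helper name weighted_cross_entropy and the weight values are illustrative assumptions, meant only to show how per-class weights enter a training objective.

import torch
import torch.nn.functional as F

def weighted_cross_entropy(logits, targets, class_weights):
    # Standard class-weighted cross-entropy: each sample's log-loss is
    # scaled by the weight of its true class, so errors on rare or
    # costly classes contribute more to the training objective.
    return F.cross_entropy(logits, targets, weight=class_weights)

# Hypothetical usage: a binary problem where mistakes on class 1 are
# penalized five times more heavily than mistakes on class 0.
logits = torch.randn(8, 2)            # model outputs for 8 samples
targets = torch.randint(0, 2, (8,))   # ground-truth labels
w = torch.tensor([1.0, 5.0])          # illustrative class weights
loss = weighted_cross_entropy(logits, targets, w)
print(loss.item())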
Submitted: May 22, 2023