Paper ID: 2410.10216

Neural Quasiprobabilistic Likelihood Ratio Estimation with Negatively Weighted Data

Matthew Drnevich, Stephen Jiggins, Judith Katzy, Kyle Cranmer

Motivated by real-world situations found in high energy particle physics, we consider a generalisation of the likelihood-ratio estimation task to a quasiprobabilistic setting where probability densities can be negative. By extension, this framing also applies to importance sampling in a setting where the importance weights can be negative. The presence of negative densities and negative weights poses an array of challenges to traditional neural likelihood ratio estimation methods. We address these challenges by introducing a novel loss function. In addition, we introduce a new model architecture based on the decomposition of a likelihood ratio using signed mixture models, providing a second strategy for overcoming these challenges. Finally, we demonstrate our approach on a pedagogical example and a real-world example from particle physics.

Submitted: Oct 14, 2024