Kernel-Based Entropic Novelty
Kernel-based entropic novelty quantifies and detects novel patterns or data points in a dataset relative to a known reference distribution. Current research applies the concept across diverse settings, using methods such as adversarial autoencoders, kernel density estimation, and information-theoretic measures to assess novelty in data modalities ranging from scholarly publications to generative model outputs. These advances improve the ability of machine learning models to identify unexpected events or to generate genuinely novel outputs, with implications for scientific discovery, AI safety, and robust system design.
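To make the idea concrete, the sketch below shows one common pattern that combines the ingredients named above (kernel density estimation plus an information-theoretic score): fit a Gaussian KDE on a reference set and score candidate points by their surprisal, i.e. negative log-density under that estimate. This is a minimal illustration rather than any specific paper's method; the bandwidth, threshold, and data are assumptions chosen for the example.

```python
# Minimal sketch of kernel-based entropic novelty scoring:
# fit a kernel density estimate (KDE) on reference data, then score new
# points by their surprisal -log p(x). High surprisal means low probability
# under the reference distribution, i.e. high novelty. Bandwidth and the
# decision threshold are illustrative assumptions, not prescribed values.
import numpy as np
from sklearn.neighbors import KernelDensity

rng = np.random.default_rng(0)

# Reference data: samples from the "known" distribution.
reference = rng.normal(loc=0.0, scale=1.0, size=(500, 2))

# Candidate points: one in-distribution, one far from the reference.
candidates = np.array([[0.1, -0.2],   # typical point
                       [4.0, 4.0]])   # novel point

# Fit a Gaussian KDE on the reference set.
kde = KernelDensity(kernel="gaussian", bandwidth=0.5).fit(reference)

# score_samples returns log p(x); the surprisal -log p(x) is the entropic
# novelty score (its expectation under the reference is the entropy).
surprisal = -kde.score_samples(candidates)

# Illustrative cutoff: mean reference surprisal plus a fixed margin.
threshold = -kde.score_samples(reference).mean() + 3.0
for x, s in zip(candidates, surprisal):
    label = "NOVEL" if s > threshold else "typical"
    print(f"point {x}: surprisal={s:.2f} -> {label}")
```

In practice, the same surprisal-based scoring can be applied to learned embeddings (e.g., of documents or generated samples) rather than raw features, which is how such measures are typically used for scholarly publications or generative model outputs.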
Papers
September 25, 2024
August 25, 2024
April 6, 2024
February 27, 2024
February 5, 2024
January 8, 2024
December 9, 2023
February 28, 2023