Data Divergence

Data divergence, the degree of dissimilarity between datasets, is a crucial area of research affecting diverse fields, from system health monitoring to generative model evaluation and federated learning. Current work focuses on developing robust and efficient methods for quantifying divergence, employing techniques such as the Wasserstein distance and architectures such as variational autoencoders to analyze latent data representations and improve the interpretability of divergence metrics. These advances improve predictive accuracy across applications, enabling more effective condition-based maintenance, strengthening the privacy of data analysis, and yielding more reliable and explainable generative models.
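
As a concrete illustration of the kind of metric involved, the sketch below estimates the Wasserstein distance between two datasets: exactly in one dimension via SciPy, and approximately in higher dimensions via random projections (a sliced-Wasserstein approximation). The datasets, dimensions, and the degradation scenario in the comments are hypothetical examples, not drawn from any specific paper listed below.

```python
import numpy as np
from scipy.stats import wasserstein_distance


def sliced_wasserstein(x, y, n_projections=100, seed=0):
    """Approximate the Wasserstein-1 distance between two multivariate
    samples by averaging 1-D Wasserstein distances over random projections."""
    rng = np.random.default_rng(seed)
    dim = x.shape[1]
    total = 0.0
    for _ in range(n_projections):
        # Draw a random unit direction and project both datasets onto it.
        direction = rng.normal(size=dim)
        direction /= np.linalg.norm(direction)
        total += wasserstein_distance(x @ direction, y @ direction)
    return total / n_projections


# Hypothetical example: compare a reference dataset against a shifted one,
# e.g. sensor readings from a healthy machine vs. after degradation.
rng = np.random.default_rng(42)
reference = rng.normal(loc=0.0, scale=1.0, size=(1000, 8))
similar = rng.normal(loc=0.0, scale=1.0, size=(1000, 8))
shifted = rng.normal(loc=0.5, scale=1.2, size=(1000, 8))

print("sliced W1(reference, similar):", sliced_wasserstein(reference, similar))
print("sliced W1(reference, shifted):", sliced_wasserstein(reference, shifted))
```

A larger distance between the reference and a new batch of data signals drift; the same pattern underlies many of the monitoring and evaluation applications mentioned above, though published methods may compute divergence in a learned latent space (e.g. from a variational autoencoder) rather than on raw features.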

Papers