Paper ID: 2111.12798

Geometric Priors for Scientific Generative Models in Inertial Confinement Fusion

Ankita Shukla, Rushil Anirudh, Eugene Kur, Jayaraman J. Thiagarajan, Peer-Timo Bremer, Brian K. Spears, Tammy Ma, Pavan Turaga

In this paper, we develop a Wasserstein autoencoder (WAE) with a hyperspherical prior for multimodal data in the application of inertial confinement fusion. Unlike a typical hyperspherical generative model that requires computationally inefficient sampling from distributions such as the von Mises-Fisher, we instead sample from a standard normal distribution and apply a projection layer that maps the samples onto the hypersphere before they are passed to the generator. Finally, to determine the validity of the generated samples, we exploit a known relationship between the modalities in the dataset as a scientific constraint, and study different properties of the proposed model.
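The abstract describes replacing von Mises-Fisher sampling with a normal draw followed by a projection onto the hypersphere. The paper's exact projection layer is not spelled out here; the snippet below is a minimal sketch, assuming the projection is a simple L2 normalization onto the unit sphere, with the function name and dimensions chosen for illustration.

```python
import torch
import torch.nn.functional as F

def sample_hyperspherical(batch_size: int, latent_dim: int) -> torch.Tensor:
    """Draw latent codes from a standard normal and project them onto the
    unit hypersphere (L2 normalization), avoiding explicit von Mises-Fisher
    sampling. This is an illustrative sketch, not the paper's implementation."""
    z = torch.randn(batch_size, latent_dim)   # Gaussian samples in R^d
    return F.normalize(z, p=2, dim=-1)        # project onto S^{d-1}

# Example: 16 latent codes on the 7-sphere (latent_dim = 8)
codes = sample_hyperspherical(16, 8)
assert torch.allclose(codes.norm(dim=-1), torch.ones(16), atol=1e-6)
```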

Submitted: Nov 24, 2021