Probabilistic Generative Model
Probabilistic generative models aim to learn the underlying probability distribution of data in order to generate new, similar samples. Current research emphasizes improving the efficiency and accuracy of these models, focusing on architectures such as variational autoencoders, diffusion models, and generative flow networks, as well as advanced inference techniques such as variational inference and Markov chain Monte Carlo. Applications span natural language processing, image synthesis, drug discovery, and social science modeling, making these models powerful tools for both data analysis and data generation. Key challenges remain, including distribution mismatch between datasets and the efficient handling of high-dimensional data.
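As a concrete illustration of one architecture named above, the sketch below implements a minimal variational autoencoder in PyTorch, trained by maximizing the evidence lower bound (ELBO) via the reparameterization trick. All layer sizes, names, and the toy data are illustrative assumptions, not drawn from the papers listed below.

```python
# Minimal VAE sketch (illustrative dimensions, toy data).
import torch
import torch.nn as nn
import torch.nn.functional as F

class VAE(nn.Module):
    def __init__(self, x_dim=784, z_dim=16, h_dim=256):
        super().__init__()
        self.enc = nn.Linear(x_dim, h_dim)
        self.mu = nn.Linear(h_dim, z_dim)
        self.logvar = nn.Linear(h_dim, z_dim)
        self.dec = nn.Sequential(
            nn.Linear(z_dim, h_dim), nn.ReLU(), nn.Linear(h_dim, x_dim)
        )

    def forward(self, x):
        h = F.relu(self.enc(x))
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterization trick: differentiable sample z ~ q(z|x).
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        return self.dec(z), mu, logvar

def neg_elbo(x_logits, x, mu, logvar):
    # Negative ELBO = reconstruction error + KL(q(z|x) || p(z)).
    recon = F.binary_cross_entropy_with_logits(x_logits, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kl

# Toy usage with random "data"; real training would loop over a dataset.
x = torch.rand(32, 784)
model = VAE()
x_logits, mu, logvar = model(x)
loss = neg_elbo(x_logits, x, mu, logvar)
loss.backward()
```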
Papers
Symbol Emergence as Inter-personal Categorization with Head-to-head Latent Word
Kazuma Furukawa, Akira Taniguchi, Yoshinobu Hagiwara, Tadahiro Taniguchi
Emergent Communication through Metropolis-Hastings Naming Game with Deep Generative Models
Tadahiro Taniguchi, Yuto Yoshida, Akira Taniguchi, Yoshinobu Hagiwara
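The second paper above builds its naming game on Metropolis-Hastings acceptance. As background only, the sketch below shows a generic Metropolis-Hastings sampler over a toy one-dimensional target; the density, proposal scale, and sample count are illustrative assumptions, not the inter-agent model from the paper.

```python
# Generic Metropolis-Hastings sampler (toy target, illustrative settings).
import math
import random

def target_log_density(x):
    # Unnormalized log-density of a toy 1-D mixture of two Gaussians.
    p = math.exp(-0.5 * (x - 2.0) ** 2) + math.exp(-0.5 * (x + 2.0) ** 2)
    return math.log(p)

def metropolis_hastings(log_p, x0=0.0, n_samples=10000, step=1.0):
    samples, x = [], x0
    for _ in range(n_samples):
        # Symmetric Gaussian random-walk proposal.
        x_new = x + random.gauss(0.0, step)
        # Accept with probability min(1, p(x_new) / p(x)).
        log_alpha = min(0.0, log_p(x_new) - log_p(x))
        if random.random() < math.exp(log_alpha):
            x = x_new
        samples.append(x)
    return samples

samples = metropolis_hastings(target_log_density)
print(f"posterior mean estimate: {sum(samples) / len(samples):.3f}")
```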