Latent Sequence
Latent sequence analysis focuses on extracting meaningful hidden patterns from sequential data, with the goal of understanding underlying processes or generating new sequences with desired properties. Current research relies heavily on recurrent neural networks, transformers, and variational autoencoders (VAEs), often combined with diffusion models, to learn and model these latent representations, with applications ranging from cognitive modeling and data-privacy analysis to music information retrieval and 3D motion generation. The field matters because uncovering hidden structure in complex data improves our understanding of diverse phenomena and enables more capable AI systems.
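To make the idea of learning a latent representation of a sequence concrete, below is a minimal sketch of a sequence VAE, assuming PyTorch: a GRU encoder compresses a sequence into a latent variable z, and a GRU decoder conditioned on z reconstructs it. The class and function names (SeqVAE, elbo_loss), hyperparameters, and training loop are illustrative assumptions, not the method of any paper listed below.

```python
# Minimal sequence-VAE sketch (illustrative only; names and hyperparameters
# are assumptions, not taken from any specific paper).
import torch
import torch.nn as nn

class SeqVAE(nn.Module):
    """Encode a sequence into a latent vector z and decode it back."""
    def __init__(self, input_dim=8, hidden_dim=64, latent_dim=16):
        super().__init__()
        self.encoder = nn.GRU(input_dim, hidden_dim, batch_first=True)
        self.to_mu = nn.Linear(hidden_dim, latent_dim)      # q(z|x) mean
        self.to_logvar = nn.Linear(hidden_dim, latent_dim)  # q(z|x) log-variance
        self.latent_to_hidden = nn.Linear(latent_dim, hidden_dim)
        self.decoder = nn.GRU(input_dim, hidden_dim, batch_first=True)
        self.readout = nn.Linear(hidden_dim, input_dim)

    def forward(self, x):
        # Encode: summarize the whole sequence in the final GRU hidden state.
        _, h = self.encoder(x)                  # h: (1, batch, hidden_dim)
        mu, logvar = self.to_mu(h[-1]), self.to_logvar(h[-1])
        # Reparameterization trick: z = mu + sigma * eps.
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        # Decode: condition the decoder's initial state on z (teacher forcing).
        h0 = torch.tanh(self.latent_to_hidden(z)).unsqueeze(0)
        out, _ = self.decoder(x, h0)
        return self.readout(out), mu, logvar

def elbo_loss(x, x_hat, mu, logvar):
    # Reconstruction term plus KL divergence to a standard-normal prior.
    recon = nn.functional.mse_loss(x_hat, x, reduction="mean")
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kl

# Usage: one gradient step on a batch of 32 sequences of length 20.
model = SeqVAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(32, 20, 8)
opt.zero_grad()
x_hat, mu, logvar = model(x)
loss = elbo_loss(x, x_hat, mu, logvar)
loss.backward()
opt.step()
```

The same encode-to-z / decode-from-z pattern generalizes to the transformer and diffusion-based variants mentioned above: only the encoder/decoder backbones change, while the latent variable remains the object of interest.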
Papers
Latent Variable Sequence Identification for Cognitive Models with Neural Bayes Estimation
Ti-Fen Pan, Jing-Jing Li, Bill Thompson, Anne Collins
Uncovering Latent Memories: Assessing Data Leakage and Memorization Patterns in Frontier AI Models
Sunny Duan, Mikail Khona, Abhiram Iyer, Rylan Schaeffer, Ila R Fiete