High-Dimensional Data Analysis
High-dimensional data analysis focuses on extracting meaningful information and building predictive models from datasets in which the number of variables is large and often exceeds the number of observations. Current research emphasizes computationally efficient algorithms, such as stochastic gradient descent and its variants, along with model architectures such as graph neural networks and other deep learning approaches tailored to the challenges posed by high dimensionality, including sparsity and missing data. These advances are crucial for addressing complex problems across diverse fields, including scientific modeling, robotics, and financial risk assessment, where high-dimensional data are increasingly prevalent.
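As a concrete illustration of the sparsity-aware stochastic methods mentioned above, the following is a minimal sketch of proximal stochastic gradient descent for a lasso-style sparse linear regression in the regime where variables outnumber observations. The function names, step size, and regularization strength are illustrative assumptions and are not taken from any of the papers listed below.

```python
import numpy as np

def prox_l1(w, threshold):
    """Soft-thresholding operator: the proximal map of the L1 penalty."""
    return np.sign(w) * np.maximum(np.abs(w) - threshold, 0.0)

def sgd_lasso(X, y, lam=0.1, lr=0.01, epochs=50, seed=0):
    """Proximal SGD for min_w (1/2)(x_i^T w - y_i)^2 + lam * ||w||_1, one sample at a time."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    w = np.zeros(p)
    for _ in range(epochs):
        for i in rng.permutation(n):
            grad = (X[i] @ w - y[i]) * X[i]       # gradient of the squared loss on one sample
            w = prox_l1(w - lr * grad, lr * lam)  # gradient step followed by soft-thresholding
    return w

if __name__ == "__main__":
    # Synthetic p >> n example: 50 observations, 500 variables, 5 of them truly active.
    rng = np.random.default_rng(1)
    n, p = 50, 500
    X = rng.standard_normal((n, p))
    w_true = np.zeros(p)
    w_true[:5] = rng.standard_normal(5)
    y = X @ w_true + 0.1 * rng.standard_normal(n)
    w_hat = sgd_lasso(X, y)
    print("nonzero coefficients recovered:", int(np.sum(np.abs(w_hat) > 1e-3)))
```

The per-sample updates keep each iteration cheap even when the number of variables is large, and the soft-thresholding step drives most coefficients exactly to zero, which is one standard way such methods cope with sparsity in high dimensions.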
Papers
Optimal SQ Lower Bounds for Robustly Learning Discrete Product Distributions and Ising Models
Ilias Diakonikolas, Daniel M. Kane, Yuxin Sun
Receding Horizon Inverse Reinforcement Learning
Yiqing Xu, Wei Gao, David Hsu
On Hypothesis Transfer Learning of Functional Linear Models
Haotian Lin, Matthew Reimherr
Critic Sequential Monte Carlo
Vasileios Lioutas, Jonathan Wilder Lavington, Justice Sefas, Matthew Niedoba, Yunpeng Liu, Berend Zwartsenberg, Setareh Dabiri, Frank Wood, Adam Scibior
Reinforcement Learning with a Terminator
Guy Tennenholtz, Nadav Merlis, Lior Shani, Shie Mannor, Uri Shalit, Gal Chechik, Assaf Hallak, Gal Dalal
Hilbert Curve Projection Distance for Distribution Comparison
Tao Li, Cheng Meng, Hongteng Xu, Jun Yu
A Continuous Time Framework for Discrete Denoising Models
Andrew Campbell, Joe Benton, Valentin De Bortoli, Tom Rainforth, George Deligiannidis, Arnaud Doucet
Precise Learning Curves and Higher-Order Scaling Limits for Dot Product Kernel Regression
Lechao Xiao, Hong Hu, Theodor Misiakiewicz, Yue M. Lu, Jeffrey Pennington