High Dimension
High-dimensionality, the challenge of analyzing data with very many features, is a pervasive problem across scientific fields, driving research into efficient algorithms and models for data representation and analysis. Current efforts focus on methods that mitigate the "curse of dimensionality," including advances in neural networks (e.g., sparse networks, diffusion models, and implicit neural networks), dimensionality reduction techniques (e.g., PCA, tensor train decomposition), and robust optimization algorithms (e.g., stochastic gradient descent variants). Overcoming these challenges is crucial for progress in diverse areas, from machine learning and data analysis to scientific computing and causal inference, enabling more accurate and efficient processing of complex, high-dimensional data.
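As a minimal sketch of one of the dimensionality-reduction techniques mentioned above, the following example (an assumption for illustration, not drawn from any of the listed papers) implements PCA via SVD of the centered data matrix and projects 50-dimensional points that lie near a 2-D subspace onto their top two principal components:

```python
import numpy as np

def pca(X, k):
    """Project rows of X onto the top-k principal components."""
    X_centered = X - X.mean(axis=0)  # center each feature
    # SVD of the centered data: rows of Vt are principal directions
    U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
    return X_centered @ Vt[:k].T     # coordinates in the top-k subspace

rng = np.random.default_rng(0)
# 200 points in 50 dimensions that lie near a 2-D subspace plus small noise
latent = rng.normal(size=(200, 2))
basis = rng.normal(size=(2, 50))
X = latent @ basis + 0.01 * rng.normal(size=(200, 50))

Z = pca(X, k=2)
print(Z.shape)  # (200, 2)
```

Here nearly all of the variance is captured by two components, which is the regime where such reductions sidestep the curse of dimensionality.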
Papers
To Clip or not to Clip: the Dynamics of SGD with Gradient Clipping in High-Dimensions
Noah Marshall, Ke Liang Xiao, Atish Agarwala, Elliot Paquette
Score-fPINN: Fractional Score-Based Physics-Informed Neural Networks for High-Dimensional Fokker-Planck-Lévy Equations
Zheyuan Hu, Zhongqiang Zhang, George Em Karniadakis, Kenji Kawaguchi