High Dimension
High dimensionality, the challenge of analyzing data with a very large number of features, is a pervasive problem across scientific fields and drives research into efficient algorithms and models for data representation and analysis. Current efforts focus on methods that mitigate the "curse of dimensionality," including advances in neural networks (e.g., sparse networks, diffusion models, and implicit neural networks), dimensionality reduction techniques (e.g., PCA, tensor train decomposition), and robust optimization algorithms (e.g., stochastic gradient descent variants). Overcoming these challenges is crucial for progress in areas ranging from machine learning and data analysis to scientific computing and causal inference, enabling more accurate and efficient processing of complex, high-dimensional data.
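To make the dimensionality-reduction theme above concrete, here is a minimal PCA sketch in Python, assuming only NumPy; the function name `pca_project` and the data sizes are illustrative and not taken from any of the listed papers.

```python
import numpy as np

def pca_project(X, k):
    """Reduce X (n_samples, n_features) to k dimensions via PCA."""
    # Center the data so principal directions pass through the mean.
    X_centered = X - X.mean(axis=0)
    # SVD of the centered data; rows of Vt are the principal directions.
    _, _, Vt = np.linalg.svd(X_centered, full_matrices=False)
    # Project onto the top-k directions.
    return X_centered @ Vt[:k].T

# Example: compress 1000 points in 500 dimensions down to 10.
X = np.random.randn(1000, 500)
X_reduced = pca_project(X, k=10)
print(X_reduced.shape)  # (1000, 10)
```

This is only a sketch of classical linear dimensionality reduction; the papers below study more specialized tools (e.g., Hutchinson trace estimation, robust aggregation) for high-dimensional settings.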
Papers
Hutchinson Trace Estimation for High-Dimensional and High-Order Physics-Informed Neural Networks
Zheyuan Hu, Zekun Shi, George Em Karniadakis, Kenji Kawaguchi
Attacking Byzantine Robust Aggregation in High Dimensions
Sarthak Choudhary, Aashish Kolluri, Prateek Saxena
How to Overcome Curse-of-Dimensionality for Out-of-Distribution Detection?
Soumya Suvra Ghosal, Yiyou Sun, Yixuan Li