Unbounded Variance

Unbounded variance, a condition in which the underlying distribution lacks a finite second moment, poses significant challenges for statistical modeling and machine learning: sample variances never stabilize, and the usual Gaussian-style concentration guarantees no longer apply. Current research focuses on developing robust algorithms and estimators for this regime, particularly within Bayesian neural networks (employing α-stable processes and conditionally Gaussian representations) and stochastic optimization methods (such as Stochastic Gradient Descent and Mirror Descent). Addressing unbounded variance is crucial for improving the reliability and applicability of these methods in domains such as online control, federated learning, and regression with heavy-tailed data, where traditional moment assumptions fail.
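
For intuition, here is a minimal NumPy sketch (a hypothetical illustration, not taken from any particular paper) of the failure mode on the stochastic-optimization side: the gradient noise is symmetrized Pareto with tail index α = 1.5, so it has a finite mean but infinite variance, and clipping each stochastic gradient, a standard remedy in this literature, keeps individual SGD updates bounded against rare, enormous noise draws.

```python
import numpy as np

rng = np.random.default_rng(0)
ALPHA = 1.5  # tail index < 2: finite mean, infinite variance

def heavy_tailed_noise():
    # Symmetrized Pareto (Lomax) draw: zero mean, but E[noise^2] = infinity.
    sign = rng.choice([-1.0, 1.0])
    return sign * rng.pareto(ALPHA)

def noisy_grad(theta, target):
    # Stochastic gradient of the quadratic loss 0.5 * (theta - target)^2,
    # corrupted by additive noise with unbounded variance.
    return (theta - target) + heavy_tailed_noise()

def sgd(target, steps=20_000, lr=1e-2, clip=None):
    theta = 0.0
    for _ in range(steps):
        g = noisy_grad(theta, target)
        if clip is not None:
            g = float(np.clip(g, -clip, clip))  # bound each update
        theta -= lr * g
    return theta

target = 3.0
print("plain SGD:  ", sgd(target))            # a single huge draw can drag the iterate far off
print("clipped SGD:", sgd(target, clip=5.0))  # updates stay bounded; iterates hover near 3.0
```

Because the noise is symmetric, clipping shrinks but does not flip the expected descent direction; the threshold (5.0 here) is an illustrative choice that would be tuned or annealed in practice.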

Papers