Covariance Matrix
Covariance matrices, which encode the pairwise linear relationships between variables in a dataset, are central to numerous statistical and machine learning applications. Current research focuses on improving their estimation in high-dimensional settings, particularly addressing challenges such as computational efficiency, robustness to noise and outliers, and handling data with complex structure (e.g., non-Gaussian distributions, manifold-valued data). This involves developing novel algorithms, often leveraging Riemannian geometry and techniques like shrinkage, regularization, and representation learning, to enhance accuracy and scalability. Improved covariance estimation has significant implications for diverse fields, including signal processing, biomedical data analysis, and financial modeling.
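To make the high-dimensional challenge and the shrinkage remedy mentioned above concrete, here is a minimal sketch in NumPy. It assumes a simple linear shrinkage toward a scaled identity target with a hand-picked intensity `alpha`; real methods (e.g., Ledoit-Wolf) estimate the optimal intensity from the data, so this is illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

# High-dimensional regime: fewer samples (n) than variables (p),
# where the sample covariance matrix is singular and ill-conditioned.
n, p = 30, 50
X = rng.standard_normal((n, p))

# Sample covariance: rank-deficient when n < p (rank at most n - 1
# after mean subtraction), hence not invertible.
S = np.cov(X, rowvar=False, bias=True)

# Linear shrinkage toward a scaled identity target. `alpha` is a
# hypothetical fixed intensity chosen for illustration.
alpha = 0.3
mu = np.trace(S) / p          # average variance sets the target's scale
target = mu * np.eye(p)
S_shrunk = (1 - alpha) * S + alpha * target

# Shrinkage restores full rank and strictly positive eigenvalues,
# making the estimate invertible and better conditioned.
print(np.linalg.matrix_rank(S), np.linalg.matrix_rank(S_shrunk))
```

The same convex-combination idea underlies many regularized estimators; only the choice of target matrix and the rule for picking the intensity differ between methods.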
Papers
Combining Entropy and Matrix Nuclear Norm for Enhanced Evaluation of Language Models
James Vo
Automatic Classification of Sleep Stages from EEG Signals Using Riemannian Metrics and Transformer Networks
Mathieu Seraphim, Alexis Lechervy (GREYC), Florian Yger (MILES, LAMSADE, LITIS, App - LITIS), Luc Brun, Olivier Etard (COMETE, UNICAEN)