
Characterization of Gaussian Universality Breakdown in High-Dimensional Empirical Risk Minimization

Chiheb Yaakoubi

Abstract

We study high-dimensional convex empirical risk minimization (ERM) under general non-Gaussian data designs. By heuristically extending the Convex Gaussian Min-Max Theorem (CGMT) to non-Gaussian settings, we derive an asymptotic min-max characterization of key statistics, enabling approximation of the mean $\mu_{\hat\theta}$ and covariance $C_{\hat\theta}$ of the ERM estimator $\hat\theta$. Specifically, under a concentration assumption on the data matrix and standard regularity conditions on the loss and regularizer, we show that for a test covariate $x$ independent of the training data, the projection $\hat\theta^\top x$ approximately follows the convolution of the (generally non-Gaussian) distribution of $\mu_{\hat\theta}^\top x$ with an independent centered Gaussian variable of variance $\mathrm{Tr}(C_{\hat\theta}\,\mathbb{E}[xx^\top])$. This result clarifies the scope and limits of Gaussian universality for ERM. Additionally, we prove that any $\mathcal{C}^2$ regularizer is asymptotically equivalent to a quadratic form determined solely by its Hessian at zero and gradient at $\mu_{\hat\theta}$. Numerical simulations across diverse losses and models validate our theoretical predictions and qualitative insights.
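The projection law described in the abstract can be checked numerically in the simplest setting where it is expected to hold exactly: ridge regression with an isotropic Gaussian design, so that $\mathbb{E}[xx^\top] = I$ and the predicted variance reduces to $\|\mu_{\hat\theta}\|^2 + \mathrm{Tr}(C_{\hat\theta})$. The sketch below is illustrative only; all dimensions, the regularization strength, and the noise level are arbitrary choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, lam, sigma = 200, 50, 1.0, 0.5          # illustrative sizes, not from the paper
theta_star = rng.standard_normal(d) / np.sqrt(d)

def ridge_erm(rng):
    """One draw of the ridge ERM estimator on a fresh Gaussian training set."""
    X = rng.standard_normal((n, d))
    y = X @ theta_star + sigma * rng.standard_normal(n)
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Monte Carlo over independent training sets: collect theta_hat and its
# projection onto an independent test covariate x ~ N(0, I_d).
M = 2000
thetas = np.array([ridge_erm(rng) for _ in range(M)])
x_test = rng.standard_normal((M, d))
proj = np.einsum("md,md->m", thetas, x_test)

mu = thetas.mean(axis=0)             # empirical estimate of mu_{theta_hat}
C = np.cov(thetas, rowvar=False)     # empirical estimate of C_{theta_hat}

# Predicted law: mu^T x convolved with N(0, Tr(C E[x x^T])) = N(0, Tr(C)),
# since E[x x^T] = I here; its total variance is ||mu||^2 + Tr(C).
pred_var = mu @ mu + np.trace(C)
print(proj.var(), pred_var)
```

The same Monte Carlo scaffold extends to non-quadratic losses or non-Gaussian designs by swapping out `ridge_erm`, which is where the abstract's universality (and its breakdown) would become visible.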


Source: arXiv:2604.03146v1 (http://arxiv.org/abs/2604.03146v1)
PDF: https://arxiv.org/pdf/2604.03146v1

Submission: 4/6/2026
Subjects: Statistics; Data Science
