Research Paper · Researchia:202602.26011 · [Data Science > Machine Learning]

Statistical Query Lower Bounds for Smoothed Agnostic Learning

Ilias Diakonikolas

Abstract

We study the complexity of smoothed agnostic learning, recently introduced by~\cite{CKKMS24}, in which the learner competes with the best classifier in a target class under slight Gaussian perturbations of the inputs. Specifically, we focus on the prototypical task of agnostically learning halfspaces under subgaussian distributions in the smoothed model. The best known upper bound for this problem relies on $L_1$-polynomial regression and has complexity $d^{\tilde{O}(1/\sigma^2)\log(1/\varepsilon)}$, where $\sigma$ is the smoothing parameter and $\varepsilon$ is the excess error. Our main result is a Statistical Query (SQ) lower bound providing formal evidence that this upper bound is close to best possible. In more detail, we show that (even for Gaussian marginals) any SQ algorithm for smoothed agnostic learning of halfspaces requires complexity $d^{\Omega(1/\sigma^{2}+\log(1/\varepsilon))}$. This is the first non-trivial lower bound on the complexity of this task and nearly matches the known upper bound. Roughly speaking, we show that applying $L_1$-polynomial regression to a smoothed version of the function is essentially best possible. Our techniques involve finding a moment-matching hard distribution by way of linear programming duality. This dual program corresponds exactly to finding a low-degree approximating polynomial to the smoothed version of the target function (which turns out to be the same condition required for the $L_1$-polynomial regression to work). Our explicit SQ lower bound then comes from proving lower bounds on this approximation degree for the class of halfspaces.
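To make the algorithmic side concrete, the $L_1$-polynomial regression approach referenced above fits a low-degree polynomial to the labels by minimizing empirical $L_1$ error (a linear program) and then outputs the sign of the fitted polynomial. Below is a minimal one-dimensional sketch of that idea; the dimension, degree, dataset, and noise rate are illustrative choices, not from the paper, and no smoothing step is modeled.

```python
import numpy as np
from scipy.optimize import linprog

def l1_poly_fit(X, y, degree):
    """Fit coefficients c minimizing sum_i |p_c(x_i) - y_i| via an LP,
    where p_c is a univariate polynomial of the given degree."""
    Phi = np.vander(X, degree + 1, increasing=True)  # monomial features
    n, k = Phi.shape
    # LP variables: [c (k coeffs), t (n slacks)]; objective: sum of slacks.
    cost = np.concatenate([np.zeros(k), np.ones(n)])
    # Encode |Phi c - y| <= t as two one-sided inequalities.
    A_ub = np.block([[Phi, -np.eye(n)], [-Phi, -np.eye(n)]])
    b_ub = np.concatenate([y, -y])
    res = linprog(cost, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] * k + [(0, None)] * n)
    return res.x[:k]

rng = np.random.default_rng(0)
X = rng.standard_normal(500)        # Gaussian marginal (illustrative)
y = np.sign(X - 0.3)                # target halfspace sign(x - 0.3)
flip = rng.random(500) < 0.05       # 5% flipped labels (agnostic noise)
y[flip] *= -1

c = l1_poly_fit(X, y, degree=5)
preds = np.sign(np.vander(X, 6, increasing=True) @ c)
err = np.mean(preds != y)           # small: noise rate plus approx. error
```

The lower bound in the paper says that, in the SQ model, no algorithm can do substantially better than this polynomial-regression template: the degree needed to approximate the smoothed halfspace governs both the upper and the lower bound.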


Source: arXiv:2602.21191v1 (http://arxiv.org/abs/2602.21191v1)
PDF: https://arxiv.org/pdf/2602.21191v1

Submission: 2/26/2026
Subjects: Machine Learning; Data Science

