A Note on Non-Negative $L_1$-Approximating Polynomials
Abstract
$L_1$-approximating polynomials, i.e., polynomials that approximate indicator functions in $L_1$-norm under certain distributions, are widely used in computational learning theory. We study the existence of \textit{non-negative} $L_1$-approximating polynomials with respect to Gaussian distributions. This is a stronger requirement than $L_1$-approximation but weaker than sandwiching polynomials (which themselves have many applications). These non-negative approximating polynomials have recently found uses in smoothed learning from positive-only examples. In this short note, we prove that every class of sets with Gaussian surface area (GSA) at most $\Gamma$ under the standard Gaussian admits degree-$O(\Gamma^2/\varepsilon^4)$ non-negative polynomials that $\varepsilon$-approximate its indicator functions in $L_1$-norm, for every $\varepsilon > 0$. Equivalently, finite GSA implies $L_1$-approximation with the stronger pointwise guarantee that the approximating polynomial has range contained in $[0, \infty)$. Up to a constant factor, this matches the best currently known Gaussian $L_1$-approximation degree bound without the non-negativity constraint.
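As an illustrative sketch only (not the note's construction), one elementary way to obtain a pointwise non-negative $L_1$-approximator under the Gaussian is to square an $L_2$-approximator: if $\|q - 1_A\|_2 \le \delta$, then by Cauchy-Schwarz $\mathbb{E}|q^2 - 1_A| = \mathbb{E}|q - 1_A|\,|q + 1_A| \le \delta \, \|q + 1_A\|_2$, and $q^2 \ge 0$ everywhere. The snippet below demonstrates this in one dimension for the halfspace $\{x \ge 0\}$; the degree, sample size, and error estimates are arbitrary choices for the demo, not quantities from the note.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(200_000)   # samples from the standard Gaussian
f = (x >= 0).astype(float)         # indicator of the halfspace {x >= 0}

# Degree-4 least-squares fit on Gaussian samples: an empirical
# L2(Gaussian) approximation of the indicator (may go negative).
coeffs = np.polynomial.polynomial.polyfit(x, f, deg=4)
q = np.polynomial.polynomial.polyval(x, coeffs)

# Squaring q gives a degree-8 polynomial that is non-negative pointwise,
# at the cost of a constant-factor loss in the L1 error.
p = q ** 2

l1_q = np.mean(np.abs(q - f))      # Monte Carlo L1 error of q
l1_p = np.mean(np.abs(p - f))      # Monte Carlo L1 error of p = q^2
print(f"L1 error of q: {l1_q:.3f}, of p = q^2: {l1_p:.3f}, min of p: {p.min():.3f}")
```

Note that $q^2$ still approximates the indicator because $1_A^2 = 1_A$; the squaring trick trades degree (doubled) and approximation error (a constant factor) for the pointwise non-negativity guarantee.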
Source: arXiv:2605.08072v1 - http://arxiv.org/abs/2605.08072v1 (PDF: https://arxiv.org/pdf/2605.08072v1)
May 11, 2026
Data Science
Statistics