Research Paper. Researchia: 202604.28032

Theoretical guarantees for stochastic gradient sampling methods via Gaussian convolution inequalities

Daniel Paulin

Abstract

We derive first-order (in the stepsize) bounds on the bias in Wasserstein distances of the invariant measure of stochastic gradient kinetic Langevin dynamics under minimal assumptions on the stochastic gradient noise. These bounds sharpen existing non-asymptotic guarantees for stochastic-gradient MCMC methods and provide a quantitative resolution of a previously open problem on invariant-measure accuracy. The main technical ingredients are new Gaussian convolution inequalities controlling the Wasserstein-p distance between a Gaussian convolved with a mean-zero perturbation and the Gaussian itself. We anticipate that these inequalities will be of independent interest beyond the present application.

Submitted: April 28, 2026. Subjects: Mathematics.
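To make the setting concrete, the following is a minimal sketch of stochastic gradient kinetic (underdamped) Langevin dynamics on a simple target. The standard 1-D Gaussian target, the additive mean-zero gradient noise model, the `noise_scale` value, and the Euler-type discretization are all illustrative assumptions for this demo, not the specific integrator or noise model analyzed in the paper; the stepsize-dependent bias in the sampler's invariant measure is the quantity the paper's bounds control.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative target: standard 1-D Gaussian, so grad U(x) = x. The
# stochastic gradient adds mean-zero noise to mimic minibatching; the
# noise_scale value is an arbitrary choice for this sketch.
def stoch_grad(x, noise_scale=0.5):
    return x + noise_scale * rng.standard_normal()

def sg_kinetic_langevin(n_steps=200_000, h=0.05, gamma=1.0):
    """Euler-type step of kinetic (underdamped) Langevin dynamics,
        dx = v dt,  dv = (-grad U(x) - gamma v) dt + sqrt(2 gamma) dW,
    driven by stochastic gradients. A generic sketch, not the
    discretization studied in the paper."""
    x = v = 0.0
    xs = np.empty(n_steps)
    for k in range(n_steps):
        v += -h * stoch_grad(x) - h * gamma * v \
             + np.sqrt(2.0 * gamma * h) * rng.standard_normal()
        x += h * v
        xs[k] = x
    return xs[n_steps // 10:]  # discard burn-in

samples = sg_kinetic_langevin()
print(f"mean ~ {samples.mean():.3f}, var ~ {samples.var():.3f}")

# Crude empirical Wasserstein-1 proxy against an iid reference sample from
# the target: in 1-D, W1 between two empirical measures of equal size is
# the mean absolute difference of the sorted samples.
ref = np.sort(rng.standard_normal(samples.size))
w1 = np.mean(np.abs(np.sort(samples) - ref))
print(f"empirical W1 bias proxy ~ {w1:.3f}")
```

For the exact target (mean 0, variance 1), the sample moments drift from their true values by an amount that shrinks with the stepsize `h`; the first-order bounds in the paper make that dependence quantitative.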


Source: arXiv:2604.24632v1 (http://arxiv.org/abs/2604.24632v1)
PDF: https://arxiv.org/pdf/2604.24632v1

