Theoretical guarantees for stochastic gradient sampling methods via Gaussian convolution inequalities
Abstract
We derive first-order (in the stepsize) bounds on the bias, in Wasserstein distances, of the invariant measure of stochastic gradient kinetic Langevin dynamics under minimal assumptions on the stochastic gradient noise. These bounds sharpen existing non-asymptotic guarantees for stochastic-gradient MCMC methods and provide a quantitative resolution of a previously open problem on invariant-measure accuracy. The main technical ingredients are new Gaussian convolution inequalities controlling the Wasserstein distance between a Gaussian convolved with a mean-zero perturbation and the Gaussian itself. We anticipate that these inequalities will be of independent interest beyond the present application.
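The sampler analysed in the abstract, kinetic (underdamped) Langevin dynamics driven by noisy gradient estimates, can be sketched numerically. The sketch below is illustrative only: the target N(0,1), the step size, the friction parameter, and the Gaussian gradient-noise model are assumptions for the example, not the paper's setting.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_grad(x):
    # Stochastic gradient of U(x) = x^2 / 2 (target N(0, 1)):
    # the exact gradient x plus mean-zero noise, mimicking minibatching.
    return x + 0.5 * rng.standard_normal()

h, gamma = 0.05, 1.0  # step size and friction (illustrative choices)
x, v = 0.0, 0.0
samples = []
for k in range(200_000):
    # One Euler-Maruyama step of kinetic Langevin dynamics:
    # dx = v dt,  dv = -grad U(x) dt - gamma v dt + sqrt(2 gamma) dW
    x += h * v
    v += -h * noisy_grad(x) - h * gamma * v \
         + np.sqrt(2.0 * gamma * h) * rng.standard_normal()
    if k > 20_000:  # discard burn-in
        samples.append(x)

samples = np.asarray(samples)
# The empirical mean and variance approach 0 and 1, up to a bias that
# the paper bounds at first order in the step size h.
print(samples.mean(), samples.var())
```

The point of the paper's result, in this toy picture, is that the gap between the empirical invariant measure produced by such a loop and the true target scales like O(h), even with noisy gradients.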
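The flavour of the Gaussian convolution inequalities can be seen in one dimension, where everything is explicit: convolving N(0, 1) with independent mean-zero Gaussian noise of standard deviation sigma yields N(0, 1 + sigma^2), and for centred one-dimensional Gaussians the 2-Wasserstein distance is the difference of standard deviations. The following check is an illustration of this special case, not the paper's general inequality.

```python
import numpy as np

sigma = 0.1  # size of the mean-zero perturbation (illustrative)

# N(0, 1) convolved with independent N(0, sigma^2) noise is N(0, 1 + sigma^2).
# For centred 1-D Gaussians, W2(N(0, a^2), N(0, b^2)) = |a - b|.
w2 = np.sqrt(1.0 + sigma**2) - 1.0

# Taylor expansion: sqrt(1 + s^2) - 1 = s^2/2 + O(s^4), so the distance is
# second order in the perturbation size, much smaller than sigma itself.
print(w2)              # ~0.004987
print(sigma**2 / 2.0)  # 0.005
```

This quadratic smallness is what makes such inequalities useful: a mean-zero perturbation of a Gaussian moves it far less in Wasserstein distance than the raw size of the perturbation would suggest.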
Source: arXiv:2604.24632v1 - http://arxiv.org/abs/2604.24632v1 (PDF: https://arxiv.org/pdf/2604.24632v1)
Apr 28, 2026
Mathematics