Research Paper · Researchia: 202601.29191 · Statistics & ML > Statistics

On Forgetting and Stability of Score-Based Generative Models

Stanislas Strasman

Abstract

Understanding the stability and long-time behavior of generative models is a fundamental problem in modern machine learning. This paper provides quantitative bounds on the sampling error of score-based generative models by leveraging stability and forgetting properties of the Markov chain associated with the reverse-time dynamics. Under weak assumptions, we identify two structural properties that control the propagation of initialization and discretization errors of the backward process: a Lyapunov drift condition and a Doeblin-type minorization condition. A practical consequence is quantitative stability of the sampling procedure, as the reverse diffusion dynamics induces a contraction mechanism along the sampling trajectory. Our results clarify the role of stochastic dynamics in score-based models and provide a principled framework for analyzing the propagation of errors in such approaches.
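The contraction mechanism described in the abstract (the backward process "forgetting" its initialization) can be illustrated with a toy construction of our own, not taken from the paper: a one-dimensional Ornstein–Uhlenbeck forward process applied to a Gaussian data distribution, for which the score is available in closed form. Two reverse-time Euler–Maruyama chains are started from very different initializations but driven by the same Gaussian increments (a synchronous coupling); their gap shrinks along the sampling trajectory. All parameter choices below (data variance, horizon, step count) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

sigma0_sq = 0.25      # assumed data distribution: N(0, 0.25)
T, n_steps = 5.0, 500 # time horizon and number of Euler steps (assumptions)
dt = T / n_steps

def var_t(t):
    # marginal variance of the forward OU process dX = -(1/2) X dt + dW
    # started from N(0, sigma0_sq); it interpolates toward the N(0, 1) prior
    return sigma0_sq * np.exp(-t) + (1.0 - np.exp(-t))

def score(x, t):
    # exact score of the Gaussian marginal N(0, var_t(t))
    return -x / var_t(t)

# two backward chains from very different initializations,
# driven by the SAME noise (synchronous coupling)
y1, y2 = 5.0, -5.0
initial_gap = abs(y1 - y2)
for k in range(n_steps):
    t = T - k * dt                       # current reverse time
    z = rng.standard_normal()
    y1 += (0.5 * y1 + score(y1, t)) * dt + np.sqrt(dt) * z
    y2 += (0.5 * y2 + score(y2, t)) * dt + np.sqrt(dt) * z

final_gap = abs(y1 - y2)
print(initial_gap, final_gap)  # the gap contracts along the trajectory
```

Because the coupled difference obeys a deterministic linear recursion with a strictly negative drift coefficient here, the gap decays geometrically regardless of the noise realization; this is a toy analogue of the contraction that the paper's Lyapunov drift and minorization conditions guarantee in general.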


Source: arXiv:2601.21868v1 (http://arxiv.org/abs/2601.21868v1)
PDF: https://arxiv.org/pdf/2601.21868v1

Submission: 1/29/2026
Subjects: Statistics; Statistics & ML
