Research Paper · Researchia:202602.25046 · [Data Science > Machine Learning]

Training-Free Generative Modeling via Kernelized Stochastic Interpolants

Florentin Coeurdoux

Abstract

We develop a kernel method for generative modeling within the stochastic interpolant framework, replacing neural network training with linear systems. The drift of the generative SDE is $\hat b_t(x) = \nabla\varphi(x)^\top \eta_t$, where $\eta_t \in \mathbb{R}^P$ solves a $P \times P$ system computable from data, with $P$ independent of the data dimension $d$. Since the estimates are inexact, the diffusion coefficient $D_t$ affects sample quality; the optimal $D_t^*$ from Girsanov diverges at $t = 0$, but this poses no difficulty, and we develop an integrator that handles it seamlessly. The framework accommodates diverse feature maps -- scattering transforms, pretrained generative models, etc. -- enabling training-free generation and model combination. We demonstrate the approach on financial time series, turbulence, and image generation.
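The abstract's core construction is a drift of the form $\hat b_t(x) = \nabla\varphi(x)^\top \eta_t$ with $\eta_t$ obtained from a $P \times P$ linear system. As a rough illustration only -- the paper's actual system is not given in the abstract -- the sketch below fits $\eta$ by least squares in feature space, using random Fourier features as a stand-in feature map $\varphi$ and synthetic drift targets; all names and choices here (`grad_phi`, the `tanh` targets, the ridge term) are illustrative assumptions, not the authors' method.

```python
import numpy as np

# Hypothetical sketch (not the paper's exact system): fit eta by solving the
# P x P normal equations of  min_eta  sum_i || grad_phi(x_i)^T eta - v_i ||^2.
rng = np.random.default_rng(0)
d, P, n = 2, 16, 200  # data dim, feature dim (independent of d), sample count

# Random Fourier feature map: phi_p(x) = cos(w_p . x + b_p).
W = rng.normal(size=(P, d))
b = rng.uniform(0, 2 * np.pi, size=P)

def grad_phi(x):
    # grad phi_p(x) = -sin(w_p . x + b_p) * w_p; returns shape (P, d).
    return -np.sin(W @ x + b)[:, None] * W

X = rng.normal(size=(n, d))            # sample points x_i
V = np.tanh(X)                         # stand-in drift targets v_i, shape (n, d)

# Assemble the P x P system  A eta = c.
A = np.zeros((P, P))
c = np.zeros(P)
for x, v in zip(X, V):
    G = grad_phi(x)                    # (P, d)
    A += G @ G.T                       # accumulate grad_phi grad_phi^T
    c += G @ v                         # accumulate grad_phi v_i
eta = np.linalg.solve(A + 1e-6 * np.eye(P), c)  # small ridge for stability

def drift(x):
    # b(x) = grad_phi(x)^T eta: a vector field on R^d.
    return grad_phi(x).T @ eta
```

Note that the system size is $P \times P$ regardless of $d$, which is the scaling property the abstract emphasizes.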


Source: arXiv:2602.20070v1 (http://arxiv.org/abs/2602.20070v1)
PDF: https://arxiv.org/pdf/2602.20070v1

Submission: 2/25/2026
Subjects: Machine Learning; Data Science
