Research Paper · Researchia: 202602.25046

Training-Free Generative Modeling via Kernelized Stochastic Interpolants

Florentin Coeurdoux

Abstract

We develop a kernel method for generative modeling within the stochastic interpolant framework, replacing neural network training with the solution of linear systems. The drift of the generative SDE is $\hat b_t(x) = \nabla\varphi(x)^\top \eta_t$, where $\eta_t \in \mathbb{R}^P$ solves a $P \times P$ system computable from data, with $P$ independent of the data dimension $d$. Since the estimates are inexact, the diffusion coefficient $D_t$ affects sample quality; the optimal $D_t^*$ from Girsanov's theorem diverges at $t = 0$, but this poses no difficulty, and we develop an integrator that handles it seamlessly. The framework accommodates diverse feature maps, such as scattering transforms and pretrained generative models, enabling training-free generation and model combination. We demonstrate the approach on financial time series, turbulence, and image generation.
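One natural reading of the $P \times P$ system (our interpretation of the abstract; the paper's exact construction may differ): take $\eta_t$ as the least-squares fit of the interpolant velocity $\dot x_t$ within the span of the feature gradients, so that the normal equations involve only feature-space moments estimable from data:

```latex
\hat\eta_t
= \arg\min_{\eta \in \mathbb{R}^P}
  \mathbb{E}\,\bigl\|\nabla\varphi(x_t)^\top \eta - \dot x_t\bigr\|^2
\quad\Longrightarrow\quad
\underbrace{\mathbb{E}\!\left[\nabla\varphi(x_t)\,\nabla\varphi(x_t)^\top\right]}_{P \times P}\,\eta_t
= \mathbb{E}\!\left[\nabla\varphi(x_t)\,\dot x_t\right],
```

with the expectations replaced by empirical averages over paired base and data samples. Note that the system size depends only on the number of features $P$, not on $d$, consistent with the abstract's claim.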

Submitted: February 25, 2026
Subjects: Machine Learning; Data Science

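The ingredients described above can be sketched in a few dozen lines. This is a minimal illustration, not the paper's implementation: we assume a random Fourier feature map standing in for the feature choices the paper mentions, assume the $P \times P$ system is the least-squares normal equations for the interpolant velocity, and use plain deterministic probability-flow sampling ($D_t = 0$) rather than the paper's SDE integrator for the diverging optimal $D_t^*$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical random Fourier feature map phi: R^d -> R^P (an assumption;
# scattering transforms or pretrained models would slot in here instead).
d, P = 2, 64
W = rng.normal(size=(P, d))
bias = rng.uniform(0.0, 2.0 * np.pi, size=P)

def grad_phi(x):
    """Jacobian of phi(x) = cos(W x + b) for a batch x of shape (n, d); returns (n, P, d)."""
    return -np.sin(x @ W.T + bias)[:, :, None] * W[None, :, :]

def solve_eta(x0, x1, t, ridge=1e-6):
    """Fit eta_t by solving the P x P normal equations built from data."""
    xt = (1.0 - t) * x0 + t * x1        # linear interpolant x_t
    dxt = x1 - x0                       # its time derivative
    J = grad_phi(xt)
    A = np.einsum("npd,nqd->pq", J, J) / len(xt)        # E[grad phi grad phi^T]
    rhs = np.einsum("npd,nd->p", J, dxt) / len(xt)      # E[grad phi xdot_t]
    return np.linalg.solve(A + ridge * np.eye(P), rhs)  # ridge keeps A well-conditioned

# Toy target: a mixture of two Gaussian blobs; base distribution is standard normal.
n = 1000
x1 = np.concatenate([rng.normal(-2.0, 0.3, (n // 2, d)),
                     rng.normal(+2.0, 0.3, (n // 2, d))])
x0 = rng.normal(size=(n, d))

# Deterministic probability-flow sampling (D_t = 0) with forward Euler.
x = rng.normal(size=(500, d))
n_steps = 25
dt = 1.0 / n_steps
for k in range(n_steps):
    eta = solve_eta(x0, x1, k * dt)                         # "training" = one linear solve
    x = x + dt * np.einsum("npd,p->nd", grad_phi(x), eta)   # b_t(x) = grad phi(x)^T eta_t
```

Note the "training-free" flavor: each time step costs one $P \times P$ solve over empirical feature moments, with no gradient descent and no dependence of the system size on $d$.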


Source: arXiv:2602.20070v1 (http://arxiv.org/abs/2602.20070v1)
PDF: https://arxiv.org/pdf/2602.20070v1


Submission Info
Date: Feb 25, 2026
Topic: Data Science
Area: Machine Learning