Training-Free Generative Modeling via Kernelized Stochastic Interpolants
Abstract
We develop a kernel method for generative modeling within the stochastic interpolant framework, replacing neural network training with the solution of linear systems. The drift of the generative SDE is given by a kernel expansion whose coefficients solve a linear system computable directly from the data, with a system size independent of the data dimension. Since the resulting drift estimates are inexact, the choice of diffusion coefficient affects sample quality; the optimal coefficient obtained from a Girsanov argument diverges at the endpoint of the time interval, but we show that this poses no difficulty and develop an integrator that handles the divergence seamlessly. The framework accommodates diverse feature maps -- scattering transforms, pretrained generative models, etc. -- enabling training-free generation and model combination. We demonstrate the approach on financial time series, turbulence, and image generation.
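The core idea of replacing drift training with a linear solve can be illustrated with a minimal sketch. The snippet below is an assumption-laden toy, not the paper's method: it uses a linear interpolant between base and data samples, a Gaussian RBF kernel, and kernel ridge regression of the interpolant velocity onto the interpolated points at one fixed time. The resulting drift estimate is a kernel expansion whose coefficients come from an N-by-N linear system, with N the number of samples, independent of the data dimension.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf(X, Y, h=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * h**2))

N, d = 200, 2
x0 = rng.standard_normal((N, d))        # base samples (standard Gaussian)
x1 = rng.standard_normal((N, d)) + 3.0  # "data" samples (shifted Gaussian)

t = 0.5
xt = (1 - t) * x0 + t * x1              # linear interpolant at time t (assumption)
v = x1 - x0                             # interpolant velocity to regress onto x_t

lam = 1e-1                              # ridge regularization (illustrative value)
K = rbf(xt, xt)                         # N x N Gram matrix: size independent of d
C = np.linalg.solve(K + lam * np.eye(N), v)  # one linear solve, no training loop

def drift(x):
    # b(t, x) approximated as a kernel expansion sum_n k(x, x_n) c_n.
    return rbf(np.atleast_2d(x), xt) @ C

# Near the midpoint of the transport, the estimated drift should point
# from the base distribution toward the shifted data distribution.
print(drift(np.array([1.5, 1.5])))
```

Repeating the solve on a grid of times t would give a full drift field; the point of the sketch is only that each slice costs one regularized linear system rather than a gradient-descent training run.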
Source: arXiv:2602.20070v1 - http://arxiv.org/abs/2602.20070v1 PDF: https://arxiv.org/pdf/2602.20070v1