The Theory and Practice of Highly Scalable Gaussian Process Regression with Nearest Neighbours
Abstract
Gaussian process (GP) regression is a widely used non-parametric modeling tool, but its cubic complexity in the number of training points limits its use on massive data sets. A practical remedy is to predict using only the nearest neighbours of each test point, as in Nearest Neighbour Gaussian Process (NNGP) regression for geospatial problems and the related scalable GPnn method for more general machine-learning applications. Despite their strong empirical performance, the large-sample theory of GPnn remains incomplete. We develop a theoretical framework for NNGP and GPnn regression. Under mild regularity assumptions, we derive almost sure pointwise limits for three key predictive criteria: the mean squared error (MSE), the calibration coefficient (CC), and the negative log-likelihood (NLL). We then study the MSE-risk, prove universal consistency, and show that the risk attains Stone's minimax rate n^{-2β/(2β+d)}, where the smoothness β and the input dimension d capture the regularity of the regression problem. We also prove uniform convergence of the NLL over compact hyper-parameter sets and show that its derivatives with respect to the lengthscale, kernel scale, and noise variance vanish asymptotically, with explicit rates. This explains the observed robustness of GPnn to hyper-parameter tuning. These results provide a rigorous statistical foundation for GPnn as a highly scalable and principled alternative to full GP models.
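As an illustration of the nearest-neighbour prediction scheme summarised above, the following sketch conditions an exact GP on only the k nearest training points of each test input. This is a minimal NumPy illustration, not the paper's implementation: the squared-exponential kernel, the hyper-parameter values, and the brute-force neighbour search are assumptions made for the example.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, scale=1.0):
    # Squared-exponential kernel matrix between the rows of A and B
    # (kernel choice is an assumption for this illustration).
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return scale * np.exp(-0.5 * d2 / lengthscale**2)

def gpnn_predict(X, y, x_star, k=3, noise=1e-2):
    """Predictive mean and variance at x_star from an exact GP
    conditioned only on the k nearest training points."""
    # Brute-force k-nearest-neighbour search (a KD-tree would be
    # used at scale; this keeps the sketch self-contained).
    idx = np.argsort(((X - x_star) ** 2).sum(-1))[:k]
    Xk, yk = X[idx], y[idx]
    # Standard O(k^3) GP conditioning on the neighbour subset only.
    K = rbf_kernel(Xk, Xk) + noise * np.eye(k)
    k_star = rbf_kernel(x_star[None, :], Xk)[0]
    alpha = np.linalg.solve(K, yk)
    mean = k_star @ alpha
    var = (rbf_kernel(x_star[None, :], x_star[None, :])[0, 0] + noise
           - k_star @ np.linalg.solve(K, k_star))
    return mean, var
```

Because each test point touches only k neighbours, the per-prediction cost is O(k^3) rather than O(n^3), which is what makes the approach scale to massive training sets.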
Source: arXiv:2604.07267v1 (abs: http://arxiv.org/abs/2604.07267v1, PDF: https://arxiv.org/pdf/2604.07267v1)