Universality in Deep Neural Networks: An approach via the Lindeberg exchange principle
Abstract
We consider the infinite-width limit of a fully connected deep neural network with general weights, and we prove quantitative general bounds on the $2$-Wasserstein distance between the network and its infinite-width Gaussian limit, under appropriate regularity assumptions on the activation function. Our main tool is a Lindeberg principle for Deep Neural Networks, which we use to successively replace the weights on each layer by Gaussian random variables.
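A minimal numerical sketch of the Lindeberg exchange idea the abstract describes: starting from a deep fully connected network with non-Gaussian weights, swap the weights to Gaussians one layer at a time and track how far the output distribution moves at each step. The choices below (Rademacher weights as the "general" law, tanh activation, width 64, depth 4, a scalar read-out, and an empirical 1-Wasserstein estimate) are illustrative assumptions, not the paper's construction, which proves bounds in the 2-Wasserstein distance.

```python
# Sketch of the layer-by-layer Lindeberg exchange, under the assumptions above.
import numpy as np

rng = np.random.default_rng(0)

def forward(x, weights, activation=np.tanh):
    """Fully connected net: h <- activation(W h / sqrt(fan_in)) on each hidden layer."""
    h = x
    for W in weights[:-1]:
        h = activation(W @ h / np.sqrt(h.shape[0]))
    # linear scalar read-out on the last layer
    return (weights[-1] @ h / np.sqrt(h.shape[0]))[0]

def sample_outputs(x, layer_laws, shapes, n_samples=1000):
    """Monte Carlo draws of the network output; layer l uses the law layer_laws[l]."""
    return np.array([
        forward(x, [law(s) for law, s in zip(layer_laws, shapes)])
        for _ in range(n_samples)
    ])

def w1(a, b):
    """Empirical 1-Wasserstein distance between two equal-size 1-D samples."""
    return np.mean(np.abs(np.sort(a) - np.sort(b)))

n, depth = 64, 4
shapes = [(n, n)] * (depth - 1) + [(1, n)]   # hidden layers plus scalar read-out
x = rng.standard_normal(n)

rademacher = lambda s: 2.0 * rng.integers(0, 2, size=s) - 1.0  # mean 0, variance 1
gaussian = rng.standard_normal                                  # mean 0, variance 1

# Lindeberg exchange: replace the weights by Gaussians one layer at a time
# and measure how much the output distribution shifts at each step.
prev = sample_outputs(x, [rademacher] * len(shapes), shapes)
for k in range(1, len(shapes) + 1):
    laws = [gaussian] * k + [rademacher] * (len(shapes) - k)
    cur = sample_outputs(x, laws, shapes)
    print(f"layers 1..{k} Gaussian: W1 shift ~ {w1(prev, cur):.4f}")
    prev = cur
```

Because `prev` and `cur` are independent Monte Carlo samples, the printed shifts include sampling noise; at these sizes they mainly illustrate that each single-layer swap perturbs the output distribution only slightly, which is the telescoping mechanism behind the quantitative bounds in the abstract.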
Source: arXiv:2605.02771v1 - http://arxiv.org/abs/2605.02771v1 | PDF: https://arxiv.org/pdf/2605.02771v1
May 5, 2026
Data Science
Machine Learning