Abstract
We consider the infinite-width limit of a fully connected deep neural network with general weights, and we prove quantitative bounds on the 2-Wasserstein distance between the network and its infinite-width Gaussian limit, under appropriate regularity assumptions on the activation function. Our main tool is a Lindeberg principle for deep neural networks, which we use to replace the weights in each layer, one layer at a time, by Gaussian random variables.
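For orientation, the classical Lindeberg swapping argument (stated here for a sum of independent scalar random variables, not the layerwise matrix version the paper develops) can be sketched as follows: given independent variables $X_1,\dots,X_n$ and independent Gaussians $G_1,\dots,G_n$ with matching first and second moments, one compares the two ends of a chain of hybrid sums,

```latex
\[
S_k \;=\; \sum_{i \le k} X_i \;+\; \sum_{i > k} G_i,
\qquad
\bigl|\mathbb{E}\,f(S_n) - \mathbb{E}\,f(S_0)\bigr|
\;\le\; \sum_{k=1}^{n} \bigl|\mathbb{E}\,f(S_k) - \mathbb{E}\,f(S_{k-1})\bigr|,
\]
```

where each telescoping term is controlled by a third-order Taylor expansion of the smooth test function $f$, since the first two moments of $X_k$ and $G_k$ agree. The paper's contribution adapts this one-swap-at-a-time scheme to replace entire layers of network weights.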



