Research Paper · Researchia: 202603.04024 · Neuroscience

Scaling of learning time for high dimensional inputs

Carlos Stein Brito

Abstract

Representation learning from complex data typically involves models with a large number of parameters, which in turn require large numbers of training samples. In neural network models, model complexity grows with the number of inputs to each neuron, with a trade-off between model expressivity and learning time. A precise characterization of this trade-off would help explain the connectivity and learning times observed in artificial and biological networks. We present a theoretical analysis of how learning time depends on input dimensionality for a Hebbian learning model performing independent component analysis. Based on the geometry of high-dimensional spaces, we show that the learning dynamics reduce to a unidimensional problem, with learning times dependent only on initial conditions. For higher input dimensions, initial parameters have smaller learning gradients and larger learning times. We find that learning times scale supralinearly with input dimension, quickly becoming prohibitive for high-dimensional inputs. These results reveal a fundamental limitation for learning in high dimensions and help elucidate how the optimal design of neural networks depends on data complexity. Our approach outlines a new framework for analyzing learning dynamics and model complexity in neural network models.
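The effect described in the abstract can be illustrated with a minimal sketch, not the paper's exact model: a single neuron trained with a nonlinear Hebbian rule (cubic nonlinearity plus weight normalization) performs one-unit ICA on whitened super-Gaussian inputs, and the number of steps it needs to align with a source direction grows with the input dimension because random initial weights start less aligned in higher dimensions. All function names, parameter values, and the choice of Laplace sources below are assumptions for illustration.

```python
import numpy as np

def hebbian_ica_steps(d, n=10000, lr=1.0, tol=0.9, max_steps=5000, seed=0):
    """Steps for a one-unit nonlinear Hebbian rule to align with a source axis.

    Inputs are d independent unit-variance Laplace sources (already whitened,
    identity mixing), so the ICA solutions are the coordinate axes.
    """
    rng = np.random.default_rng(seed)
    # Unit-variance Laplace samples: Var = 2 * scale^2, so scale = 1/sqrt(2).
    X = rng.laplace(scale=1 / np.sqrt(2), size=(n, d))
    w = rng.standard_normal(d)
    w /= np.linalg.norm(w)
    for step in range(1, max_steps + 1):
        y = X @ w                       # neuron output for each sample
        grad = X.T @ (y ** 3) / n       # Hebbian update: input times g(output)
        w = w + lr * grad
        w /= np.linalg.norm(w)          # normalization keeps the rule stable
        if np.max(np.abs(w)) > tol:     # aligned with one independent component
            return step
    return max_steps
```

Comparing low- and high-dimensional inputs (e.g. `hebbian_ica_steps(4)` versus `hebbian_ica_steps(64)`) shows the higher-dimensional neuron taking more steps to converge, consistent with the smaller initial gradients the abstract describes; this sketch does not by itself establish the supralinear scaling, which is the paper's analytical result.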


Source: arXiv:2603.01184v1 - http://arxiv.org/abs/2603.01184v1
PDF: https://arxiv.org/pdf/2603.01184v1

Submission: 3/4/2026
Subjects: Neuroscience

