Research Paper · Researchia: 202604.14030

Sharp description of local minima in the loss landscape of high-dimensional two-layer ReLU neural networks

Jie Huang

Abstract

We study the population loss landscape of two-layer ReLU networks of the form $\sum_{k=1}^K \mathrm{ReLU}(w_k^\top x)$ in a realisable teacher-student setting with Gaussian covariates. We show that local minima admit an exact low-dimensional representation in terms of summary statistics, yielding a sharp and interpretable characterisation of the landscape. We further establish a direct link with one-pass SGD: local minima correspond to attractive fixed points of the dynamics in summary statistics space. This perspective reveals a hierarchical structure of minima: they are typically isolated in the well-specified regime, but become connected by flat directions as network width increases. In this overparameterised regime, global minima become increasingly accessible, attracting the dynamics and reducing convergence to spurious solutions. Overall, our results reveal intrinsic limitations of common simplifying assumptions, which may miss essential features of the loss landscape even in minimal neural network models.

Submitted: April 14, 2026 · Subjects: Statistics; Data Science

Description / Details

We study the population loss landscape of two-layer ReLU networks of the form $\sum_{k=1}^K \mathrm{ReLU}(w_k^\top x)$ in a realisable teacher-student setting with Gaussian covariates. We show that local minima admit an exact low-dimensional representation in terms of summary statistics, yielding a sharp and interpretable characterisation of the landscape. We further establish a direct link with one-pass SGD: local minima correspond to attractive fixed points of the dynamics in summary statistics space. This perspective reveals a hierarchical structure of minima: they are typically isolated in the well-specified regime, but become connected by flat directions as network width increases. In this overparameterised regime, global minima become increasingly accessible, attracting the dynamics and reducing convergence to spurious solutions. Overall, our results reveal intrinsic limitations of common simplifying assumptions, which may miss essential features of the loss landscape even in minimal neural network models.
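To make the setting concrete, the following is a minimal sketch (not the paper's method) of the realisable teacher-student setup with Gaussian covariates: a student network $\sum_{k=1}^K \mathrm{ReLU}(w_k^\top x)$ is trained by one-pass SGD on labels generated by a fixed teacher of the same form, and the run is summarised by the overlap matrices $Q = W W^\top$ and $M = W W_*^\top$ that play the role of the paper's summary statistics. All dimensions, widths, the learning rate, and the number of steps are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not from the paper):
# input dimension d, teacher width K*, slightly overparameterised student width K.
d, K_teacher, K_student = 50, 2, 3
Wt = rng.standard_normal((K_teacher, d)) / np.sqrt(d)  # fixed teacher weights W*
W = rng.standard_normal((K_student, d)) / np.sqrt(d)   # student weights W

def net(W, X):
    # Network output sum_k ReLU(w_k^T x) for each row x of X.
    return np.maximum(X @ W.T, 0.0).sum(axis=1)

def mc_loss(W, Wt, n=5000):
    # Monte Carlo estimate of the population loss E[ (f_W(x) - f_W*(x))^2 / 2 ]
    # over Gaussian covariates x ~ N(0, I_d).
    X = rng.standard_normal((n, d))
    return 0.5 * np.mean((net(W, X) - net(Wt, X)) ** 2)

init_loss = mc_loss(W, Wt)

lr = 0.05
for _ in range(20000):
    x = rng.standard_normal(d)                   # fresh sample: one-pass SGD
    err = net(W, x[None, :])[0] - net(Wt, x[None, :])[0]  # realisable labels
    # Gradient of 0.5*err^2 w.r.t. W: err * 1[w_k^T x > 0] * x for each unit k.
    grad = err * (W @ x > 0).astype(float)[:, None] * x[None, :]
    W -= (lr / d) * grad

final_loss = mc_loss(W, Wt)

# Summary statistics: student-student and student-teacher overlaps.
Q = W @ W.T    # (K, K)
M = W @ Wt.T   # (K, K*)
```

Under this sketch, the trajectory of $(Q, M)$ is a low-dimensional description of the high-dimensional dynamics, and a local minimum of the population loss would appear as an attractive fixed point in $(Q, M)$ space.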


Source: arXiv:2604.09412v1 - http://arxiv.org/abs/2604.09412v1
PDF: https://arxiv.org/pdf/2604.09412v1

