Research Paper · Researchia:202601.29176 · Optimization > Mathematics

Manifold constrained steepest descent

Kaiwei Yang

Abstract

Norm-constrained linear minimization oracle (LMO)-based optimizers such as spectral gradient descent and Muon are attractive in large-scale learning, but extending them to manifold-constrained problems is nontrivial and often leads to nested-loop schemes that solve tangent-space subproblems iteratively. We propose \emph{Manifold Constrained Steepest Descent} (MCSD), a single-loop framework for optimization over manifolds that selects a norm-induced steepest-descent direction via an LMO applied to the Riemannian gradient, and then returns to the manifold via projection. Under standard smoothness assumptions, we establish convergence guarantees for MCSD and a stochastic momentum variant. We further introduce \emph{SPEL}, the spectral-norm specialization of MCSD on the Stiefel manifold, which admits scalable implementations via fast matrix sign computations. Experiments on PCA, orthogonality-constrained CNNs, and manifold-constrained LLM adapter tuning demonstrate improved stability and competitive performance relative to standard Riemannian baselines and existing manifold-aware LMO methods.
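To make the single-loop structure concrete, here is a minimal sketch of the spectral-norm specialization on the Stiefel manifold described in the abstract. It is an illustration under standard assumptions, not the authors' implementation: the function names (`msign`, `spel_step`) are ours, the SVD-based polar factor stands in for the "fast matrix sign computations" mentioned, and the polar map is used both as the spectral-norm LMO direction and as the retraction back to the manifold.

```python
import numpy as np

def msign(M):
    # Polar factor ("matrix sign" of the rectangular matrix M):
    # for M = U S V^T (reduced SVD), return U V^T.
    U, _, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ Vt

def spel_step(W, egrad, lr):
    """One step of the sketched spectral-norm LMO method on the
    Stiefel manifold {W in R^{n x k} : W^T W = I}."""
    # Riemannian gradient: project the Euclidean gradient onto the
    # tangent space at W, i.e. G - W sym(W^T G).
    sym = 0.5 * (W.T @ egrad + egrad.T @ W)
    rgrad = egrad - W @ sym
    # Spectral-norm LMO: over the unit spectral-norm ball,
    # argmin_D <rgrad, D> = -msign(rgrad).
    D = -msign(rgrad)
    # Single loop: take the step, then return to the manifold
    # via a polar retraction (no inner tangent-space solver).
    return msign(W + lr * D)
```

As a usage example in the spirit of the paper's PCA experiment, minimizing f(W) = -tr(W^T A W) for symmetric positive semidefinite A (Euclidean gradient -2AW) drives W toward a basis of the top-k eigenspace while each iterate stays exactly orthonormal.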


Source: arXiv:2601.21487v1 (http://arxiv.org/abs/2601.21487v1)
PDF: https://arxiv.org/pdf/2601.21487v1

Submission: 1/29/2026
Subjects: Mathematics; Optimization

