go-$m$HC: Direct Parameterization of Manifold-Constrained Hyper-Connections via Generalized Orthostochastic Matrices
Abstract
Doubly stochastic matrices enable learned mixing across residual streams, but parameterizing the set of doubly stochastic matrices (the Birkhoff polytope) exactly and efficiently remains an open challenge. Existing exact methods scale factorially with the number of streams, while Kronecker-factorized approaches are efficient but limited in expressivity. We introduce a novel exact parameterization grounded in the theory of generalized orthostochastic matrices, which avoids this factorial cost and exposes a single hyperparameter that continuously interpolates between a computationally efficient boundary and the fully expressive Birkhoff polytope. Building on Manifold-Constrained Hyper-Connections ($m$HC), a framework for learned dynamic layer connectivity, we instantiate this parameterization in go-$m$HC. Our method composes naturally with Kronecker-factorized methods, substantially recovering their lost expressivity at similar FLOP cost. Spectral analysis indicates that go-$m$HC fills the Birkhoff polytope far more completely than Kronecker-factorized baselines. On synthetic stream-mixing tasks, go-$m$HC achieves the minimum theoretical loss while converging faster. We validate our approach in a 30M-parameter GPT-style language model. The expressivity, efficiency, and exactness of go-$m$HC offer a practical avenue for scaling the residual-stream count as a new dimension of model capacity.
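As background for the construction the abstract names, the classical orthostochastic map takes any orthogonal matrix $Q$ to a doubly stochastic matrix by squaring its entries elementwise. A minimal NumPy sketch of that classical map (not the paper's generalized go-$m$HC parameterization; the stream count `n` here is an arbitrary illustrative choice):

```python
import numpy as np

# Illustrative sketch of the classical orthostochastic construction:
# if Q is orthogonal, then B with B_ij = Q_ij^2 is doubly stochastic,
# because each row and column of Q has unit Euclidean norm.
rng = np.random.default_rng(0)
n = 4  # number of residual streams (hypothetical choice)

# Draw a random orthogonal Q via QR decomposition of a Gaussian matrix.
Q, _ = np.linalg.qr(rng.normal(size=(n, n)))

# Elementwise square gives a nonnegative matrix whose rows and columns
# each sum to 1, i.e. a point in the Birkhoff polytope.
B = Q ** 2

print(np.allclose(B.sum(axis=0), 1.0))  # columns sum to 1
print(np.allclose(B.sum(axis=1), 1.0))  # rows sum to 1
```

For $n \ge 3$ the plain orthostochastic matrices form a strict subset of the Birkhoff polytope, which is why an exact parameterization requires the generalized variant the paper develops.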
Source: arXiv:2604.02309v1 - http://arxiv.org/abs/2604.02309v1 PDF: https://arxiv.org/pdf/2604.02309v1 Original Link: http://arxiv.org/abs/2604.02309v1