Research Paper | Researchia:202602.24080 | [Data Science > Machine Learning]

Subgroups of $U(d)$ Induce Natural RNN and Transformer Architectures

Joshua Nunley

Abstract

This paper presents a direct framework for sequence models whose hidden states live on closed subgroups of U(d). Starting from a minimal axiomatic setup, we derive recurrent and transformer templates from a shared skeleton in which the choice of subgroup determines the state space, tangent projection, and update map as drop-in components. We then specialize to O(d) and evaluate orthogonal-state RNN and transformer models on Tiny Shakespeare and Penn Treebank under parameter-matched settings. We also report a general linear-mixing extension in tangent space, which applies across subgroup choices and improves finite-budget performance in the current O(d) experiments.
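As a concrete illustration of the abstract's recipe, here is a minimal sketch (not the paper's implementation) of one recurrent step with a hidden state constrained to O(d): an input-dependent matrix is projected onto the tangent space at the identity (the skew-symmetric matrices), exponentiated into the group, and applied to the state. The parameterization of the generator from the input (`W @ np.outer(x, x)`) and the function names are hypothetical choices for this sketch.

```python
import numpy as np
from scipy.linalg import expm

def skew(M):
    """Tangent projection for O(d) at the identity: skew-symmetric part of M."""
    return 0.5 * (M - M.T)

def orthogonal_rnn_step(H, x, W):
    """One recurrent update that keeps H in O(d).

    H: (d, d) orthogonal hidden state
    x: (d,)   input vector
    W: (d, d) learned weights (hypothetical parameterization)
    """
    G = W @ np.outer(x, x)   # unconstrained (d, d) update built from the input
    A = skew(G)              # project onto the tangent space (Lie algebra so(d))
    return expm(A) @ H       # expm(A) is orthogonal, so the product stays in O(d)

# Usage: the state remains orthogonal after the update.
d = 4
rng = np.random.default_rng(0)
H1 = orthogonal_rnn_step(np.eye(d), rng.standard_normal(d), rng.standard_normal((d, d)))
print(np.abs(H1.T @ H1 - np.eye(d)).max())  # near machine precision
```

Swapping the subgroup means swapping `skew` (the tangent projection) and the exponential map; the abstract's linear-mixing extension would combine several tangent directions with learned coefficients before the `expm` call.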


Source: arXiv:2602.18417v1 - http://arxiv.org/abs/2602.18417v1
PDF: https://arxiv.org/pdf/2602.18417v1

Submission: 2/24/2026
Subjects: Machine Learning; Data Science