
Perceptrons and localization of attention's mean-field landscape

Antonio Álvarez-López

Abstract

The forward pass of a Transformer can be seen as an interacting particle system on the unit sphere: time plays the role of layers, particles that of token embeddings, and the unit sphere idealizes layer normalization. In some weight settings the system can even be seen as a gradient flow for an explicit energy, and one can make sense of the infinite context length (mean-field) limit thanks to Wasserstein gradient flows. In this paper we study the effect of the perceptron block in this setting, and show that critical points are generically atomic and localized on subsets of the sphere.
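The particle-system view described above can be illustrated with a minimal numerical sketch. The code below is an assumption-laden toy model, not the paper's construction: it implements simplified self-attention dynamics on the unit sphere, where each particle (token embedding) drifts toward an attention-weighted average of the others, projected onto the tangent space of the sphere to idealize layer normalization. The function name `attention_step`, the inverse-temperature parameter `beta`, and the Euler discretization are all illustrative choices.

```python
import numpy as np

def attention_step(X, beta=1.0, dt=0.1):
    """One Euler step of toy self-attention dynamics on the unit sphere.

    X: (n, d) array of token embeddings, each row a unit vector.
    Each particle moves toward a softmax-weighted average of all
    particles, projected onto its tangent space (idealized layer norm).
    """
    # attention weights: row-wise softmax of pairwise inner products
    logits = beta * X @ X.T                        # (n, n)
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    W = np.exp(logits)
    W /= W.sum(axis=1, keepdims=True)
    drift = W @ X                                  # weighted averages
    # remove the radial component so motion stays tangent to the sphere
    radial = (drift * X).sum(axis=1, keepdims=True) * X
    X = X + dt * (drift - radial)
    # renormalize back onto the unit sphere
    return X / np.linalg.norm(X, axis=1, keepdims=True)

# evolve a random cloud of tokens; with time playing the role of depth,
# particles typically concentrate on a small subset of the sphere
rng = np.random.default_rng(0)
X = rng.standard_normal((8, 3))
X /= np.linalg.norm(X, axis=1, keepdims=True)
for _ in range(200):
    X = attention_step(X)
```

Running the loop long enough usually shows the clustering phenomenon that motivates studying the localized, atomic critical points of the associated mean-field energy.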


Source: arXiv:2601.21366v1 - http://arxiv.org/abs/2601.21366v1
PDF: https://arxiv.org/pdf/2601.21366v1

Submission: 1/29/2026
Subjects: Mathematics; Optimization

