Research Paper

Linear Readout of Neural Manifolds with Continuous Variables

Will Slatton

Abstract

Brains and artificial neural networks compute with continuous variables such as object position or stimulus orientation. However, the complex variability in neural responses makes it difficult to link internal representational structure to task performance. We develop a statistical-mechanical theory of regression capacity that relates the linear decoding efficiency of continuous variables to geometric properties of neural manifolds. Our theory handles complex neural variability and applies to real data, revealing increasing capacity for decoding object position and size along the monkey visual stream.

Submitted: March 12, 2026. Subject: Neuroscience.
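To make the idea of a linear readout of a continuous variable concrete, here is a minimal sketch, not the paper's theory: synthetic neurons with hypothetical cosine tuning curves (the tuning model, noise level, and ridge penalty are all illustrative assumptions) respond to an orientation-like variable, and a linear decoder is fit by closed-form ridge regression.

```python
import numpy as np

# Illustrative sketch only: decode a continuous circular variable
# (e.g., stimulus orientation) from a synthetic neural population
# with a linear readout trained by ridge regression.
rng = np.random.default_rng(0)

n_neurons, n_train, n_test = 200, 500, 200
theta_train = rng.uniform(-np.pi, np.pi, n_train)
theta_test = rng.uniform(-np.pi, np.pi, n_test)

# Hypothetical population: cosine tuning with random preferred
# orientations plus additive Gaussian response variability.
prefs = rng.uniform(-np.pi, np.pi, n_neurons)

def responses(theta):
    clean = np.cos(theta[:, None] - prefs[None, :])
    return clean + 0.3 * rng.standard_normal((len(theta), n_neurons))

X_train, X_test = responses(theta_train), responses(theta_test)

# Read out (cos θ, sin θ) so the circular variable decodes without a
# wrap-around discontinuity; W is the closed-form ridge solution.
Y_train = np.column_stack([np.cos(theta_train), np.sin(theta_train)])
lam = 1.0
W = np.linalg.solve(X_train.T @ X_train + lam * np.eye(n_neurons),
                    X_train.T @ Y_train)

Y_hat = X_test @ W
theta_hat = np.arctan2(Y_hat[:, 1], Y_hat[:, 0])

# Mean absolute circular decoding error, in radians.
err = np.abs(np.angle(np.exp(1j * (theta_hat - theta_test))))
print(f"mean |error|: {err.mean():.3f} rad")
```

In the paper's framing, decoding efficiency under such a linear readout is what the regression-capacity theory ties back to the geometry of the underlying neural manifold; this toy setup only shows the readout itself.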


Source: arXiv:2603.10956v1 (http://arxiv.org/abs/2603.10956v1)
PDF: https://arxiv.org/pdf/2603.10956v1

