Research Paper · Researchia: 202603.05019 [Neuroscience]

A Dynamical Theory of Sequential Retrieval in Input-Driven Hopfield Networks

Simone Betteti

Abstract

Reasoning is the ability to integrate internal states and external inputs in a meaningful and semantically consistent flow. Contemporary machine learning (ML) systems increasingly rely on such sequential reasoning, from language understanding to multi-modal generation, often operating over dictionaries of prototypical patterns reminiscent of associative memory models. Understanding retrieval and sequentiality in associative memory models provides a powerful bridge to gain insight into ML reasoning. While the static retrieval properties of associative memory models are well understood, the theoretical foundations of sequential retrieval and multi-memory integration remain limited, with existing studies largely relying on numerical evidence. This work develops a dynamical theory of sequential reasoning in Hopfield networks. We consider the recently proposed input-driven plasticity (IDP) Hopfield network and analyze a two-timescale architecture coupling fast associative retrieval with slow reasoning dynamics. We derive explicit conditions for self-sustained memory transitions, including gain thresholds, escape times, and collapse regimes. Together, these results provide a principled mathematical account of sequentiality in associative memory models, bridging classical Hopfield dynamics and modern reasoning architectures.
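The fast associative-retrieval stage described in the abstract can be illustrated with a classical Hopfield network driven by an external input term. This is a minimal generic sketch, not the paper's input-driven plasticity (IDP) model or its two-timescale architecture: patterns are stored with a standard Hebbian rule, and `external_input` stands in for the external drive that, in the paper's setting, would be supplied by the slow reasoning dynamics.

```python
import numpy as np

rng = np.random.default_rng(0)

# Store P random binary (+/-1) patterns in an N-neuron Hopfield network
# using the standard Hebbian outer-product rule (zero diagonal).
N, P = 100, 3
patterns = rng.choice([-1, 1], size=(P, N))
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)

def retrieve(state, external_input=None, steps=20):
    """Fast associative retrieval: iterate sign(W @ x + I) to a fixed point.

    `external_input` is a per-neuron bias; in an input-driven setting it
    would be set by a slower process, but here it is just an optional vector.
    """
    I = np.zeros(N) if external_input is None else external_input
    for _ in range(steps):
        new = np.sign(W @ state + I)
        new[new == 0] = 1  # break ties deterministically
        if np.array_equal(new, state):
            break
        state = new
    return state

# Corrupt pattern 0 by flipping 10 of its 100 bits, then retrieve it.
probe = patterns[0].copy()
flip = rng.choice(N, size=10, replace=False)
probe[flip] *= -1
recalled = retrieve(probe)
overlap = float(recalled @ patterns[0]) / N  # 1.0 means perfect recall
```

At this low memory load (3 patterns for 100 neurons, well below the classical capacity of roughly 0.14 N), the corrupted probe falls back into the stored pattern's basin of attraction and the overlap returns to 1.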


Source: arXiv:2603.03201v1 (http://arxiv.org/abs/2603.03201v1)
PDF: https://arxiv.org/pdf/2603.03201v1

Submission: 3/5/2026
Subjects: Neuroscience