Working Memory in a Recurrent Spiking Neural Network With Heterogeneous Synaptic Delays
Abstract
Working memory -- the ability to store and recall precise temporal patterns of neural activity -- remains an open challenge for spiking neural networks (SNNs). We propose a recurrent SNN of $N$ neurons in which each synapse is equipped with $D = 41$ delays, modelled as a weight tensor $\mathbf{W} \in \mathbb{R}^{N \times N \times D}$ and trained end-to-end with surrogate-gradient backpropagation through time. The network stores $M$ arbitrary target spike patterns by representing each as a sequential chain of overlapping Spiking Motifs: contiguous windows that uniquely predict the spikes at the next time step. On a synthetic benchmark of $M$ such patterns over $N$ neurons, training recalls the stored patterns (measured by mean F1 score), with recall emerging first near the clamped initialisation window and propagating forward in time. This result demonstrates that heterogeneous delays provide an efficient substrate for working memory in SNNs, enabling energy-efficient neuromorphic edge deployment.
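To make the delay-tensor formulation concrete, here is a minimal NumPy sketch of the forward dynamics: leaky integrate-and-fire neurons driven by recurrent input summed over all $D$ synaptic delays, with an initial spike window clamped as in the paper's recall setup. The sizes other than $D = 41$, the LIF parameters, and the weight scale are illustrative assumptions, not values from the paper; the paper trains $\mathbf{W}$ with surrogate-gradient BPTT, whereas this sketch only simulates inference.

```python
import numpy as np

rng = np.random.default_rng(0)

# D = 41 follows the paper; N, T, and the LIF constants are
# hypothetical choices made for this illustration.
N, D, T = 32, 41, 100
tau, v_th = 20.0, 1.0
clamp = 5  # length of the clamped initialisation window (assumed)

# Per-synapse delayed weights: W[i, j, d] connects presynaptic
# neuron j to postsynaptic neuron i with a delay of d time steps.
W = rng.normal(0.0, 0.05, size=(N, N, D))

spikes = np.zeros((T, N))  # spike raster, one row per time step
v = np.zeros(N)            # membrane potentials
spikes[:clamp] = (rng.random((clamp, N)) < 0.2).astype(float)

for t in range(clamp, T):
    # Delayed recurrent drive: sum over delays d of
    # W[:, :, d] @ spikes[t - 1 - d].
    i_rec = np.zeros(N)
    for d in range(min(D, t)):
        i_rec += W[:, :, d] @ spikes[t - 1 - d]
    v = v * np.exp(-1.0 / tau) + i_rec   # leaky integration
    s = (v >= v_th).astype(float)        # hard threshold (Heaviside)
    v = np.where(s > 0, 0.0, v)          # reset fired neurons
    spikes[t] = s
```

During training, the non-differentiable Heaviside step would be replaced in the backward pass by a smooth surrogate (e.g. a fast-sigmoid derivative), which is what makes end-to-end backpropagation through time possible over the full $N \times N \times D$ tensor.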
Source: arXiv:2604.14096v1 - http://arxiv.org/abs/2604.14096v1 (PDF: https://arxiv.org/pdf/2604.14096v1)
Apr 17, 2026
Neuroscience