Research Paper · Researchia: 202604.23010

Working Memory Constraints Scaffold Learning in Transformers under Data Scarcity

Pranava Madhyastha

Abstract

We investigate the integration of human-like working memory constraints into the Transformer architecture and implement several cognitively inspired attention variants, including fixed-width window-based and temporal-decay-based attention mechanisms. Our modified GPT-2 models are trained from scratch on developmentally plausible datasets (10M and 100M words). Performance is evaluated on grammatical acceptability judgment tasks (BLiMP) and on alignment with human reading time data. Our results indicate that these cognitively inspired constraints, particularly fixed-width attention, can significantly improve grammatical accuracy, especially when training data is scarce. These constrained models also tend to show stronger alignment with human processing metrics. The findings suggest that such constraints may serve as a beneficial inductive bias, guiding models towards more robust linguistic representations, especially in data-limited settings.
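The page contains no code, but the two constraints named in the abstract are easy to picture. Below is a minimal, hypothetical PyTorch sketch of how a fixed-width (sliding-window) causal mask and a temporal-decay logit bias could be applied in scaled dot-product attention. The function and parameter names (windowed_causal_mask, decay_bias, window, decay_rate) are assumptions made for illustration, not the authors' implementation, and the exact functional form of the decay in the paper may differ.

```python
# Illustrative sketch only; not the paper's actual code.
import math
import torch
import torch.nn.functional as F

def windowed_causal_mask(seq_len: int, window: int) -> torch.Tensor:
    """Fixed-width variant: each query sees only the `window` most recent tokens."""
    pos = torch.arange(seq_len)
    dist = pos[:, None] - pos[None, :]      # how far each key lies in the past
    return (dist >= 0) & (dist < window)    # causal AND within the window

def decay_bias(seq_len: int, decay_rate: float) -> torch.Tensor:
    """Temporal-decay variant: logits are penalized linearly in distance,
    so attention weights decay exponentially with how old a token is."""
    pos = torch.arange(seq_len)
    dist = (pos[:, None] - pos[None, :]).clamp(min=0).float()
    return -decay_rate * dist

def attend(q, k, v, mask=None, bias=None):
    """Scaled dot-product attention with an optional mask and logit bias.
    q, k, v: (batch, heads, seq_len, head_dim)."""
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))
    if bias is not None:
        scores = scores + bias
    if mask is not None:
        scores = scores.masked_fill(~mask, float("-inf"))
    return F.softmax(scores, dim=-1) @ v
```

For a sequence of length L, the fixed-width variant would be attend(q, k, v, mask=windowed_causal_mask(L, window=8)), while the decay variant keeps the full causal context but down-weights distant tokens: attend(q, k, v, mask=windowed_causal_mask(L, window=L), bias=decay_bias(L, 0.1)). The window size and decay rate here are arbitrary placeholder values.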

Submitted: April 23, 2026
Subjects: Artificial Intelligence (AI)

Source: arXiv:2604.20789v1 (http://arxiv.org/abs/2604.20789v1)
PDF: https://arxiv.org/pdf/2604.20789v1

