
Don't Look Back in Anger: MAGIC Net for Streaming Continual Learning with Temporal Dependence

Federico Giannini

Abstract

Concept drift, temporal dependence, and catastrophic forgetting represent major challenges when learning from data streams. While Streaming Machine Learning and Continual Learning (CL) address these issues separately, recent efforts in Streaming Continual Learning (SCL) aim to unify them. In this work, we introduce MAGIC Net, a novel SCL approach that integrates CL-inspired architectural strategies with recurrent neural networks to tame temporal dependence. MAGIC Net continuously learns, looks back at past knowledge by applying learnable masks over frozen weights, and expands its architecture when necessary. It performs all operations online, ensuring inference availability at all times. Experiments on synthetic and real-world streams show that it improves adaptation to new concepts, limits memory usage, and mitigates forgetting.
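The abstract's core mechanism, reusing past knowledge by applying learnable masks over frozen weights, can be sketched minimally. The snippet below is a hypothetical illustration of that idea for a single linear layer, not the paper's actual formulation: the frozen matrix `W_frozen`, the mask scores, and the sigmoid gating are all assumptions for exposition.

```python
import numpy as np

rng = np.random.default_rng(0)

# Weights learned on a past concept, then frozen (hypothetical single layer).
W_frozen = rng.standard_normal((4, 3))

# Learnable real-valued mask scores; only these would be trained on the
# current concept, leaving W_frozen untouched (so past knowledge is preserved).
mask_scores = np.zeros_like(W_frozen)

def masked_forward(x: np.ndarray) -> np.ndarray:
    """Forward pass through the frozen layer, gated by a soft mask in (0, 1)."""
    gate = 1.0 / (1.0 + np.exp(-mask_scores))  # sigmoid of mask scores
    return x @ (W_frozen * gate).T

x = rng.standard_normal((2, 3))   # a mini-batch of 2 inputs
y = masked_forward(x)
print(y.shape)  # (2, 4)
```

With all mask scores at zero the gate is uniformly 0.5, so training the scores lets the model selectively amplify or suppress individual frozen weights per concept; a recurrent variant would apply the same gating to the recurrent weight matrices.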


Source: arXiv:2603.08600v1 (http://arxiv.org/abs/2603.08600v1) · PDF: https://arxiv.org/pdf/2603.08600v1

Submission: 3/11/2026
Subjects: Artificial Intelligence (AI)

