Research Paper · Researchia:202602.03118 · [Data Science > Machine Learning]

PLATE: Plasticity-Tunable Efficient Adapters for Geometry-Aware Continual Learning

Romain Cosentino

Abstract

We develop a continual learning method for pretrained models that \emph{requires no access to old-task data}, addressing a practical barrier in foundation model adaptation where pretraining distributions are often unavailable. Our key observation is that pretrained networks exhibit substantial \emph{geometric redundancy}, and that this redundancy can be exploited in two complementary ways. First, redundant neurons provide a proxy for dominant pretraining-era feature directions, enabling the construction of approximately protected update subspaces directly from pretrained weights. Second, redundancy offers a natural bias for \emph{where} to place plasticity: by restricting updates to a subset of redundant neurons and constraining the remaining degrees of freedom, we obtain update families with reduced functional drift on the old-data distribution and improved worst-case retention guarantees. These insights lead to \textsc{PLATE} (\textbf{Pla}sticity-\textbf{T}unable \textbf{E}fficient Adapters), a continual learning method requiring no past-task data that provides explicit control over the plasticity-retention trade-off. PLATE parameterizes each layer with a structured low-rank update $\Delta W = B A Q^\top$, where $B$ and $Q$ are computed once from pretrained weights and kept frozen, and only $A$ is trained on the new task. The code is available at https://github.com/SalesforceAIResearch/PLATE.
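The structured low-rank update can be sketched in a few lines. The following is a minimal illustration only: the abstract does not specify how $B$ and $Q$ are constructed from the pretrained weights, so this sketch derives them from an SVD of the pretrained weight matrix as a stand-in (the names `effective_weight`, the rank `r`, and the SVD-based construction are assumptions, not the paper's method). The key structural point it demonstrates is that $B$ and $Q$ are fixed up front and only the small $r \times r$ matrix $A$ carries trainable parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy pretrained weight matrix for one layer.
d_out, d_in, r = 8, 6, 2
W = rng.standard_normal((d_out, d_in))

# Hypothetical construction of the frozen factors: take r singular
# directions of the pretrained W via an SVD. (PLATE derives B and Q
# from geometric redundancy in the pretrained weights; the exact
# procedure is not given in the abstract, so SVD is a placeholder.)
U, s, Vt = np.linalg.svd(W, full_matrices=False)
B = U[:, -r:]        # (d_out, r), computed once and frozen
Q = Vt[-r:, :].T     # (d_in, r), computed once and frozen

# Only A (r x r) is trained on the new task; initializing it to zero
# means the adapted layer starts exactly at the pretrained weights.
A = np.zeros((r, r))

def effective_weight(A):
    """Adapted layer weight W + ΔW with ΔW = B A Qᵀ."""
    return W + B @ A @ Q.T
```

Because the trainable block has only $r^2$ entries, the per-layer parameter count is decoupled from both $d_{\text{out}}$ and $d_{\text{in}}$, and freezing $B$ and $Q$ confines every update to the fixed subspace they span.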


Source: arXiv:2602.03846v1 (http://arxiv.org/abs/2602.03846v1)
PDF: https://arxiv.org/pdf/2602.03846v1

Submission: 2/3/2026
Subjects: Machine Learning; Data Science
