Research Paper · Researchia: 202601.29199 · [Statistics & ML > Statistics]

LoRA and Privacy: When Random Projections Help (and When They Don't)

Yaxi Hu

Abstract

We introduce the (Wishart) projection mechanism, a randomized map of the form $S \mapsto M f(S)$ with $M \sim W_d(\tfrac{1}{r} I_d, r)$, and study its differential privacy properties. For vector-valued queries $f$, we prove non-asymptotic DP guarantees without any additive noise, showing that Wishart randomness alone can suffice. For matrix-valued queries, however, we establish a sharp negative result: in the noise-free setting, the mechanism is not DP, and we demonstrate its vulnerability by implementing a near-perfect membership inference attack (AUC $> 0.99$). We then analyze a noisy variant and prove privacy amplification due to randomness and low-rank projection, in both large- and small-rank regimes, yielding stronger privacy guarantees than additive noise alone. Finally, we show that LoRA-style updates are an instance of the matrix-valued mechanism, implying that LoRA is not inherently private despite its built-in randomness, but that low-rank fine-tuning can be more private than full fine-tuning at the same noise level. Preliminary experiments suggest that tighter accounting enables lower noise and improved accuracy in practice.
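The mechanism in the abstract can be sketched concretely. A draw $M \sim W_d(\tfrac{1}{r} I_d, r)$ can be generated as $M = \tfrac{1}{r} Z Z^\top$ for a standard normal $d \times r$ matrix $Z$, so that $\mathbb{E}[M] = I_d$. The snippet below is a minimal illustration of this sampling step applied to a vector-valued query output; the function name and interface are hypothetical, not taken from the paper's code, and no claim about its privacy calibration is made here.

```python
import numpy as np

def wishart_projection(f_S, r, seed=None):
    """Apply the projection mechanism f(S) -> M f(S),
    with M ~ W_d((1/r) I_d, r), to a query value f(S) in R^d.

    Hypothetical helper for illustration only.
    """
    rng = np.random.default_rng(seed)
    d = f_S.shape[0]
    # M = (1/r) Z Z^T with Z a standard normal d x r matrix:
    # each column of Z / sqrt(r) is N(0, (1/r) I_d), so E[M] = I_d.
    Z = rng.standard_normal((d, r))
    M = (Z @ Z.T) / r
    return M @ f_S

# Example: a vector-valued query output in R^5
v = np.ones(5)
out = wishart_projection(v, r=1000, seed=0)
```

For large $r$, $M$ concentrates around $I_d$, so the output stays close to $f(S)$ while the Wishart randomness itself perturbs the release; the paper's vector-valued result is that this randomness alone can yield DP guarantees without additive noise.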


Source: arXiv:2601.21719v1 (http://arxiv.org/abs/2601.21719v1)
PDF: https://arxiv.org/pdf/2601.21719v1

Submission: 1/29/2026
Comments: 0
Subjects: Statistics; Statistics & ML

