Research Paper | Researchia: 202603.24066

Gumbel Distillation for Parallel Text Generation

Chi Zhang


Submitted: March 24, 2026
Subjects: Machine Learning; Data Science

Abstract

The slow, sequential nature of autoregressive (AR) language models has driven the adoption of parallel decoding methods. However, these non-AR models often sacrifice generation quality because they struggle to model the complex joint distribution of token sequences. To narrow this performance gap, we introduce Gumbel Distillation, a novel distillation technique that enables parallel decoders to learn this distribution effectively. Our method leverages the Gumbel-Max trick to create a deterministic mapping from a latent Gumbel noise space to the output tokens of a high-performing AR teacher. As a model-agnostic technique, Gumbel Distillation integrates seamlessly with diverse parallel decoding architectures, including MDLM and BD3-LM. Experiments on LM1B and OpenWebText show that Gumbel Distillation substantially improves the generation quality of parallel language models, achieving a 30.0% improvement in MAUVE score and a 10.5% improvement in generative perplexity over MDLM trained on the OpenWebText dataset. Code is available at https://github.com/hxixixh/gumbel-distill.
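The abstract does not give implementation details, but the Gumbel-Max trick it builds on is standard and easy to illustrate. The sketch below (function names and values are illustrative, not from the paper) shows the two properties the method relies on: sampling `argmax(logits + g)` with fresh Gumbel noise reproduces the softmax distribution exactly, while fixing the noise `g` makes the noise-to-token mapping deterministic, which is what a parallel student could be trained to imitate.

```python
import numpy as np

def gumbel_max_sample(logits: np.ndarray, gumbel_noise: np.ndarray) -> int:
    # Gumbel-Max trick: argmax(logits + g) with g ~ Gumbel(0, 1)
    # is an exact sample from softmax(logits).
    return int(np.argmax(logits + gumbel_noise))

rng = np.random.default_rng(0)
logits = np.array([2.0, 1.0, 0.1])

# With fresh noise each draw, empirical frequencies match softmax(logits).
samples = [gumbel_max_sample(logits, rng.gumbel(size=3)) for _ in range(20000)]
freqs = np.bincount(samples, minlength=3) / len(samples)
softmax = np.exp(logits) / np.exp(logits).sum()

# With the noise held fixed, the mapping noise -> token is deterministic:
# this is the latent-space-to-output mapping the abstract refers to.
g = rng.gumbel(size=3)
assert gumbel_max_sample(logits, g) == gumbel_max_sample(logits, g)
```

In a distillation setting, one would draw a noise vector per position, record the teacher's resulting token, and train the parallel decoder to map the same noise to the same token; the exact loss and conditioning scheme are described in the paper, not here.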


Source: arXiv:2603.22216v1 (http://arxiv.org/abs/2603.22216v1)
PDF: https://arxiv.org/pdf/2603.22216v1

