Research Paper · Researchia:202604.15052

KL Divergence Between Gaussians: A Step-by-Step Derivation for the Variational Autoencoder Objective

Andrés Muñoz

Abstract

Kullback-Leibler (KL) divergence is a fundamental concept in information theory that quantifies the discrepancy between two probability distributions. In the context of Variational Autoencoders (VAEs), it serves as a central regularization term, imposing structure on the latent space and thereby enabling the model to exhibit generative capabilities. In this work, we present a detailed derivation of the closed-form expression for the KL divergence between Gaussian distributions, a case of particular importance in practical VAE implementations. Starting from the general definition for continuous random variables, we derive the expression for the univariate case and extend it to the multivariate setting under the assumption of diagonal covariance. Finally, we discuss the interpretation of each term in the resulting expression and its impact on the training dynamics of the model.
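For reference, the closed-form expressions the abstract refers to are standard results. A sketch of the univariate case, and of the diagonal-covariance case measured against a standard normal prior (the usual VAE setup, assumed here since the abstract does not state the prior explicitly):

```latex
% KL divergence between two univariate Gaussians,
% q = N(mu_1, sigma_1^2) and p = N(mu_2, sigma_2^2):
\[
D_{\mathrm{KL}}\bigl(\mathcal{N}(\mu_1,\sigma_1^2)\,\|\,\mathcal{N}(\mu_2,\sigma_2^2)\bigr)
  = \ln\frac{\sigma_2}{\sigma_1}
  + \frac{\sigma_1^2 + (\mu_1-\mu_2)^2}{2\sigma_2^2}
  - \frac{1}{2}
\]

% Multivariate case with diagonal covariance, against the standard
% normal prior p(z) = N(0, I) commonly used in VAEs:
\[
D_{\mathrm{KL}}\bigl(\mathcal{N}(\boldsymbol{\mu},\operatorname{diag}(\boldsymbol{\sigma}^2))\,\|\,\mathcal{N}(\mathbf{0},I)\bigr)
  = \frac{1}{2}\sum_{j=1}^{d}\bigl(\mu_j^2 + \sigma_j^2 - \ln\sigma_j^2 - 1\bigr)
\]
```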
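In a VAE implementation this diagonal-covariance term is typically computed from the encoder's mean and log-variance outputs. A minimal NumPy sketch (function and variable names are illustrative, not taken from the paper):

```python
import numpy as np

def kl_diag_gaussian_vs_standard_normal(mu, logvar):
    """KL( N(mu, diag(exp(logvar))) || N(0, I) ), one value per sample.

    mu, logvar: arrays of shape (batch, latent_dim), i.e. the encoder
    outputs, with logvar = log(sigma^2).
    """
    # 0.5 * sum_j (mu_j^2 + sigma_j^2 - log sigma_j^2 - 1)
    return 0.5 * np.sum(mu**2 + np.exp(logvar) - logvar - 1.0, axis=-1)

# Example: one sample in a 2-dimensional latent space.
mu = np.array([[0.5, -1.0]])
logvar = np.array([[0.0, 0.2]])
print(kl_diag_gaussian_vs_standard_normal(mu, logvar))  # approx. [0.64]
```

Each summand vanishes when the corresponding latent dimension matches the prior (mu_j = 0, sigma_j = 1) and grows as the posterior drifts away, which is the regularizing effect on the latent space the abstract describes.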

Submitted: April 15, 2026
Subjects: Machine Learning; Data Science



Source: arXiv:2604.11744v1 (http://arxiv.org/abs/2604.11744v1)
PDF: https://arxiv.org/pdf/2604.11744v1

