A Theoretical and Empirical Taxonomy of Imbalance in Binary Classification
Abstract
Class imbalance significantly degrades classification performance, yet its effects are rarely analyzed from a unified theoretical perspective. We propose a principled framework based on three fundamental scales: the imbalance coefficient, the sample-dimension ratio, and the intrinsic separability. Starting from the Gaussian Bayes classifier, we derive closed-form Bayes errors and show how imbalance shifts the discriminant boundary, yielding a deterioration slope that predicts four regimes: Normal, Mild, Extreme, and Catastrophic. Using a balanced high-dimensional genomic dataset, we vary only the imbalance coefficient while keeping the sample-dimension ratio and the separability fixed. Across parametric and non-parametric models, empirical degradation closely follows the theoretical predictions: minority Recall collapses once the imbalance coefficient exceeds a critical threshold, Precision increases asymmetrically, and F1-score and PR-AUC decline in line with the predicted regimes. These results show that this triplet of scales provides a model-agnostic, geometrically grounded explanation of imbalance-induced deterioration.
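The boundary-shift mechanism the abstract describes can be illustrated with a minimal sketch. Assuming (these specifics are illustrative, not the paper's exact setup) two univariate unit-variance Gaussian classes with means separated by a distance `d` and a minority prior `pi`, the Bayes-optimal threshold moves away from the minority mean as `pi` shrinks, and minority recall collapses accordingly:

```python
# Sketch, not the paper's derivation: Bayes threshold for two univariate
# Gaussians N(-d/2, 1) (majority) vs N(+d/2, 1) (minority), minority prior pi.
from math import log, erf, sqrt

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def bayes_threshold(d, pi):
    # Decide "minority" when x > t; equating posterior densities gives
    # t = log((1 - pi) / pi) / d, which grows as pi -> 0 (boundary shift).
    return log((1.0 - pi) / pi) / d

def minority_recall(d, pi):
    # P(X > t | minority), with X ~ N(+d/2, 1).
    t = bayes_threshold(d, pi)
    return 1.0 - norm_cdf(t - d / 2.0)

# Recall of the minority class collapses as the prior shrinks.
for pi in (0.5, 0.1, 0.01, 0.001):
    print(f"pi={pi:<6} minority recall={minority_recall(2.0, pi):.3f}")
```

At balance (`pi = 0.5`) the threshold sits midway between the means; as `pi` decreases the threshold drifts into the minority class's region, reproducing qualitatively the recall collapse the abstract reports.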