Research Paper · Researchia: 202603.11034 · [Mathematics > Mathematics]

OptEMA: Adaptive Exponential Moving Average for Stochastic Optimization with Zero-Noise Optimality

Ganzhao Yuan

Abstract

The Exponential Moving Average (EMA) is a cornerstone of widely used optimizers such as Adam. However, existing theoretical analyses of Adam-style methods have notable limitations: their guarantees can remain suboptimal in the zero-noise regime, rely on restrictive boundedness conditions (e.g., bounded gradients or objective gaps), use constant or open-loop stepsizes, or require prior knowledge of Lipschitz constants. To overcome these bottlenecks, we introduce OptEMA and analyze two novel variants: OptEMA-M, which applies an adaptive, decreasing EMA coefficient to the first-order moment with a fixed second-order decay, and OptEMA-V, which swaps these roles. Crucially, OptEMA is closed-loop and Lipschitz-free in the sense that its effective stepsizes are trajectory-dependent and do not require the Lipschitz constant for parameterization. Under standard stochastic gradient descent (SGD) assumptions, namely smoothness, a lower-bounded objective, and unbiased gradients with bounded variance, we establish rigorous convergence guarantees. Both variants achieve a noise-adaptive convergence rate of Õ(T^{-1/2} + σ^{1/2} T^{-1/4}) for the average gradient norm, where σ is the noise level. In particular, in the zero-noise regime where σ = 0, our bounds reduce to the nearly optimal deterministic rate Õ(T^{-1/2}) without manual hyperparameter retuning.
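To make the OptEMA-M idea concrete, the sketch below shows an Adam-style update in which the first-moment EMA uses a time-decreasing weight on the fresh gradient while the second moment keeps a fixed decay. The schedule `1/sqrt(t+1)` is purely illustrative (the abstract does not state the paper's actual closed-loop schedule), and all function and variable names here are hypothetical.

```python
import numpy as np

def optema_m_step(theta, grad, m, v, t, lr=1e-2, beta2=0.999, eps=1e-8):
    """One Adam-style step with an adaptive first-moment EMA.

    Illustrative sketch only: the weight on the new gradient decays as
    1/sqrt(t+1), which is one plausible reading of a "decreasing EMA
    coefficient"; the paper's true trajectory-dependent schedule may differ.
    """
    w_t = 1.0 / np.sqrt(t + 1.0)          # hypothetical decreasing weight
    m = (1.0 - w_t) * m + w_t * grad      # first moment: adaptive EMA
    v = beta2 * v + (1.0 - beta2) * grad**2  # second moment: fixed decay
    theta = theta - lr * m / (np.sqrt(v) + eps)
    return theta, m, v

# Usage: minimize f(x) = ||x||^2 with exact (zero-noise) gradients.
theta = np.ones(3)
m, v = np.zeros(3), np.zeros(3)
for t in range(5000):
    grad = 2.0 * theta                    # exact gradient of ||x||^2
    theta, m, v = optema_m_step(theta, grad, m, v, t)
```

OptEMA-V would swap the roles above: a fixed first-moment decay and an adaptive, decreasing schedule on the second-moment EMA.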


Source: arXiv:2603.09923v1 (http://arxiv.org/abs/2603.09923v1)
PDF: https://arxiv.org/pdf/2603.09923v1

Submission: 3/11/2026
Subjects: Mathematics
