Research Paper · Researchia:202603.06004 · [Data Science > Machine Learning]

Cheap Thrills: Effective Amortized Optimization Using Inexpensive Labels

Khai Nguyen

Abstract

To scale the solution of optimization and simulation problems, prior work has explored machine-learning surrogates that inexpensively map problem parameters to corresponding solutions. Commonly used approaches, including supervised and self-supervised learning with either soft or hard feasibility enforcement, face inherent challenges such as reliance on expensive, high-quality labels or difficult optimization landscapes. To address their trade-offs, we propose a novel framework that first collects "cheap" imperfect labels, then performs supervised pretraining, and finally refines the model through self-supervised learning to improve overall performance. Our theoretical analysis and merit-based criterion show that labeled data need only place the model within a basin of attraction, confirming that only modest numbers of inexact labels and training epochs are required. We empirically validate our simple three-stage strategy across challenging domains, including nonconvex constrained optimization, power-grid operation, and stiff dynamical systems, and show that it yields faster convergence; improved accuracy, feasibility, and optimality; and up to 59x reductions in total offline cost.
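The three-stage strategy described above can be illustrated with a toy sketch. This is not the paper's implementation; the objective, surrogate, and hyperparameters below are all hypothetical stand-ins. For a scalar problem f(x; p) = (x - 2p)^2 with true amortized solution x*(p) = 2p, we collect cheap, inexact labels (here, a biased and noisy approximation), supervised-pretrain a linear surrogate on them, then refine it self-supervised by minimizing f directly:

```python
import numpy as np

rng = np.random.default_rng(0)
params = rng.uniform(-1, 1, size=256)   # problem parameters p
true_w = 2.0                            # x*(p) = 2p minimizes f(x; p) = (x - 2p)^2

# Stage 1: collect "cheap" imperfect labels (e.g., from truncated inner solves).
# Here we mimic inexactness with a biased slope (1.8 instead of 2.0) plus noise.
cheap_labels = 1.8 * params + 0.05 * rng.normal(size=params.size)

# Linear surrogate x_hat(p) = w * p
w = 0.0
lr = 0.5

# Stage 2: supervised pretraining on the cheap labels. A few epochs suffice:
# the labels only need to place w inside the basin of attraction.
for _ in range(50):
    grad = np.mean(2.0 * (w * params - cheap_labels) * params)
    w -= lr * grad
w_pretrained = w  # close to the inexact label slope, ~1.8

# Stage 3: self-supervised refinement, minimizing f(x_hat(p); p) directly.
# d/dw mean (w*p - 2p)^2 = mean 2*(w*p - 2p)*p
for _ in range(200):
    grad = np.mean(2.0 * (w * params - true_w * params) * params)
    w -= lr * grad  # w converges to the true amortized solution, 2.0
```

In this sketch the cheap labels alone leave a systematic gap (the pretrained slope stays near 1.8), while the self-supervised stage closes it without ever seeing an exact label, mirroring the division of labor the abstract describes.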


Source: arXiv:2603.05495v1 (http://arxiv.org/abs/2603.05495v1)
PDF: https://arxiv.org/pdf/2603.05495v1

Submission: 3/6/2026
Subjects: Machine Learning; Data Science

