TY - UNPB
T1 - Adaptive Smoothing with Restarting for Stochastic Convex Optimization
AU - Rahimi Baghbadorani, R.
AU - Grammatico, S.
AU - Mohajerin Esfahani, P.
PY - 2025
Y1 - 2025
N2 - First-order methods are the most popular methods for solving nonsmooth convex optimization problems. A standard approach in the literature is to smooth the objective function up to an a priori precision and then deploy an accelerated first-order algorithm. Recently, there has been resurgent interest in using a restarting framework with first-order algorithms. In this paper, we propose a novel adaptive smoothing rule with restarting that enjoys provable convergence guarantees for both deterministic and stochastic nonsmooth convex minimization. We further extend the result to the quasi-Newton method and apply it within the restarting framework, with provable local convergence for certain classes of functions. Numerically, we benchmark the performance of our proposed algorithms against the state-of-the-art literature across different classes of nonsmooth convex minimization problems, including regression with the ℓ1-norm (the Lasso problem), sparse semidefinite programming (the MaxCut problem), and nuclear norm minimization.
AB - First-order methods are the most popular methods for solving nonsmooth convex optimization problems. A standard approach in the literature is to smooth the objective function up to an a priori precision and then deploy an accelerated first-order algorithm. Recently, there has been resurgent interest in using a restarting framework with first-order algorithms. In this paper, we propose a novel adaptive smoothing rule with restarting that enjoys provable convergence guarantees for both deterministic and stochastic nonsmooth convex minimization. We further extend the result to the quasi-Newton method and apply it within the restarting framework, with provable local convergence for certain classes of functions. Numerically, we benchmark the performance of our proposed algorithms against the state-of-the-art literature across different classes of nonsmooth convex minimization problems, including regression with the ℓ1-norm (the Lasso problem), sparse semidefinite programming (the MaxCut problem), and nuclear norm minimization.
M3 - Preprint
BT - Adaptive Smoothing with Restarting for Stochastic Convex Optimization
ER -