Abstract
We propose CDONE, a convex variant of the DONE algorithm. DONE is a derivative-free online optimization algorithm that uses surrogate modeling with noisy measurements to find a minimum of objective functions that are expensive to evaluate. Inspired by their success in deep learning, CDONE makes use of rectified linear units (ReLUs), together with a nonnegativity constraint to enforce convexity of the surrogate model. This yields a sparse, cheap-to-evaluate surrogate model of the unknown optimization objective that remains accurate and can be minimized with convex optimization algorithms. The CDONE algorithm is demonstrated on a toy example and on the problem of hyper-parameter optimization for a deep learning example on handwritten digit classification.
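The core idea in the abstract can be sketched in a few lines: a surrogate built as a nonnegative combination of ReLUs of affine functions is convex, so it can be fit to noisy measurements and then minimized globally. The sketch below is a minimal illustration under assumed details (fixed random ReLU features, nonnegative least squares for the weights, a batch fit rather than the online recursive updates of DONE); it is not the authors' implementation.

```python
import numpy as np
from scipy.optimize import nnls, minimize

rng = np.random.default_rng(0)

# Stand-in for an expensive objective, observed with noise.
def f(x):
    return (x - 0.3) ** 2 + 0.01 * rng.standard_normal()

# Random ReLU features relu(w_i * x + b_i). With weights c_i >= 0,
# the surrogate sum_i c_i * relu(w_i * x + b_i) is convex in x:
# each ReLU of an affine function is convex, and a nonnegative
# combination of convex functions is convex.
M = 50
w = rng.standard_normal(M)
b = rng.uniform(-2.0, 2.0, M)

def phi(x):
    return np.maximum(w * x + b, 0.0)

# Noisy measurements of f on [-1, 1].
X = np.linspace(-1.0, 1.0, 40)
y = np.array([f(x) for x in X])

# Nonnegative least squares enforces c >= 0, hence convexity;
# it also tends to produce a sparse coefficient vector.
A = np.array([phi(x) for x in X])
c, _ = nnls(A, y)

# The surrogate is convex, so any local minimizer is global.
def g(x):
    return float(c @ phi(x[0]))

res = minimize(g, x0=[0.0], bounds=[(-1.0, 1.0)])
print(res.x)  # approximate minimizer of the convex surrogate
```

Sparsity here comes for free: nonnegative least squares typically drives many coefficients exactly to zero, which is what makes the surrogate cheap to evaluate.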
Original language | English |
---|---|
Title of host publication | Proceedings 2017 27th IEEE International Workshop on Machine Learning for Signal Processing (MLSP 2017) |
Editors | N. Ueda, T. Matsui |
Place of Publication | Piscataway, NJ, USA |
Publisher | IEEE |
Number of pages | 6 |
ISBN (Electronic) | 9781509063413 |
DOIs | |
Publication status | Published - 2017 |
Event | 2017 IEEE International Workshop on Machine Learning for Signal Processing, MLSP 2017 - Tokyo, Japan (Duration: 25 Sep 2017 → 28 Sep 2017) |
Conference
Conference | 2017 IEEE International Workshop on Machine Learning for Signal Processing, MLSP 2017 |
---|---|
Country/Territory | Japan |
City | Tokyo |
Period | 25/09/17 → 28/09/17 |
Keywords
- Bayesian optimization
- Deep learning
- Derivative-free optimization
- Surrogate modeling