Optimization Online

Use of static surrogates in hyperparameter optimization

Dounia Lakhmiri(dounia.lakhmiri***at***polymtl.ca)
Sébastien Le Digabel(Sebastien.Le.Digabel***at***gerad.ca)

Abstract: Optimizing the hyperparameters and the architecture of a neural network is a lengthy but necessary phase in the development of any new application. This time-consuming process can benefit from strategies designed to quickly discard low-quality configurations and to focus the effort on more promising candidates. This work enhances HyperNOMAD, a library that adapts a direct-search derivative-free optimization algorithm to tune both the architecture and the training of a neural network simultaneously, by targeting two key steps of its execution: cheap approximations, in the form of static surrogates, are exploited to trigger the early stopping of the evaluation of a configuration and to rank pools of candidates. These additions are shown to reduce the resource consumption of HyperNOMAD without harming the quality of the proposed solutions.
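The two mechanisms can be illustrated with a minimal Python sketch. This is a hypothetical illustration of the idea only, assuming a cheap static surrogate such as the validation loss after a few epochs on a reduced dataset; the function names, signatures, and the `tolerance` threshold below are invented for the example and do not correspond to HyperNOMAD's actual interface.

    from typing import Callable, Dict, List, Optional

    def rank_by_surrogate(candidates: List[Dict],
                          surrogate: Callable[[Dict], float]) -> List[Dict]:
        # Order a pool of candidate configurations by a cheap static
        # surrogate (e.g., validation loss after a few epochs on a data
        # subset) so that the expensive true evaluation tries the most
        # promising points first.
        return sorted(candidates, key=surrogate)

    def evaluate_with_early_stop(config: Dict,
                                 train_one_epoch: Callable[[Dict, int], float],
                                 best_known: float,
                                 max_epochs: int = 100,
                                 tolerance: float = 0.10) -> Optional[float]:
        # Full (expensive) training, with intermediate losses acting as a
        # static surrogate of the final value: the evaluation is aborted
        # as soon as the configuration trails the incumbent by more than
        # `tolerance`, so low-quality configurations are discarded early.
        loss = float("inf")
        for epoch in range(max_epochs):
            loss = train_one_epoch(config, epoch)      # expensive step
            if loss > best_known * (1.0 + tolerance):  # unlikely to catch up
                return None                            # discard early
        return loss                                    # true objective value

Ranking matters because direct-search methods of the kind HyperNOMAD builds on evaluate trial points opportunistically: a good ordering lets the search find an improving point after fewer expensive evaluations, while early stopping caps the cost of the evaluations that do run to completion.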

Keywords: Hyperparameter optimization (HPO), Derivative-free optimization (DFO), Blackbox optimization (BBO).

Category 1: Nonlinear Optimization

Category 2: Optimization Software and Modeling Systems

Citation: Les Cahiers du GERAD G-2021-XX.

Download: [PDF]

Entry Submitted: 03/14/2021
Entry Accepted: 03/14/2021
Entry Last Modified: 03/14/2021
