Optimization Online


HyperNOMAD: Hyperparameter optimization of deep neural networks using mesh adaptive direct search

Dounia Lakhmiri (dounia.lakhmiri@gerad.ca)
Sébastien Le Digabel (sebastien.le.digabel@gerad.ca)
Christophe Tribes (christophe.tribes@gerad.ca)

Abstract: The performance of deep neural networks is highly sensitive to the choice of the hyperparameters that define the structure of the network and the learning process. When facing a new application, tuning a deep neural network is a tedious and time-consuming process that is often described as a "dark art", which motivates automating the calibration of these hyperparameters. Derivative-free optimization is a field that develops methods designed to optimize time-consuming functions without relying on derivatives. This work introduces the HyperNOMAD package, an extension of the NOMAD software that applies the MADS algorithm to simultaneously tune the hyperparameters responsible for both the architecture and the learning process of a deep neural network (DNN), and that allows for significant flexibility in the exploration of the search space by taking advantage of categorical variables. This new approach is tested on the MNIST and CIFAR-10 data sets and achieves results comparable to the current state of the art.
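The blackbox formulation described in the abstract can be illustrated with a minimal sketch: the expensive "function" to be minimized maps a vector of hyperparameters, including a categorical one, to a validation error, and a derivative-free solver searches over it. The hyperparameter names, the toy error surface, and the naive random-search driver below are all hypothetical stand-ins for illustration only; they are not HyperNOMAD's actual interface, and in practice the blackbox would train a real DNN and a MADS-based solver such as NOMAD would replace the random search.

```python
import random

# Hypothetical search space mixing architecture hyperparameters
# (number of layers, units per layer) with learning hyperparameters,
# including a categorical optimizer choice. Names are illustrative,
# not taken from HyperNOMAD.
SPACE = {
    "num_layers": [1, 2, 3, 4],
    "units": [16, 32, 64, 128],
    "learning_rate": [1e-1, 1e-2, 1e-3],
    "optimizer": ["sgd", "adam", "rmsprop"],  # categorical variable
}

def blackbox(hp):
    """Toy stand-in for training a DNN and returning its validation
    error. In practice this is the expensive, derivative-free function
    that a MADS-style solver is designed to minimize."""
    penalty = {"sgd": 0.05, "adam": 0.0, "rmsprop": 0.02}[hp["optimizer"]]
    return (abs(hp["num_layers"] - 3) * 0.1
            + abs(hp["units"] - 64) / 640
            + abs(hp["learning_rate"] - 1e-2) * 2
            + penalty)

def random_search(n_trials=200, seed=0):
    """Naive random search over the space; a placeholder for the
    mesh adaptive direct search performed by NOMAD/HyperNOMAD."""
    rng = random.Random(seed)
    best_hp, best_err = None, float("inf")
    for _ in range(n_trials):
        hp = {k: rng.choice(v) for k, v in SPACE.items()}
        err = blackbox(hp)
        if err < best_err:
            best_hp, best_err = hp, err
    return best_hp, best_err
```

The point of the sketch is the interface, not the search strategy: the solver only ever sees hyperparameter vectors in and error values out, with no gradients, which is exactly the setting where derivative-free methods apply.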

Keywords: Deep neural networks, neural architecture search, hyperparameter optimization, blackbox optimization, derivative-free optimization, mesh adaptive direct search, categorical variables.

Category 1: Nonlinear Optimization

Category 2: Optimization Software and Modeling Systems

Citation: Technical report, Les Cahiers du GERAD, 2019.

Download: [PDF]

Entry Submitted: 07/02/2019
Entry Accepted: 07/02/2019
Entry Last Modified: 07/02/2019

