Optimization Online


Fairer Benchmarking of Optimization Algorithms via Derivative Free Optimization

W Hare(warren.hare***at***ubc.ca)
Y Wang(ywang767***at***uwo.ca)

Abstract: Research in optimization algorithm design is often accompanied by benchmarking a new algorithm. Some benchmarking is done as a proof-of-concept, by demonstrating that the new algorithm works on a small number of difficult test problems. Alternatively, some benchmarking is done in order to demonstrate that the new algorithm in some way outperforms previous methods. In this circumstance it is important to note that algorithm performance can depend heavily on the selection of a number of user-input parameters. In this paper we begin by demonstrating that if algorithms are compared using arbitrary parameter selections, the results do not just compare the algorithms, but also compare the authors' ability to select good parameters. We further present a novel technique for generating and providing results using each algorithm's "optimal parameter selection". These optimal parameter selections can be computed using modern derivative-free optimization methods, and generate a framework for fairer benchmarking between optimization algorithms.
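The idea sketched in the abstract can be illustrated with a toy example: tune an algorithm's parameter with a simple derivative-free search before reporting its performance, so the benchmark reflects the method at its best settings rather than the author's parameter choices. This is a minimal hypothetical sketch, not the paper's method; all names (`pattern_search`, `gd_error`) and the choice of a 1-D coordinate/pattern search are illustrative assumptions.

```python
# Hypothetical sketch of "optimal parameter selection" for benchmarking:
# tune an algorithm's parameter via derivative-free optimization (DFO),
# then report performance at the tuned setting. Names are illustrative,
# not taken from the paper.

def pattern_search(f, x0, step=0.5, tol=1e-4, max_iter=200):
    """Minimal derivative-free 1-D pattern search: poll x - step and
    x + step, move to any improving point, else halve the step."""
    x, fx = x0, f(x0)
    for _ in range(max_iter):
        improved = False
        for cand in (x - step, x + step):
            fc = f(cand)
            if fc < fx:
                x, fx, improved = cand, fc, True
        if not improved:
            step *= 0.5          # shrink the poll radius
            if step < tol:
                break
    return x, fx

def gd_error(step_size, iters=50):
    """Performance proxy for one 'algorithm': final error of gradient
    descent on f(x) = x^2 after a fixed iteration budget, viewed as a
    function of its step-size parameter."""
    x = 1.0
    for _ in range(iters):
        x -= step_size * 2.0 * x  # gradient of x^2 is 2x
    return abs(x)

# Tune the step size by DFO, then benchmark at the tuned setting.
best_step, best_err = pattern_search(gd_error, x0=0.1)
print(f"tuned step size ~ {best_step:.4f}, final error ~ {best_err:.2e}")
```

Repeating the same tuning step for each competing algorithm before comparing them is the kind of "fairer benchmarking" the abstract argues for: every method is evaluated at (an approximation of) its own optimal parameter selection.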

Keywords: benchmarking, optimal parameter selection, Derivative Free Optimization

Category 1: Applications -- OR and Management Sciences (Other)

Category 2: Applications -- Science and Engineering (Other)

Category 3: Global Optimization (Applications)

Citation: submitted

Download: [PDF]

Entry Submitted: 10/13/2010
Entry Accepted: 10/13/2010
Entry Last Modified: 10/13/2010
