A Multi-Layer Line Search Method to Improve the Initialization of Optimization Algorithms

We introduce a novel metaheuristic methodology to improve the initialization of a given deterministic or stochastic optimization algorithm. Our objective is to improve the performance of the considered algorithm, called the core optimization algorithm, by reducing its number of cost-function evaluations, increasing its success rate, and improving the precision of its results. In our approach, the core optimization is treated as a sub-optimization problem for a multi-layer line search method. The approach is presented and implemented for several particular core optimization algorithms: Steepest Descent, Heavy-Ball, Genetic Algorithm, Differential Evolution and Controlled Random Search. We validate our methodology on a set of low- and high-dimensional benchmark problems (i.e., problems of dimension between 2 and 1000). The results are compared to those obtained with the core optimization algorithms alone and with two additional global optimization methods (Direct Tabu Search and Continuous Greedy Randomized Adaptive Search); the latter two also aim to improve the initial condition of the core algorithms. The numerical results indicate that our approach improves the performance of the core optimization algorithms and yields algorithms that are more efficient than the other optimization methods studied here. A Matlab optimization package called "Global Optimization Platform" (GOP), implementing the algorithms presented here, has been developed and can be downloaded at: http://www.mat.ucm.es/momat/software.htm
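To illustrate the general idea described in the abstract (not the paper's actual algorithm, which is implemented in the Matlab GOP package), the following Python sketch shows an outer line-search procedure whose objective for a candidate initial point is the value returned by a short run of a core optimizer started from that point. All names (run_core, line_search_layer, multi_layer_initialization), the choice of BFGS as a stand-in core algorithm, the Rosenbrock test cost, and the step grid are assumptions made only for illustration.

```python
import numpy as np
from scipy.optimize import minimize

def cost(x):
    # Rosenbrock function, used here only as a stand-in benchmark cost.
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

def run_core(x0):
    # A few iterations of BFGS stand in for the "core optimization algorithm".
    res = minimize(cost, x0, method="BFGS", options={"maxiter": 20})
    return res.fun, res.x

def line_search_layer(x0, direction, steps):
    # One line-search layer: sample candidate initial points along a search
    # direction and score each candidate by the result of a core run from it.
    best_x0 = x0
    best_val, _ = run_core(x0)
    for t in steps:
        cand = x0 + t * direction
        val, _ = run_core(cand)
        if val < best_val:
            best_val, best_x0 = val, cand
    return best_x0, best_val

def multi_layer_initialization(x0, n_layers=3, seed=None):
    # Stack several line-search layers, each along a fresh random direction,
    # to progressively refine the initial condition handed to the core algorithm.
    rng = np.random.default_rng(seed)
    steps = np.linspace(-2.0, 2.0, 9)
    for _ in range(n_layers):
        direction = rng.standard_normal(x0.shape)
        direction /= np.linalg.norm(direction)
        x0, _ = line_search_layer(x0, direction, steps)
    return x0

# Usage: refine a random initial guess, then run the core algorithm from it.
x_init = multi_layer_initialization(np.random.uniform(-5.0, 5.0, size=10))
final_val, final_x = run_core(x_init)
```

In this sketch each candidate initial point is evaluated by actually running the (truncated) core optimizer, which is what makes the core optimization a sub-problem of the outer search; the paper's method differs in its specific layering and search rules.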
