Optimization Online


Globally Solving the Trust Region Subproblem Using Simple First-Order Methods

Amir Beck(becka***at***tauex.tau.ac.il)
Yakov Vaisbourd(yakov.vaisbourd***at***gmail.com)

Abstract: We consider the trust region subproblem, namely the minimization of a quadratic, not necessarily convex, function over the Euclidean ball. Based on the well-known second-order necessary and sufficient optimality conditions for this problem, we present two sufficient optimality conditions defined solely in terms of the primal variables. Each of these conditions corresponds to one of the two possible scenarios that occur in this problem, commonly referred to in the literature as the presence or absence of the "hard case". We consider a family of first-order methods, which includes the projected and conditional gradient methods, and show that any method in this family produces a sequence guaranteed to converge to a stationary point of the trust region subproblem. Based on this result and the established sufficient optimality conditions, we show that convergence to an optimal solution can also be guaranteed as long as the method is properly initialized. In particular, if the method is initialized with the zero vector and then reinitialized with a randomly generated feasible point, the better of the two resulting vectors is an optimal solution of the problem with probability 1.
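The initialization scheme described in the abstract can be sketched with a plain projected gradient method: run once from the zero vector, once from a random feasible point, and keep the better result. This is only an illustrative sketch, not the authors' implementation; the step size 1/L (with L the spectral norm of A), the iteration count, and the test problem are all choices made here for demonstration.

```python
import numpy as np

def proj_ball(x):
    """Project a vector onto the unit Euclidean ball."""
    n = np.linalg.norm(x)
    return x if n <= 1.0 else x / n

def projected_gradient_trs(A, b, x0, iters=2000):
    """Projected gradient for min 0.5 x'Ax + b'x s.t. ||x|| <= 1.
    Uses the constant step size 1/L, where L = ||A||_2 bounds the
    Lipschitz constant of the gradient A x + b."""
    L = np.linalg.norm(A, 2)
    x = proj_ball(np.asarray(x0, dtype=float))
    for _ in range(iters):
        x = proj_ball(x - (1.0 / L) * (A @ x + b))
    return x

def f(A, b, x):
    """Objective value of the trust region subproblem."""
    return 0.5 * x @ A @ x + b @ x

# A small nonconvex instance (indefinite A), chosen only for illustration.
rng = np.random.default_rng(0)
n = 5
M = rng.standard_normal((n, n))
A = (M + M.T) / 2 - 2.0 * np.eye(n)  # symmetric with negative eigenvalues
b = rng.standard_normal(n)

# Run from the zero vector and from a random feasible point; keep the better.
x_zero = projected_gradient_trs(A, b, np.zeros(n))
x_rand = projected_gradient_trs(A, b, proj_ball(rng.standard_normal(n)))
x_best = min([x_zero, x_rand], key=lambda x: f(A, b, x))
```

Each run converges to a stationary point; per the abstract's result, the better of the two is optimal with probability 1 under the random reinitialization.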

Keywords: trust region subproblem, first-order methods, conditional gradient, projected gradient, nonconvex optimization

Category 1: Nonlinear Optimization (Constrained Nonlinear Optimization )


Download: [PDF]

Entry Submitted: 10/02/2017
Entry Accepted: 10/02/2017
Entry Last Modified: 10/02/2017
