Optimization Online

Simplified Versions of the Conditional Gradient Method

Igor Konnov (konn-igor***at***ya.ru)

Abstract: We suggest simple modifications of the conditional gradient method for smooth optimization problems that preserve its basic convergence properties while substantially reducing the implementation cost of each iteration. Specifically, we propose a step-size procedure that requires no line search, together with an inexact solution of the direction-finding subproblem. Preliminary computational tests confirm the efficiency of the proposed modifications.
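To illustrate the two ingredients mentioned in the abstract, the sketch below shows a generic conditional gradient (Frank-Wolfe) iteration that avoids line search and solves the direction-finding subproblem only approximately. It is not the paper's specific adaptive step-size rule: the classical open-loop step 2/(k+2), the simplex feasible set, the subset-sampling oracle, and all function/parameter names are illustrative assumptions.

```python
# Minimal sketch (assumptions, not the paper's exact procedure) of a
# conditional gradient / Frank-Wolfe method that
#   (a) uses a pre-set step size gamma_k = 2/(k+2), i.e. no line search, and
#   (b) solves the direction-finding (linear) subproblem inexactly.

import numpy as np

def inexact_lmo_simplex(grad, rng, sample_size=None):
    """Approximate linear minimization oracle over the unit simplex.

    The exact oracle returns the vertex e_i with i = argmin_i grad[i].
    Here only a random subset of coordinates is inspected, which mimics
    an inexact solution of the direction-finding subproblem.
    """
    n = grad.shape[0]
    if sample_size is None or sample_size >= n:
        i = int(np.argmin(grad))
    else:
        idx = rng.choice(n, size=sample_size, replace=False)
        i = int(idx[np.argmin(grad[idx])])
    vertex = np.zeros(n)
    vertex[i] = 1.0
    return vertex

def frank_wolfe(grad_f, x0, iters=200, lmo_sample=None, seed=0):
    """Conditional gradient iteration with the open-loop step 2/(k+2)."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for k in range(iters):
        g = grad_f(x)
        s = inexact_lmo_simplex(g, rng, lmo_sample)  # inexact direction finding
        gamma = 2.0 / (k + 2.0)                      # step size without line search
        x = (1.0 - gamma) * x + gamma * s            # convex combination stays feasible
    return x

if __name__ == "__main__":
    # Illustrative smooth objective f(x) = 0.5 * ||A x - b||^2 over the simplex.
    rng = np.random.default_rng(1)
    A = rng.standard_normal((30, 50))
    b = rng.standard_normal(30)
    grad_f = lambda x: A.T @ (A @ x - b)
    x0 = np.full(50, 1.0 / 50)
    x = frank_wolfe(grad_f, x0, iters=500, lmo_sample=20)
    print("objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2)
```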

Keywords: Optimization problems; conditional gradient method; simple adaptive step-size

Category 1: Convex and Nonsmooth Optimization (Convex Optimization )

Category 2: Nonlinear Optimization (Constrained Nonlinear Optimization )

Citation: http://arxiv.org/abs/1801.05251

Download: [PDF]

Entry Submitted: 01/17/2018
Entry Accepted: 01/17/2018
Entry Last Modified: 01/17/2018
