Optimization Online





 

FAST FIRST-ORDER METHODS FOR COMPOSITE CONVEX OPTIMIZATION WITH BACKTRACKING

Donald Goldfarb (goldfarb***at***columbia.edu)
Katya Scheinberg (katyas***at***lehigh.edu)
Xi Bai (xib210***at***lehigh.edu)

Abstract: We propose new versions of accelerated first-order methods for convex composite optimization, in which the prox parameter is allowed to increase from one iteration to the next. In particular, we show that a full backtracking strategy can be used within the FISTA \cite{Beck-Teboulle-2009} and FALM \cite{Goldfarb-Ma-Scheinberg-2010} algorithms while preserving their worst-case iteration complexities of $O(\sqrt{L(f)/\epsilon})$. In the original versions of FISTA and FALM, the prox parameter value on each iteration has to be bounded from above by its value on prior iterations, and the complexity of the algorithm then depends on the smallest prox parameter value encountered. Using full backtracking in the framework of accelerated first-order and alternating linearization methods allows for better complexity estimates that depend on the ``average'' prox parameter value. Moreover, we show that for compressed sensing problems and the Lasso, the additional cost of the new backtracking strategy is negligible compared to the cost of the original FISTA iteration. Our computational results show the benefit of the new algorithm.
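
For illustration, the following is a minimal sketch, not the paper's exact algorithm, of a FISTA-type iteration for the Lasso in which the Lipschitz estimate L (the inverse of the prox parameter) may be decreased at the start of each iteration and is then increased by backtracking until the usual quadratic upper bound holds. The matrix A, vector b, regularization weight lam, and the parameters L0 and eta are placeholders, and the standard FISTA momentum update is used rather than the modified acceleration parameters analyzed in the paper.

import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fista_full_backtracking(A, b, lam, L0=1.0, eta=2.0, max_iter=500):
    # Sketch of a FISTA-type method for min_x 0.5*||Ax - b||^2 + lam*||x||_1
    # in which the prox parameter 1/L may increase between iterations.
    f = lambda z: 0.5 * np.linalg.norm(A @ z - b) ** 2
    grad = lambda z: A.T @ (A @ z - b)
    x = np.zeros(A.shape[1])
    y = x.copy()
    t, L = 1.0, L0
    for _ in range(max_iter):
        # Full backtracking: first try a smaller L (a larger prox parameter) ...
        L = max(L / eta, 1e-12)
        fy, gy = f(y), grad(y)
        while True:
            x_new = soft_threshold(y - gy / L, lam / L)
            d = x_new - y
            # ... then increase L until the quadratic upper bound
            # f(x_new) <= f(y) + <grad f(y), d> + (L/2)*||d||^2 holds.
            if f(x_new) <= fy + gy @ d + 0.5 * L * (d @ d):
                break
            L *= eta
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t ** 2))
        y = x_new + ((t - 1.0) / t_next) * (x_new - x)
        x, t = x_new, t_next
    return x

For example, on a random instance such as fista_full_backtracking(np.random.randn(50, 200), np.random.randn(50), 0.1), the inner loop terminates as soon as L reaches the local curvature of f, so the extra cost of retrying a smaller L each iteration is a few additional function evaluations.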

Keywords: Convex programming, first-order method, alternating linearization method, iterative shrinkage, optimal gradient method, line search

Category 1: Convex and Nonsmooth Optimization (Nonsmooth Optimization )


Entry Submitted: 04/21/2011
Entry Accepted: 04/21/2011
Entry Last Modified: 10/21/2013


