Optimization Online

"Active-set complexity" of proximal gradient: How long does it take to find the sparsity pattern?

Julie Nutini (jnutini***at***cs.ubc.ca)
Mark Schmidt (schmidtm***at***cs.ubc.ca)
Warren Hare (warren.hare***at***ubc.ca)

Abstract: Proximal gradient methods have been found to be highly effective for solving minimization problems with non-negative constraints or L1-regularization. Under suitable nondegeneracy conditions, it is known that these algorithms identify the optimal sparsity pattern for these types of problems in a finite number of iterations. However, it is not known how many iterations this may take. We introduce the notion of the "active-set complexity", which in these cases is the number of iterations before an algorithm is guaranteed to have identified the final sparsity pattern. We further give a bound on the active-set complexity of proximal gradient methods in the common case of minimizing the sum of a strongly-convex smooth function and a separable convex non-smooth function.
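The setting described in the abstract can be made concrete with the standard proximal gradient update x^{k+1} = prox_{alpha*g}(x^k - alpha*grad f(x^k)); for L1-regularization the proximal operator is the separable soft-threshold, so each iterate is exactly sparse and its support (the sparsity pattern) can be monitored directly. The following Python sketch is illustrative only and is not code from the paper: it uses synthetic data, an assumed step size 1/L, and an assumed regularization strength, running basic proximal gradient (ISTA) on an L1-regularized least-squares problem and reporting the iteration after which the sparsity pattern stops changing, i.e. an empirical active-set identification time.

import numpy as np

# Minimal proximal gradient (ISTA) sketch for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
# Synthetic data; with n > d and Gaussian A, the smooth term is strongly convex,
# matching the setting in the abstract. All constants here are illustrative assumptions.
rng = np.random.default_rng(0)
n, d = 100, 20
A = rng.standard_normal((n, d))
x_true = np.zeros(d)
x_true[:3] = [2.0, -1.5, 1.0]              # sparse ground truth
b = A @ x_true + 0.01 * rng.standard_normal(n)
lam = 0.5                                   # L1 regularization strength (assumed)

L = np.linalg.norm(A, 2) ** 2               # Lipschitz constant of the gradient
step = 1.0 / L

def soft_threshold(z, t):
    # Proximal operator of t*||.||_1 (separable soft-thresholding).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

x = np.zeros(d)
support = np.flatnonzero(x)
last_change = 0
for k in range(1, 2001):
    grad = A.T @ (A @ x - b)                # gradient of the smooth term
    x = soft_threshold(x - step * grad, step * lam)
    new_support = np.flatnonzero(x)         # current sparsity pattern
    if not np.array_equal(new_support, support):
        support, last_change = new_support, k

print("final support:", support)
print("support last changed at iteration:", last_change)

The iteration printed as "last changed" is exactly the quantity the paper's active-set complexity bounds: the number of iterations before the final sparsity pattern is reached and never abandoned.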

Keywords: proximal gradient methods, active-set identification, active-set complexity

Category 1: Convex and Nonsmooth Optimization

Category 2: Convex and Nonsmooth Optimization (Convex Optimization)

Category 3: Convex and Nonsmooth Optimization (Nonsmooth Optimization)

Citation: Neural Information Processing Systems Optimization Workshop 2017

Download: [PDF]

Entry Submitted: 12/10/2017
Entry Accepted: 12/10/2017
Entry Last Modified: 12/10/2017
