"Active-set complexity" of proximal gradient: How long does it take to find the sparsity pattern?
Abstract: Proximal gradient methods have been found to be highly effective for solving minimization problems with non-negativity constraints or L1-regularization. Under suitable nondegeneracy conditions, it is known that these algorithms identify the optimal sparsity pattern for such problems in a finite number of iterations. However, it is not known how many iterations this may take. We introduce the notion of "active-set complexity", which in these cases is the number of iterations before an algorithm is guaranteed to have identified the final sparsity pattern. We further give a bound on the active-set complexity of proximal gradient methods in the common case of minimizing the sum of a strongly convex smooth function and a separable convex non-smooth function.
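The identification behavior described in the abstract can be observed numerically. Below is a minimal sketch (not from the paper) of the proximal gradient method applied to an illustrative L1-regularized least-squares problem, recording the iteration after which the support (sparsity pattern) of the iterate stops changing; the problem data, regularization weight, and step size are all assumptions chosen for demonstration.

```python
import numpy as np

# Illustrative problem: min 0.5*||Ax - b||^2 + lam*||x||_1
# (data and lam are assumptions, not from the paper).
rng = np.random.default_rng(0)
n, d = 50, 10
A = rng.standard_normal((n, d))
x_true = np.zeros(d)
x_true[:3] = [1.0, -2.0, 0.5]  # sparse ground truth
b = A @ x_true
lam = 0.5

# Step size 1/L, where L = ||A||_2^2 is the Lipschitz
# constant of the gradient of the smooth part.
L = np.linalg.norm(A, 2) ** 2
step = 1.0 / L

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = np.zeros(d)
supports = []
for _ in range(500):
    grad = A.T @ (A @ x - b)  # gradient of the smooth part
    x = soft_threshold(x - step * grad, step * lam)
    supports.append(tuple(np.flatnonzero(x)))

final_support = supports[-1]
# First iteration from which the support never changes again --
# an empirical analogue of the "active-set complexity".
identify_iter = next(k for k in range(len(supports))
                     if all(s == final_support for s in supports[k:]))
print("final support:", final_support)
print("support identified at iteration:", identify_iter)
```

In typical runs the support settles long before the iterates converge, which is exactly the finite-identification phenomenon whose iteration count the paper bounds.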
Keywords: proximal gradient methods, active-set identification, active-set complexity
Category 1: Convex and Nonsmooth Optimization
Category 2: Convex and Nonsmooth Optimization (Convex Optimization)
Category 3: Convex and Nonsmooth Optimization (Nonsmooth Optimization)
Citation: Neural Information Processing Systems Optimization Workshop 2017
Entry Submitted: 12/10/2017