Iterative Reweighted Linear Least Squares for Exact Penalty Subproblems on Product Sets
James V. Burke (jvburke@uw.edu)
Abstract: We present two matrix-free methods for solving exact penalty subproblems on product sets that arise when solving large-scale optimization problems. The first approach is a novel iterative reweighting algorithm (IRWA), which iteratively minimizes quadratic models of relaxed subproblems while automatically updating a relaxation vector. The second approach is based on alternating direction augmented Lagrangian (ADAL) technology applied to our setting. The main computational cost of each algorithm is the repeated minimization of convex quadratic functions, which can be performed matrix-free. We prove that both algorithms are globally convergent under loose assumptions and that each requires at most O(1/ε²) iterations to reach ε-optimality of the objective function. Numerical experiments demonstrate the ability of both algorithms to efficiently find inexact solutions. Moreover, in certain cases, these experiments indicate that IRWA can be significantly more efficient than ADAL.
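To illustrate the general idea behind iterative reweighting, the following is a minimal sketch of classical iteratively reweighted least squares (IRLS) applied to an L1 regression problem, min_x ||Ax − b||_1. This is not the paper's IRWA method; the fixed smoothing parameter `eps` here merely plays a role loosely analogous to the relaxation vector that IRWA updates automatically, and the direct solve stands in for the matrix-free quadratic minimizations discussed in the abstract.

```python
import numpy as np

def irls_l1(A, b, iters=50, eps=1e-6):
    """Iteratively reweighted least squares for min_x ||Ax - b||_1.

    Each iteration solves a weighted least-squares subproblem whose
    weights are derived from the current residuals; eps is a fixed
    smoothing parameter that keeps the weights bounded.
    """
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(iters):
        r = A @ x - b
        w = 1.0 / np.sqrt(r**2 + eps**2)   # smoothed reweighting
        Aw = A * w[:, None]                # row-scaled copy of A
        # Normal equations of the weighted subproblem: A^T W A x = A^T W b
        x = np.linalg.solve(A.T @ Aw, A.T @ (w * b))
    return x
```

Because each iterate solves only a convex quadratic model, the inner solve could be replaced by a matrix-free iterative method such as conjugate gradients for large-scale problems.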
Keywords: convex composite optimization, nonlinear optimization, exact penalty methods, iterative reweighting methods, augmented Lagrangian methods, alternating direction methods
Category 1: Nonlinear Optimization
Category 2: Nonlinear Optimization (Constrained Nonlinear Optimization)
Category 3: Convex and Nonsmooth Optimization (Convex Optimization)
Citation: J. V. Burke, F. E. Curtis, H. Wang, and J. Wang, "Iterative Reweighted Linear Least Squares for Exact Penalty Subproblems on Product Sets," SIAM Journal on Optimization, vol. 25, iss. 1, pp. 261-294, 2015.
Entry Submitted: 12/20/2013