Robust Block Coordinate Descent

Kimon Fountoulakis (K.Fountoulakis***at***sms.ed.ac.uk)
Rachael Tappenden (r.tappenden***at***ed.ac.uk)

Abstract: In this paper we present a novel randomized block coordinate descent method for the minimization of a convex composite objective function. The method uses (approximate) partial second-order (curvature) information, so that its performance is more robust when applied to highly nonseparable or ill-conditioned problems. We call the method Robust Coordinate Descent (RCD). At each iteration of RCD, a block of coordinates is sampled randomly, a quadratic model is formed about that block, and the model is minimized approximately (inexactly) to determine the search direction. An inexpensive line search is then employed to ensure a monotonic decrease in the objective function and the acceptance of large step sizes. We prove global convergence of the RCD algorithm, and we also present several results on the local convergence of RCD for strongly convex functions. Finally, we present numerical results on large-scale problems to demonstrate the practical performance of the method.
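
The iteration described in the abstract (sample a block, build a quadratic model from partial curvature information, solve the model approximately, then line-search) can be sketched as below. This is a minimal illustration under simplifying assumptions, not the authors' implementation: the objective is the smooth least-squares function f(x) = 0.5*||Ax - b||^2 with no nonsmooth term, the block model is solved exactly rather than inexactly, and the function name rcd_minimize, the damping constant, and all parameter defaults are hypothetical.

import numpy as np

def rcd_minimize(A, b, num_iters=500, block_size=10, seed=0,
                 c1=1e-4, backtrack=0.5):
    # Hypothetical sketch of the RCD iteration for the smooth
    # least-squares objective f(x) = 0.5*||Ax - b||^2. The paper
    # treats a general convex composite objective and an inexact
    # model solve; here the block model is solved exactly.
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    x = np.zeros(n)

    def f(x):
        r = A @ x - b
        return 0.5 * (r @ r)

    for _ in range(num_iters):
        # 1. Sample a block of coordinates uniformly at random.
        S = rng.choice(n, size=min(block_size, n), replace=False)

        # 2. Form a quadratic model over the block using partial
        #    second-order (curvature) information: the block Hessian.
        r = A @ x - b
        g_S = A[:, S].T @ r                 # block gradient
        H_S = A[:, S].T @ A[:, S]           # block Hessian
        H_S += 1e-8 * np.eye(len(S))        # damping (assumed) for robustness

        # 3. Minimize the block model to get the search direction.
        #    (RCD allows an approximate/inexact solve here.)
        d_S = np.linalg.solve(H_S, -g_S)

        # 4. Inexpensive backtracking line search: accept the largest
        #    step giving sufficient (monotonic) decrease in f.
        alpha, f_x, slope = 1.0, f(x), g_S @ d_S
        x_trial = x.copy()
        while True:
            x_trial[S] = x[S] + alpha * d_S
            if f(x_trial) <= f_x + c1 * alpha * slope or alpha < 1e-12:
                break
            alpha *= backtrack
        x = x_trial
    return x

As a usage example, with A of size 200 x 50 and b = A @ x_star for some reference point x_star, a few hundred such block steps should drive f toward zero. The small damping term added to the block Hessian is one simple way to keep the model solve well-posed when a sampled block happens to be ill-conditioned, which is the regime the abstract targets.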

Keywords: large scale optimization, second-order methods, curvature information, block coordinate descent, nonsmooth problems

Category 1: Nonlinear Optimization

Category 2: Convex and Nonsmooth Optimization

Category 3: Nonlinear Optimization (Unconstrained Optimization)

Citation: Technical Report ERGO 14-010, University of Edinburgh, School of Mathematics, James Clerk Maxwell Building, King's Buildings, EH9 3JZ Edinburgh, UK, July 2014

Download: [PDF]

Entry Submitted: 07/27/2014
Entry Accepted: 07/27/2014
Entry Last Modified: 05/08/2015
