Optimization Online


A Flexible Coordinate Descent Method for Big Data Applications

Kimon Fountoulakis (kfount***at***berkeley.edu)
Rachael Tappenden (rtappen1***at***jhu.edu)

Abstract: We present a novel randomized block coordinate descent method for the minimization of a convex composite objective function. The method uses approximate partial second-order (curvature) information, so that its performance is more robust when applied to highly nonseparable or ill-conditioned problems. We call the method Flexible Coordinate Descent (FCD). At each iteration of FCD, a block of coordinates is sampled randomly, a quadratic model is formed for that block, and the model is minimized inexactly to determine the search direction. An inexpensive line search is then employed to ensure a monotonic decrease in the objective function and to allow the acceptance of large step sizes. We present several high-probability iteration complexity results showing that convergence of FCD is theoretically guaranteed. Finally, we present numerical results on large-scale problems to demonstrate the practical performance of the method.
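The iteration described in the abstract — randomly sample a block, build a quadratic model from block curvature information, minimize it to get a search direction, then line search — can be sketched as follows. This is a minimal illustration, not the authors' FCD algorithm: it assumes a smooth convex quadratic objective (no nonsmooth term), solves the block model exactly rather than inexactly, and uses a simple backtracking line search; all names and parameters are hypothetical.

```python
import numpy as np

def block_coordinate_descent(A, b, n_iters=200, block_size=2, seed=0):
    """Illustrative randomized block coordinate descent on
    f(x) = 0.5 x^T A x - b^T x, with A symmetric positive definite.

    Each iteration: sample a random block S, form the quadratic model
    restricted to S using the block of the Hessian A[S, S], minimize it
    for a search direction, then backtrack so f decreases monotonically.
    """
    rng = np.random.default_rng(seed)
    n = len(b)
    x = np.zeros(n)
    f = lambda x: 0.5 * x @ A @ x - b @ x
    for _ in range(n_iters):
        S = rng.choice(n, size=block_size, replace=False)  # sample a block
        grad_S = (A @ x - b)[S]                # block gradient
        H_SS = A[np.ix_(S, S)]                 # block curvature information
        d = -np.linalg.solve(H_SS, grad_S)     # minimizer of the block model
        t, fx = 1.0, f(x)
        while True:                            # inexpensive backtracking line search
            x_new = x.copy()
            x_new[S] += t * d
            if f(x_new) <= fx or t < 1e-10:    # accept on monotonic decrease
                break
            t *= 0.5
        x = x_new
    return x
```

In this smooth quadratic setting the unit step is always accepted, so the line search is trivial; its role in FCD is to guarantee monotonic descent when the model is minimized only inexactly and a nonsmooth term is present.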

Keywords: large scale optimization; second-order methods; curvature information; block coordinate descent; nonsmooth problems; iteration complexity

Category 1: Convex and Nonsmooth Optimization

Category 2: Nonlinear Optimization


Download: [PDF]

Entry Submitted: 07/13/2015
Entry Accepted: 07/14/2015
Entry Last Modified: 01/22/2016
