Optimization Online


Analysis and Implementation of an Asynchronous Optimization Algorithm for the Parameter Server

Arda Aytekin(aytekin***at***kth.se)
Hamid Reza Feyzmahdavian(hamidrez***at***kth.se)
Mikael Johansson(mikaelj***at***kth.se)

Abstract: This paper presents an asynchronous incremental aggregated gradient algorithm and its implementation in a parameter server framework for solving regularized optimization problems. The algorithm can handle both general convex (possibly non-smooth) regularizers and general convex constraints. When the empirical data loss is strongly convex, we establish a linear convergence rate, give explicit expressions for step-size choices that guarantee convergence to the optimum, and bound the associated convergence factors. The expressions have an explicit dependence on the degree of asynchrony and recover classical results under synchronous operation. Simulations and implementations on commercial compute clouds validate our findings.
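To illustrate the kind of method the abstract describes, the following is a minimal serial sketch of a proximal incremental aggregated gradient update, not the paper's parameter-server implementation. It simulates asynchrony by refreshing only one worker's gradient per step, so the server aggregates stale gradients; the ℓ1 regularizer, the least-squares losses, and all function names (`piag_l1`, `soft_threshold`) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (elementwise shrinkage).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def piag_l1(A_blocks, b_blocks, lam, gamma, iters, rng):
    """Serial simulation of a PIAG-style method for
    min_x (1/N) * sum_i 0.5*||A_i x - b_i||^2 + lam*||x||_1.
    Each step refreshes one worker's gradient, so the
    aggregate mixes gradients evaluated at stale iterates."""
    N = len(A_blocks)
    x = np.zeros(A_blocks[0].shape[1])
    # Gradient table: one (possibly stale) gradient per worker.
    grads = [A.T @ (A @ x - b) for A, b in zip(A_blocks, b_blocks)]
    agg = sum(grads)
    for _ in range(iters):
        i = rng.integers(N)  # worker i reports a fresh gradient
        g_new = A_blocks[i].T @ (A_blocks[i] @ x - b_blocks[i])
        agg += g_new - grads[i]
        grads[i] = g_new
        # Proximal step on the aggregated (partly stale) gradient.
        x = soft_threshold(x - gamma * agg / N, gamma * lam)
    return x
```

Under strong convexity (here, `A` with full column rank), such iterations converge linearly for a sufficiently small step size `gamma`, with the admissible step shrinking as the maximum gradient staleness grows, which is the kind of explicit asynchrony dependence the abstract refers to.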

Keywords: asynchronous, proximal, incremental, aggregated gradient, linear convergence

Category 1: Convex and Nonsmooth Optimization

Category 2: Nonlinear Optimization

Category 3: Optimization Software and Modeling Systems (Parallel Algorithms)

Citation: Department of Automatic Control, KTH Royal Institute of Technology, 20161018-AFJ

Download: [PDF]

Entry Submitted: 10/18/2016
Entry Accepted: 10/18/2016
Entry Last Modified: 10/18/2016
