Optimization Online


Accelerated first-order methods for large-scale convex minimization

Masoud Ahookhosh (masoud.ahookhosh***at***univie.ac.at)

Abstract: This paper discusses several (sub)gradient methods that attain the optimal complexity for smooth problems with Lipschitz continuous gradients, nonsmooth problems with bounded variation of subgradients, and weakly smooth problems with Hölder continuous gradients. The proposed schemes are optimal for smooth strongly convex problems with Lipschitz continuous gradients and optimal up to a logarithmic factor for nonsmooth problems with bounded variation of subgradients. More specifically, we propose two estimation sequences for the objective and give two iterative schemes for each of them. In both cases, the first scheme requires the smoothness parameter and the Hölder constant, while the second is parameter-free (except for the strong convexity parameter, which we set to zero if it is unavailable) at the price of applying a nonmonotone backtracking line search. A complexity analysis is given for all the proposed schemes. Numerical results for some applications in sparse optimization and machine learning are reported, confirming the theoretical foundations.
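To illustrate the kind of method the abstract describes, the following is a minimal sketch of a standard Nesterov-type accelerated gradient method with a backtracking estimate of the Lipschitz constant, applied to a least-squares problem. This is a generic textbook scheme under the usual smooth-convex assumptions, not the paper's estimation-sequence methods; all names and parameter choices here are illustrative.

```python
import numpy as np

def accelerated_gradient(f, grad, x0, L0=1.0, max_iter=1000, tol=1e-6):
    """Generic Nesterov-type accelerated gradient sketch with backtracking
    on the Lipschitz estimate L (illustrative; not the paper's schemes)."""
    x = y = np.asarray(x0, dtype=float)
    t, L = 1.0, L0
    for _ in range(max_iter):
        g = grad(y)
        if np.linalg.norm(g) < tol:
            break
        # Backtracking: double L until the quadratic upper bound holds
        # (the small slack guards against floating-point stagnation).
        while True:
            x_new = y - g / L
            d = x_new - y
            if f(x_new) <= f(y) + g @ d + 0.5 * L * (d @ d) + 1e-12:
                break
            L *= 2.0
        # Standard momentum update of the extrapolation point.
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

# Example: smooth convex objective f(x) = 0.5 * ||A x - b||^2.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
b = rng.standard_normal(30)
f = lambda x: 0.5 * np.linalg.norm(A @ x - b) ** 2
grad = lambda x: A.T @ (A @ x - b)
x_star = accelerated_gradient(f, grad, np.zeros(10))
```

The backtracking loop plays the role of the line search mentioned in the abstract: it lets the method run without knowing the Lipschitz constant in advance, at the cost of extra function evaluations.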

Keywords: Structured convex optimization, Strong convexity, Nonsmooth optimization, First-order black-box oracle, Optimal complexity, High-dimensional data

Category 1: Applications -- OR and Management Sciences

Category 2: Convex and Nonsmooth Optimization

Category 3: Convex and Nonsmooth Optimization (Convex Optimization)

Citation: Faculty of Mathematics, University of Vienna, April 2016


Entry Submitted: 04/29/2016
Entry Accepted: 04/29/2016
Entry Last Modified: 05/13/2016
