Optimization Online
New results on subgradient methods for strongly convex optimization problems with a unified analysis

Masaru Ito (ito1***at***is.titech.ac.jp)

Abstract: We develop subgradient- and gradient-based methods for minimizing strongly convex functions, where strong convexity is understood in a sense that generalizes the standard Euclidean notion. We propose a unifying framework for subgradient methods that yields two families of methods, the Proximal Gradient Method (PGM) and the Conditional Gradient Method (CGM), and subsumes several existing methods. The framework provides tools to analyze the convergence of PGMs and CGMs for non-smooth and (weakly) smooth problems, and further for structured problems such as inexact oracle models. The proposed subgradient methods yield optimal PGMs for several classes of problems, as well as optimal and nearly optimal CGMs for smooth and weakly smooth problems, respectively.
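For orientation, the following is a minimal sketch of the classical projected subgradient method for a mu-strongly convex objective with the standard 2/(mu*(k+1)) step size, which attains the O(1/k) rate on a weighted average of the iterates. This is background context only, not the paper's unified framework; the oracle functions and the example problem below are illustrative placeholders.

    import numpy as np

    def projected_subgradient(subgrad, project, x0, mu, iters=1000):
        """Minimize a mu-strongly convex f over a closed convex set.

        subgrad(x): returns some subgradient of f at x.
        project(x): Euclidean projection onto the feasible set.
        """
        x = x0.copy()
        x_avg = np.zeros_like(x0)
        weight_sum = 0.0
        for k in range(1, iters + 1):
            g = subgrad(x)
            # Standard step size for mu-strongly convex objectives.
            x = project(x - (2.0 / (mu * (k + 1))) * g)
            # Averaging with weight k yields the O(1/k) guarantee.
            x_avg = (weight_sum * x_avg + k * x) / (weight_sum + k)
            weight_sum += k
        return x_avg

    # Illustrative example: minimize 0.5*||x - c||^2 + ||x||_1 over the
    # unit Euclidean ball; the quadratic term makes the objective
    # 1-strongly convex, so mu = 1.
    c = np.array([2.0, -1.0])
    subgrad = lambda x: (x - c) + np.sign(x)
    project = lambda x: x / max(1.0, np.linalg.norm(x))
    x_star = projected_subgradient(subgrad, project, np.zeros(2), mu=1.0)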

Keywords: non-smooth/smooth convex optimization, structured convex optimization, subgradient/gradient-based proximal method, conditional gradient method, complexity theory, strongly convex functions, weakly smooth functions

Category 1: Convex and Nonsmooth Optimization (Convex Optimization)

Category 2: Convex and Nonsmooth Optimization (Nonsmooth Optimization)

Category 3: Nonlinear Optimization (Constrained Nonlinear Optimization)

Citation: Research Report B-479, Department of Mathematical and Computing Sciences, Tokyo Institute of Technology, April 2015

Download: [PDF]

Entry Submitted: 04/14/2015
Entry Accepted: 04/14/2015
Entry Last Modified: 12/04/2015
