Convergence Rates for Deterministic and Stochastic Subgradient Methods Without Lipschitz Continuity

Benjamin Grimmer (bdg79@cornell.edu)

Abstract: We generalize the classic convergence rate theory for subgradient methods to apply to non-Lipschitz functions via a new measure of steepness. For the deterministic projected subgradient method, we derive a global $O(1/\sqrt{T})$ convergence rate for any function with at most exponential growth. Our approach implies generalizations of the standard convergence rates for gradient descent on functions with Lipschitz or Hölder continuous gradients. Further, we show an $O(1/\sqrt{T})$ convergence rate for the stochastic projected subgradient method on functions with at most quadratic growth, which improves to $O(1/T)$ under strong convexity.

Keywords: Subgradient Method, Convex Optimization, Convergence Rates

Category 1: Convex and Nonsmooth Optimization (Convex Optimization)

Download: [PDF]

Entry Submitted: 12/11/2017
Entry Accepted: 12/11/2017
Entry Last Modified: 12/11/2017
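For orientation, below is a minimal Python sketch of the projected subgradient iteration the abstract refers to, $x_{k+1} = P_C(x_k - \alpha_k g_k)$ with $g_k$ a subgradient. The paper's step-size schedules depend on its new steepness measure, which is not reproduced in this entry, so the diminishing step size, the example objective $f(x) = \|x\|^3$ (convex, non-Lipschitz, polynomial growth), and the unit-ball feasible set here are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def projected_subgradient(subgrad, project, x0, T,
                          step=lambda k: 1.0 / np.sqrt(k + 1)):
    """Run T iterations of the projected subgradient method.

    subgrad(x) -- returns any subgradient of the objective at x
    project(x) -- Euclidean projection onto the feasible set
    step(k)    -- step size at iteration k (a generic diminishing schedule
                  here; the paper's schedules, tied to its steepness
                  measure, are not given in this abstract)
    """
    x = np.asarray(x0, dtype=float)
    iterates = [x.copy()]
    for k in range(T):
        g = subgrad(x)                  # subgradient at the current point
        x = project(x - step(k) * g)    # subgradient step, then projection
        iterates.append(x.copy())
    return iterates

# Assumed example: minimize the non-Lipschitz convex function f(x) = ||x||^3
# over the unit ball.
f = lambda x: np.linalg.norm(x) ** 3
subgrad = lambda x: 3 * np.linalg.norm(x) * x          # gradient of ||x||^3
project = lambda x: x / max(1.0, np.linalg.norm(x))    # projection onto unit ball

xs = projected_subgradient(subgrad, project, x0=np.array([2.0, -1.0]), T=200)
print(min(f(x) for x in xs))  # classic rates bound the best objective value seen
```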