Homotopy Smoothing for Non-Smooth Problems with Lower Complexity than $O(1/\epsilon)$

Yi Xu (yi-xu@uiowa.edu), Yan Yan (yan.yan-3@student.uts.edu.au), Qihang Lin (qihang-lin@uiowa.edu), Tianbao Yang (tianbao-yang@uiowa.edu)

Abstract: In this paper, we develop a novel {\bf ho}moto{\bf p}y {\bf s}moothing (HOPS) algorithm for solving a family of non-smooth problems composed of a non-smooth term with an explicit max-structure and either a smooth term or a simple non-smooth term whose proximal mapping is easy to compute. The best-known iteration complexity for solving such non-smooth optimization problems is $O(1/\epsilon)$ without any strong convexity assumption. In this work, we show that the proposed HOPS achieves a lower iteration complexity of $\widetilde O(1/\epsilon^{1-\theta})$\footnote{$\widetilde O()$ suppresses a logarithmic factor.}, where $\theta\in(0,1]$ captures the local sharpness of the objective function around the optimal solutions. To the best of our knowledge, this is the lowest iteration complexity achieved so far for the considered non-smooth optimization problems without a strong convexity assumption. The HOPS algorithm combines Nesterov's smoothing technique with Nesterov's accelerated gradient method and runs in stages, gradually decreasing the smoothing parameter in a stage-wise manner until it yields a sufficiently good approximation of the original function. We show that HOPS enjoys linear convergence for many well-known non-smooth problems (e.g., empirical risk minimization with a piece-wise linear loss function and an $\ell_1$-norm regularizer, finding a point in a polyhedron, cone programming, etc.). Experimental results verify the effectiveness of HOPS in comparison with Nesterov's smoothing algorithm and primal-dual style first-order methods.
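The stage-wise scheme described in the abstract can be illustrated with a minimal sketch (not the authors' exact HOPS algorithm or step-size schedule): here we minimize the non-smooth $f(x)=\|x\|_1$, whose Nesterov smoothing is the Huber function $f_\mu$ with smoothness constant $L=1/\mu$, run accelerated gradient within each stage, and halve the smoothing parameter $\mu$ between stages. All function names and parameter values below are illustrative assumptions.

```python
import numpy as np

def hops_sketch(x0, mu0=1.0, stages=6, iters_per_stage=100):
    """Illustrative homotopy-smoothing loop (a sketch, not the paper's HOPS).

    Minimizes f(x) = ||x||_1 via its Nesterov (Huber) smoothing f_mu,
    solved by Nesterov's accelerated gradient method, with mu halved
    at every stage so the smoothed problem approaches the original one.
    """
    x = np.asarray(x0, dtype=float)
    mu = mu0
    for _ in range(stages):
        L = 1.0 / mu                              # smoothness constant of f_mu
        y, x_prev, t = x.copy(), x.copy(), 1.0
        for _ in range(iters_per_stage):
            grad = np.clip(y / mu, -1.0, 1.0)     # gradient of Huber-smoothed |.|
            x_new = y - grad / L                  # gradient step with step size 1/L
            t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
            y = x_new + ((t - 1.0) / t_new) * (x_new - x_prev)  # momentum
            x_prev, x, t = x_new, x_new, t_new
        mu *= 0.5                                 # homotopy: tighter smoothing next stage
    return x

x = hops_sketch(np.array([3.0, -2.0]))            # minimizer of ||x||_1 is 0
```

The point of the homotopy is visible in the loop: early stages use a large $\mu$ (a well-conditioned surrogate, $L$ small), and later stages refine with smaller $\mu$, rather than solving a single tightly smoothed problem with a large $L$ from the start.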
Keywords: non-smooth optimization; smoothing; homotopy; local error bound
Category 1: Convex and Nonsmooth Optimization
Category 2: Convex and Nonsmooth Optimization (Convex Optimization)
Category 3: Convex and Nonsmooth Optimization (Nonsmooth Optimization)
Citation: Neural Information Processing Systems (NIPS) 2016
Entry Submitted: 11/01/2016; Entry Accepted: 11/01/2016; Entry Last Modified: 11/01/2016