The Fastest Known Globally Convergent First-Order Method for Minimizing Strongly Convex Functions

Bryan Van Scoy (bryanvanscoy2012@u.northwestern.edu)
Randy Freeman (freeman@eecs.northwestern.edu)
Kevin Lynch (kmlynch@northwestern.edu)

Abstract: We design and analyze a novel gradient-based algorithm for unconstrained convex optimization. When the objective function is $m$-strongly convex and its gradient is $L$-Lipschitz continuous, the iterates and function values converge linearly to the optimum at rates $\rho$ and $\rho^2$, respectively, where $\rho = 1-\sqrt{m/L}$. These are the fastest known guaranteed linear convergence rates for globally convergent first-order methods, and for high desired accuracies the corresponding iteration complexity is within a factor of two of the theoretical lower bound. We use a simple graphical design procedure based on integral quadratic constraints to derive closed-form expressions for the algorithm parameters. The new algorithm, which we call the triple momentum method, can be seen as an extension of methods such as gradient descent, Nesterov's accelerated gradient descent, and the heavy-ball method.

Keywords: Convex optimization, gradient method

Category 1: Convex and Nonsmooth Optimization (Convex Optimization)

Citation:
@ARTICLE{7967721,
  author={B. Van Scoy and R. A. Freeman and K. M.
Lynch},
  journal={IEEE Control Systems Letters},
  title={The Fastest Known Globally Convergent First-Order Method for Minimizing Strongly Convex Functions},
  year={2017},
  volume={PP},
  number={99},
  pages={1-1},
  keywords={Acceleration;Algorithm design and analysis;Complexity theory;Convergence;Linear programming;Optimization;Transfer functions;optimization algorithms;robust control},
  doi={10.1109/LCSYS.2017.2722406},
  ISSN={2475-1456},
  month={},
}

Entry Submitted: 03/17/2017
Entry Accepted: 03/17/2017
Entry Last Modified: 07/05/2017
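The abstract does not reproduce the triple momentum iteration itself. The sketch below paraphrases the three-sequence update and the closed-form parameter choices from the cited paper, applied to a toy strongly convex quadratic; the specific coefficients $(\alpha, \beta, \gamma, \delta)$ are an assumption to be verified against the published version before relying on them.

```python
import math

def triple_momentum(grad, x0, m, L, iters):
    # Rate claimed in the abstract: rho = 1 - sqrt(m/L).
    rho = 1.0 - math.sqrt(m / L)
    # Parameter choices paraphrased from the paper (assumption; verify
    # against the original): alpha is the step size, beta/gamma/delta
    # are the three momentum coefficients.
    alpha = (1 + rho) / L
    beta = rho**2 / (2 - rho)
    gamma = rho**2 / ((1 + rho) * (2 - rho))
    delta = rho**2 / (1 - rho**2)
    xi_prev, xi = list(x0), list(x0)
    x = list(x0)
    for _ in range(iters):
        # y_k: point at which the gradient is evaluated.
        y = [(1 + gamma) * a - gamma * b for a, b in zip(xi, xi_prev)]
        # x_k: output sequence that converges to the minimizer at rate rho.
        x = [(1 + delta) * a - delta * b for a, b in zip(xi, xi_prev)]
        g = grad(y)
        # Main momentum recursion on the internal state xi.
        xi_next = [(1 + beta) * a - beta * b - alpha * gi
                   for a, b, gi in zip(xi, xi_prev, g)]
        xi_prev, xi = xi, xi_next
    return x

# Toy problem: f(x) = 0.5*(x1^2 + 10*x2^2), so m = 1, L = 10, minimizer at 0.
grad = lambda x: [x[0], 10.0 * x[1]]
sol = triple_momentum(grad, [1.0, 1.0], m=1.0, L=10.0, iters=150)
err = max(abs(v) for v in sol)
```

On this quadratic the one-dimensional recursions at the extreme curvatures $m$ and $L$ both have spectral radius exactly $\rho = 1 - \sqrt{m/L} \approx 0.684$, so 150 iterations drive the error far below machine-visible levels.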
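The abstract's "within a factor of two of the theoretical lower bound" claim can be checked numerically: iteration complexity at linear rate $\rho$ scales as $1/\log(1/\rho)$, and the classical first-order lower bound for $m$-strongly convex, $L$-smooth functions corresponds to the rate $(\sqrt{\kappa}-1)/(\sqrt{\kappa}+1)$ with $\kappa = L/m$. A minimal check, with $\kappa = 10^4$ as an arbitrary example:

```python
import math

def iters_per_decade(rho):
    # Iterations needed to shrink the error by a factor of 10
    # when it contracts by rho per iteration.
    return math.log(10.0) / math.log(1.0 / rho)

kappa = 1.0e4                                   # condition number L/m (example)
rho_tmm = 1.0 - 1.0 / math.sqrt(kappa)          # rate from the abstract
rho_lb = (math.sqrt(kappa) - 1.0) / (math.sqrt(kappa) + 1.0)  # lower-bound rate

ratio = iters_per_decade(rho_tmm) / iters_per_decade(rho_lb)
```

For large $\kappa$, $\log(1/\rho_{\mathrm{tmm}}) \approx 1/\sqrt{\kappa}$ while $\log(1/\rho_{\mathrm{lb}}) \approx 2/\sqrt{\kappa}$, so the ratio approaches 2 from below, matching the factor-of-two statement.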