Fast convex optimization via inertial dynamics with Hessian driven damping
H. Attouch (hedy.attouch@univmontp2.fr)

Abstract: We first study the fast minimization properties of the trajectories of the second-order evolution equation
\begin{equation*}
\ddot{x}(t) + \frac{\alpha}{t} \dot{x}(t) + \beta \nabla^2 \Phi (x(t))\dot{x}(t) + \nabla \Phi (x(t)) = 0,
\end{equation*}
where $\Phi : \mathcal H \to \mathbb R$ is a smooth convex function acting on a real Hilbert space $\mathcal H$, and $\alpha$, $\beta$ are positive parameters. This inertial system combines an isotropic viscous damping which vanishes asymptotically with a geometric Hessian-driven damping, which makes it naturally related to Newton's method and the Levenberg-Marquardt method. For $\alpha \geq 3$ and $\beta > 0$, along any trajectory we obtain fast convergence of the values,
\begin{align*}
\Phi(x(t)) - \min_{\mathcal H}\Phi = \mathcal O\left(t^{-2}\right),
\end{align*}
together with rapid convergence of the gradients $\nabla \Phi (x(t))$ to zero. For $\alpha > 3$, assuming only that $\argmin \Phi \neq \emptyset$, we show that any trajectory converges weakly to a minimizer of $\Phi$, and that $\Phi(x(t)) - \min_{\mathcal H}\Phi = o(t^{-2})$. Strong convergence is established in various practical situations. In particular, in the strongly convex case we obtain an even faster rate of convergence, which can be made arbitrarily fast by the choice of $\alpha$: more precisely, $\Phi(x(t)) - \min_{\mathcal H}\Phi = \mathcal O(t^{-\frac{2}{3}\alpha})$. We then extend the results to the case of a general proper lower-semicontinuous convex function $\Phi : \mathcal H \rightarrow \mathbb R \cup \lbrace + \infty \rbrace$. This extension rests on the crucial property that the inertial dynamic with Hessian-driven damping can be equivalently written as a first-order system in time and space, which allows it to be extended by simply replacing the gradient with the subdifferential.
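The behavior of the trajectories can be observed numerically. The following is a minimal sketch, not taken from the paper: it integrates the second-order dynamic with a plain explicit Euler scheme for the toy quadratic $\Phi(x) = \tfrac{1}{2}\|x\|^2$ (so the gradient is $x$ and the Hessian is the identity), with illustrative choices $\alpha = 3.1$, $\beta = 1$, and a start time $t_0 = 1$ to avoid the singularity of $\alpha/t$ at the origin.

```python
import numpy as np

# Integrate  x''(t) + (alpha/t) x'(t) + beta * Hess_Phi(x) x'(t) + grad_Phi(x) = 0
# for the toy quadratic Phi(x) = 0.5 * ||x||^2, written as a first-order system
# in (x, v) with v = x'. All parameter choices here are illustrative.

def grad_phi(x):
    return x            # gradient of 0.5 * ||x||^2

def hess_phi_times(x, v):
    return v            # Hessian of this Phi is the identity

alpha, beta = 3.1, 1.0  # alpha > 3, beta > 0, as in the convergence results
t, dt, T = 1.0, 1e-3, 20.0
x = np.array([1.0, -2.0])   # arbitrary initial point
v = np.zeros_like(x)        # initial velocity x'(t0) = 0

while t < T:
    # acceleration from the evolution equation
    a = -(alpha / t) * v - beta * hess_phi_times(x, v) - grad_phi(x)
    x = x + dt * v
    v = v + dt * a
    t += dt

phi_final = 0.5 * np.dot(x, x)   # Phi(x(T)); min Phi = 0 here
```

Since this $\Phi$ is strongly convex, the value $\Phi(x(T))$ is expected to be far below its initial value $\Phi(x(t_0)) = 2.5$ by $T = 20$, consistent with the fast rates above.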
By explicit-implicit time discretization, this opens a gate to new, possibly more rapid, inertial algorithms, expanding the field of FISTA methods for convex structured optimization problems.

Category 1: Convex and Nonsmooth Optimization (Convex Optimization)
Entry Submitted: 01/26/2016
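To illustrate the algorithmic side, here is a minimal FISTA-type sketch (not the discretization derived in the paper) for a composite problem $\min_x \tfrac{1}{2}\|Ax - b\|^2 + \lambda\|x\|_1$, using the inertial coefficient $(k-1)/(k+\alpha-1)$, whose continuous-time limit is the vanishing $\alpha/t$ damping of the dynamic. The data $A$, $b$ and all parameter values are synthetic and purely illustrative.

```python
import numpy as np

# FISTA-type inertial forward-backward method for
#   min_x 0.5 * ||A x - b||^2 + lam * ||x||_1,
# with extrapolation coefficient (k-1)/(k+alpha-1), alpha = 3.
# Synthetic problem data, for illustration only.

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20))
b = rng.standard_normal(40)
lam = 0.1
alpha = 3.0
s = 1.0 / np.linalg.norm(A.T @ A, 2)   # step size 1/L, L = Lipschitz const. of the gradient

def soft_threshold(z, tau):
    # proximal operator of tau * ||.||_1
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def objective(x):
    return 0.5 * np.sum((A @ x - b) ** 2) + lam * np.sum(np.abs(x))

x = np.zeros(20)
x_prev = x.copy()
for k in range(1, 501):
    y = x + (k - 1) / (k + alpha - 1) * (x - x_prev)        # inertial extrapolation
    x_prev = x
    x = soft_threshold(y - s * A.T @ (A @ y - b), s * lam)  # forward-backward step

obj_final = objective(x)
```

The forward (gradient) step is explicit while the backward (proximal) step is implicit, which is the explicit-implicit pattern the abstract refers to; adding a Hessian-driven correction term to the extrapolation is what the paper's first-order reformulation makes possible for nonsmooth $\Phi$.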