Linearly Convergent Away-Step Conditional Gradient for Non-strongly Convex Functions

We consider the problem of minimizing a function, given as the sum of a linear function and the composition of a strongly convex function with a linear transformation, over a compact polyhedral set. Jaggi and Lacoste-Julien [14] showed that the conditional gradient method with away steps, applied to this problem without the additional linear term, converges linearly at a rate depending on the so-called pyramidal width of the feasible set. We revisit this result and provide a variant of the algorithm together with an analysis based on simple duality arguments and corresponding error bounds. The new analysis (a) enables the incorporation of the additional linear term, (b) does not require a linear oracle that outputs an extreme point of the linear mapping of the feasible set, and (c) depends on a new constant, termed "the vertex-facet distance constant", which is expressed explicitly in terms of the problem's parameters and the geometry of the feasible set. This constant replaces the pyramidal width, which is difficult to evaluate.
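For orientation, the problem treated in the paper has the form min_{x in X} <b, x> + f(Ax), where X is a compact polyhedral set, A is a linear transformation, and f is strongly convex. Below is a minimal, illustrative Python sketch of the generic away-step conditional gradient template on a polytope given by an explicit vertex list; the function names, the smoothness-based step size, and the toy instance are assumptions for illustration only and are not the paper's exact variant or constants. In particular, the sketch searches over explicit vertices for simplicity, whereas the paper only assumes access to a linear oracle.

```python
# Illustrative away-step conditional gradient (Frank-Wolfe) sketch for
# min_x <b, x> + f(A x) over a polytope given by its vertices.
# All names and the step-size rule are assumptions, not the paper's variant.
import numpy as np

def away_step_cg(vertices, A, b, grad_f, lipschitz, iters=200):
    """vertices: (m, n) array of polytope vertices; the objective is
    F(x) = <b, x> + f(A x), with gradient A^T grad_f(A x) + b."""
    weights = np.zeros(len(vertices))
    weights[0] = 1.0                      # start at the first vertex
    x = vertices[0].astype(float).copy()
    for _ in range(iters):
        g = A.T @ grad_f(A @ x) + b       # gradient of the composite objective
        scores = vertices @ g
        s_idx = int(np.argmin(scores))    # Frank-Wolfe (toward) vertex
        active = np.nonzero(weights > 0)[0]
        v_idx = active[int(np.argmax(scores[active]))]  # away vertex
        d_fw = vertices[s_idx] - x
        d_aw = x - vertices[v_idx]
        if -g @ d_fw >= -g @ d_aw:        # toward step is at least as good
            d, gamma_max, toward = d_fw, 1.0, True
        else:                             # away step: shift mass off v_idx
            alpha_v = weights[v_idx]
            d, gamma_max, toward = d_aw, alpha_v / (1.0 - alpha_v), False
        # Step size from the standard smoothness upper bound (a common heuristic).
        gamma = min(gamma_max, (-g @ d) / (lipschitz * (d @ d) + 1e-12))
        if gamma <= 0:                    # Frank-Wolfe gap is (numerically) zero
            break
        if toward:
            weights *= (1.0 - gamma)
            weights[s_idx] += gamma
        else:
            weights *= (1.0 + gamma)
            weights[v_idx] -= gamma
        np.clip(weights, 0.0, None, out=weights)  # clean float residue
        x = vertices.T @ weights
    return x

if __name__ == "__main__":
    # Toy instance on the unit simplex in R^3 with f(y) = 0.5 * ||y - c||^2,
    # so grad_f(y) = y - c and ||A||_2^2 bounds the Lipschitz constant.
    V = np.eye(3)                         # simplex vertices
    A = np.array([[1.0, 2.0, 0.0], [0.0, 1.0, 3.0]])
    b = np.array([0.1, -0.2, 0.05])
    c = np.array([0.5, 1.0])
    x_star = away_step_cg(V, A, b, lambda y: y - c,
                          lipschitz=np.linalg.norm(A, 2) ** 2)
    print(x_star)
```

The away step is what distinguishes this template from plain conditional gradient: it removes weight from a poorly aligned active vertex instead of only moving toward a good one, which is the mechanism behind the linear convergence rates in the polyhedral setting.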

Citation

Technion - Israel Institute of Technology (April 2015)
