Optimization Online





 

Incremental Accelerated Gradient Methods for SVM Classification: Study of the Constrained Approach

Nicolas Couellan (nicolas.couellan***at***math.univ-toulouse.fr)
Sophie Jan (sophie.jan***at***math.univ-toulouse.fr)

Abstract: We investigate constrained first-order techniques for training Support Vector Machines (SVM) for online classification tasks. The methods exploit the structure of the SVM training problem and combine ideas from incremental gradient techniques, gradient acceleration, and successive simple computations of Lagrange multipliers. Both primal and dual formulations are studied and compared. Experiments show that the constrained incremental algorithms working in the dual space achieve the best trade-off between prediction accuracy and training time. We also compare with an unconstrained large-scale learning algorithm (Pegasos stochastic gradient) to show that our approach remains competitive for large-scale learning, owing to the very special structure of the training problem.
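
For context, the sketch below illustrates the Pegasos-style stochastic subgradient baseline mentioned in the abstract, i.e. the unconstrained method the paper compares against, not the constrained incremental accelerated algorithms it proposes. It is a minimal illustration under assumed settings: the function name, the regularization parameter lam, the iteration count, and the toy data are all illustrative choices, not taken from the paper.

import numpy as np

def pegasos_hinge_svm(X, y, lam=0.1, n_iters=1000, seed=0):
    """Pegasos-style stochastic subgradient training of a linear SVM.

    Minimizes (lam/2)*||w||^2 + (1/m) * sum_i max(0, 1 - y_i * <w, x_i>)
    by sampling one training example per iteration (labels y_i in {-1, +1}).
    """
    rng = np.random.default_rng(seed)
    m, d = X.shape
    w = np.zeros(d)
    for t in range(1, n_iters + 1):
        i = rng.integers(m)                  # pick one example at random
        eta = 1.0 / (lam * t)                # decreasing step size
        margin = y[i] * X[i].dot(w)
        w *= (1.0 - eta * lam)               # gradient step on the L2 regularizer
        if margin < 1.0:                     # subgradient step on the hinge loss
            w += eta * y[i] * X[i]
        # optional projection onto the ball of radius 1/sqrt(lam)
        norm = np.linalg.norm(w)
        radius = 1.0 / np.sqrt(lam)
        if norm > radius:
            w *= radius / norm
    return w

if __name__ == "__main__":
    # tiny synthetic two-class problem, for illustration only
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(-1.0, 0.5, (50, 2)), rng.normal(1.0, 0.5, (50, 2))])
    y = np.concatenate([-np.ones(50), np.ones(50)])
    w = pegasos_hinge_svm(X, y)
    print("training accuracy:", np.mean(np.sign(X.dot(w)) == y))

The constrained approach studied in the paper instead keeps the SVM training constraints explicit (in either the primal or dual formulation) and handles them through successive simple computations of Lagrange multipliers, which this baseline does not do.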

Keywords: Support Vector Machines, kernel technique, machine learning, incremental gradient method, accelerated gradient, constrained gradient, nonlinear programming

Category 1: Nonlinear Optimization (Constrained Nonlinear Optimization)

Category 2: Applications -- Science and Engineering (Data-Mining)

Citation: Accepted in Computational Management Science, available online DOI: 10.1007/s10287-013-0186-2, 2013


Entry Submitted: 05/02/2013
Entry Accepted: 05/02/2013
Entry Last Modified: 08/30/2013
