Convex Variational Formulations for Learning Problems

Abstract—In this article, we introduce new techniques for solving the nonlinear regression and nonlinear classification problems. Our benchmarks suggest that our regression method is significantly more effective than classical methods and that our classification method is competitive with them. The classical methods we compare against include least squares, random forests, decision trees, boosted trees, nearest neighbors, logistic regression, SVMs, and neural networks. These new techniques rely on convex variational formulations of the nonlinear regression and nonlinear classification problems. For regression, we choose a function minimizing an energy functional plus the squared error of the predictions. For classification, we choose a function minimizing an energy functional plus the costs of misclassification. These convex variational formulations also provide information for performing dimensionality reduction and for studying the dependencies between the variables of the problem. We also derive a notion of complexity for regression and classification problems. Finding such minimizing functions turns out to be a simple quadratic optimization problem that can be solved efficiently. We present the methods in a way that can be easily understood by practitioners, without going into mathematical details.
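The abstract does not state the objectives explicitly; schematically, and assuming an unspecified energy functional E over candidate functions f, training pairs (x_i, y_i), and a misclassification cost c that the abstract does not define, the two formulations described above can be written as:

```latex
% Regression: energy functional plus squared prediction error
\min_{f}\; E(f) + \sum_{i=1}^{n} \big(f(x_i) - y_i\big)^2

% Classification: energy functional plus misclassification costs,
% with c(y_i, f(x_i)) the (unspecified) cost of misclassifying example i
\min_{f}\; E(f) + \sum_{i=1}^{n} c\big(y_i, f(x_i)\big)
```

When E and the data-fit terms are quadratic in the parameters of f, each objective is a convex quadratic, which would be consistent with the abstract's claim that the minimizer is found by solving a simple quadratic optimization problem.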
