### Convex Tuning of the Soft Margin Parameter

**Authors:** De Bie, Tijl; Lanckriet, Gert R. G.; Cristianini, Nello

**Technical Report Identifier:** CSD-03-1289

**November 2003**


**Abstract:** To deal with known limitations of the hard margin support vector machine (SVM) for binary classification, such as overfitting and the fact that some data sets are not linearly separable, a soft margin approach has been proposed in the literature. The soft margin SVM allows training data to be misclassified to a certain extent, by introducing slack variables and penalizing the cost function with an error term, i.e., the 1-norm or 2-norm of the corresponding slack vector. A regularization parameter *C* trades off the importance of maximizing the margin versus minimizing the error. While the 2-norm soft margin algorithm itself is well understood, and a generalization bound is known, no computationally tractable method for tuning the soft margin parameter *C* has been proposed so far. In this report we present a convex way to optimize *C* for the 2-norm soft margin SVM, by maximizing this generalization bound. The resulting problem is a quadratically constrained quadratic programming (QCQP) problem, which can be solved in polynomial time *O*(*l*^3), where *l* is the number of training samples.
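To make the role of *C* concrete: a standard reformulation of the 2-norm soft margin SVM replaces the kernel matrix *K* by *K* + *I*/*C* in the ordinary hard-margin dual, so the slack penalty appears as a diagonal shift. The following numpy sketch (not code from the report; the two-point toy problem and variable names are illustrative) solves that modified dual in closed form for two points with labels +1 and -1, where the equality constraint forces both dual variables to be equal:

```python
import numpy as np

def two_point_dual(x1, x2, C):
    """Solve the 2-norm soft margin SVM dual for two points, labels +1 and -1.

    The dual maximizes
        W(a) = sum_i a_i - 0.5 * sum_ij a_i a_j y_i y_j (K_ij + delta_ij / C)
    subject to a_i >= 0 and sum_i a_i y_i = 0.  With y = (+1, -1) the
    equality constraint forces a1 = a2 = a, reducing W to a scalar
    quadratic with a closed-form maximizer.
    """
    k11, k22, k12 = x1 @ x1, x2 @ x2, x1 @ x2
    # The 2-norm slack penalty adds 1/C on the kernel diagonal: K + I/C.
    denom = k11 + k22 - 2.0 * k12 + 2.0 / C
    a = 2.0 / denom          # argmax of W(a) = 2a - 0.5 * a^2 * denom
    w = a * (x1 - x2)        # primal weight vector w = sum_i a_i y_i x_i
    return a, w

x1 = np.array([1.0, 1.0])    # label +1
x2 = np.array([-1.0, -1.0])  # label -1
for C in (0.1, 1.0, 100.0):
    a, w = two_point_dual(x1, x2, C)
    print(f"C={C}: alpha={a:.4f}, w={w}")
```

As *C* grows, the diagonal shift 1/*C* vanishes and the solution approaches the hard-margin one; a small *C* shrinks the dual variables, i.e., stronger regularization. The report's contribution is choosing *C* itself by optimization rather than by a grid search over runs like the loop above.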