A binary linear classification method, the CGS method, was recently
proposed by Gotoh and Takeda. Their classification model is built on
a risk measure known as the conditional value-at-risk (beta-CVaR).
The model is equivalent to the Extended nu-SVC of Perez-Cruz et al.
and, in the convex case, to the nu-SVC of Scholkopf et al.
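The beta-CVaR of a loss distribution, in the standard sense of Rockafellar and Uryasev, is (roughly) the expected loss in the worst (1-beta) fraction of outcomes. The following minimal sketch, which is illustrative only and not the paper's formulation, computes an empirical beta-CVaR of a loss sample:

```python
import numpy as np

def empirical_cvar(losses, beta):
    """Empirical beta-CVaR via the Rockafellar-Uryasev representation:
    CVaR_beta = min_alpha [ alpha + E[max(L - alpha, 0)] / (1 - beta) ],
    where the minimizing alpha is the beta-quantile (beta-VaR) of the losses."""
    losses = np.asarray(losses, dtype=float)
    alpha = np.quantile(losses, beta)          # beta-VaR (beta-quantile)
    excess = np.maximum(losses - alpha, 0.0)   # losses exceeding the VaR
    return alpha + excess.mean() / (1.0 - beta)

# Example: CVaR of a small loss sample at level beta = 0.5
losses = np.array([-1.0, -0.5, 0.2, 0.8, 1.5, 3.0])
print(empirical_cvar(losses, 0.5))
```

For beta = 0 this reduces to the mean loss, and as beta approaches 1 it approaches the maximum loss, so CVaR interpolates between average-case and worst-case risk.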
The aim of this paper is to propose beta-SVM by extending the CGS
classification model, to investigate the relation between beta-SVM
and the (Extended) nu-SVMs, and to discuss theoretical aspects of
beta-SVM, mainly its generalization performance. The generalization
error bound we derive includes beta-CVaR or a related quantity,
which implies that the minimum beta-CVaR attained via beta-SVM plays
an important role in controlling the generalization error of
beta-SVM. The viewpoint of CVaR minimization is useful for
confirming the validity not only of beta-SVM but also of nu-SVM. We
furthermore present a numerical example of nonconvex beta-SVR, an
extension of nu-SVR.