Publication

Robust support vector machines for classification with nonconvex and smooth losses

Journal Contribution - Journal Article

This letter addresses the problem of robustly learning a large margin classifier in the presence of label noise. We address this problem by proposing robustified large margin support vector machines. The robustness of the proposed robust support vector classifiers (RSVC), interpreted in this work from a weighted viewpoint, stems from the use of nonconvex classification losses. We also show that the proposed RSVC is simultaneously smooth, which in turn benefits from using smooth classification losses. The idea behind RSVC comes from M-estimation in statistics, since the proposed robust and smooth classification losses can be viewed as one-sided cost functions from robust statistics. Its Fisher consistency and generalization ability are also investigated. Besides robustness and smoothness, another appealing property of RSVC is that its solution can be obtained by iteratively solving weighted squared hinge loss-based support vector machine problems. We further show that each such subproblem is a quadratic program in its dual space and can be solved with state-of-the-art methods. We thus propose an iteratively reweighted algorithm and provide a constructive proof of its convergence to a stationary point. The effectiveness of the proposed classifiers is verified on both artificial and real data sets.
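
The iteratively reweighted scheme described in the abstract can be illustrated with a short sketch. The Python code below is a minimal, illustrative reading of that scheme for a linear classifier, not the paper's exact formulation: it assumes a Welsch-type weight update w_i = exp(-xi_i^2 / sigma^2) driven by the hinge slacks, and it solves each weighted squared hinge loss subproblem in the primal with L-BFGS rather than as a dual quadratic program as in the paper. All function and parameter names (rsvc_sketch, sigma, C) are hypothetical.

```python
# Illustrative sketch of an iteratively reweighted squared hinge loss SVM.
# The Welsch-type weight rule and all names here are assumptions for
# illustration; the paper solves each subproblem as a QP in the dual.
import numpy as np
from scipy.optimize import minimize

def weighted_l2svm_objective(w, X, y, weights, C):
    """Primal objective of a weighted squared hinge loss (L2) SVM."""
    margins = y * (X @ w)
    xi = np.maximum(0.0, 1.0 - margins)  # hinge slacks
    return 0.5 * w @ w + C * np.sum(weights * xi ** 2)

def rsvc_sketch(X, y, C=1.0, sigma=1.0, n_iter=10):
    """Iteratively reweighted squared hinge loss SVM (illustrative only)."""
    n, d = X.shape
    weights = np.ones(n)   # first iteration is a plain L2-SVM
    w = np.zeros(d)
    for _ in range(n_iter):
        # Step 1: solve the weighted squared hinge loss subproblem
        # (here in the primal via L-BFGS for brevity).
        res = minimize(weighted_l2svm_objective, w,
                       args=(X, y, weights, C), method="L-BFGS-B")
        w = res.x
        # Step 2: recompute weights so points with large slacks
        # (likely mislabeled) are downweighted, assuming a
        # Welsch-type rule for illustration.
        xi = np.maximum(0.0, 1.0 - y * (X @ w))
        weights = np.exp(-xi ** 2 / sigma ** 2)
    return w

# Example usage on toy data with labels in {-1, +1}:
# rng = np.random.default_rng(0)
# X = rng.normal(size=(200, 2))
# y = np.sign(X[:, 0] + 0.1 * rng.normal(size=200))
# w = rsvc_sketch(X, y)
```

Under this reading, points with large slacks receive exponentially small weights in the next iteration, which matches the weighted-viewpoint interpretation of robustness mentioned in the abstract.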
Journal: Neural Computation
ISSN: 0899-7667
Issue: 6
Volume: 28
Pages: 1217 - 1247
Publication year: 2016
BOF-keylabel: yes
IOF-keylabel: yes
BOF-publication weight: 1
CSS-citation score: 1
Authors: International
Authors from: Higher Education
Accessibility: Open