
Publication

A Review on Consistency and Robustness Properties of Support Vector Machines for Heavy-Tailed Distributions

Journal Contribution - Journal Article

Support vector machines (SVMs) belong to the class of modern statistical machine learning techniques and can be described as M-estimators with a Hilbert norm regularization term for functions. SVMs are consistent and robust for classification and regression purposes if based on a Lipschitz continuous loss and a bounded continuous kernel with a dense reproducing kernel Hilbert space. For regression, one of the conditions used is that the output variable Y has a finite first absolute moment. This assumption, however, excludes heavy-tailed distributions. Recently, the applicability of SVMs was extended to such distributions by considering shifted loss functions. In this review paper, we briefly describe the approach of SVMs based on shifted loss functions and list some properties of such SVMs. Then we prove that SVMs based on a bounded continuous kernel and on a convex and Lipschitz continuous, but not necessarily differentiable, shifted loss function have a bounded Bouligand influence function for all distributions, even for heavy-tailed distributions including extreme value distributions and Cauchy distributions. SVMs are thus robust in this sense. Our result covers the important cases of the epsilon-insensitive loss for regression and the pinball loss for quantile regression, which were not covered by earlier results on the influence function. We demonstrate the usefulness of SVMs even for heavy-tailed distributions by applying SVMs to a simulated data set with Cauchy errors and to a data set of large fire insurance claims from Copenhagen Re.
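To make the construction concrete, the shifted-loss SVM discussed in the abstract can be written as follows (a sketch of the standard formulation; notation is ours). Given a loss L, its shifted version is

    L^*(y, t) := L(y, t) - L(y, 0),

and the SVM estimator over a reproducing kernel Hilbert space H with regularization parameter \lambda > 0 is

    f^*_{L,P,\lambda} = \arg\min_{f \in H} \; \mathbb{E}_P\!\left[ L^*\big(Y, f(X)\big) \right] + \lambda \, \|f\|_H^2 .

Since L is Lipschitz continuous in t, |L^*(y, t)| \le |L|_1 |t|, so the shifted risk is finite for bounded f even when Y has no finite first absolute moment; this is what admits heavy-tailed distributions.

The simulation with Cauchy errors described above can be mimicked with an off-the-shelf SVM. The following minimal Python sketch (not the authors' original experiment; the data-generating function, sample size, and hyperparameters are assumptions for illustration) fits an SVR with the epsilon-insensitive loss and a Gaussian RBF kernel, which is bounded and continuous as required:

    import numpy as np
    from sklearn.svm import SVR

    rng = np.random.default_rng(0)
    n = 200
    X = rng.uniform(-3.0, 3.0, size=(n, 1))
    f_true = np.sinc(X).ravel()                  # assumed target function
    y = f_true + 0.1 * rng.standard_cauchy(n)    # heavy-tailed Cauchy noise

    # Epsilon-insensitive loss + Gaussian RBF kernel:
    # Lipschitz continuous loss, bounded continuous kernel
    svm = SVR(kernel="rbf", C=10.0, epsilon=0.1, gamma=0.5)
    svm.fit(X, y)

    X_grid = np.linspace(-3.0, 3.0, 100).reshape(-1, 1)
    f_hat = svm.predict(X_grid)                  # fit remains stable despite outliers

Note that on a finite sample the shifted and unshifted objectives differ only by a constant that does not depend on f, so a standard SVR solver returns the same estimator; the shift matters for the population version of the problem when Y has no first moment.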
Journal: Advances in Data Analysis and Classification
ISSN: 1862-5347
Issue: 2-3
Volume: 4
Pages: 199-220
Publication year: 2010
Keywords: Regularized empirical risk minimization, Support vector machines, Consistency, Robustness, Bouligand influence function, Heavy tails
  • Scopus ID: 77955850781