A Direct Method for Building Sparse Kernel Learning Algorithms




Many Kernel Learning Algorithms (KLA), including the Support Vector Machine (SVM), result in a Kernel Machine (KM), such as a kernel classifier, whose key component is a weight vector in a feature space implicitly introduced by a positive definite kernel function. This weight vector is usually obtained by solving a convex optimization problem. Based on this fact, we present a direct method to build Sparse Kernel Learning Algorithms (SKLA) by adding one more constraint to the original convex optimization problem, such that the sparseness of the resulting KM is explicitly controlled while at the same time the performance of the resulting KM is kept as high as possible. A gradient-based approach is provided to solve this modified optimization problem. Applying this method to the SVM yields a concrete algorithm for building Sparse Large Margin Classifiers (SLMC). Further analysis of the SLMC algorithm indicates that it essentially finds a discriminating subspace that can be spanned by a small number of vectors, and in this subspace the different classes of data are linearly well separated. Experimental results on several classification benchmarks demonstrate the effectiveness of our approach.
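To make the abstract's central object concrete, here is a minimal NumPy sketch of a sparse kernel expansion: a dense kernel machine whose weight vector is w = Σ_i α_i φ(x_i) (one coefficient per training point) is approximated by a short expansion over Nz expansion vectors z_j by minimizing ||w − Σ_j β_j φ(z_j)||² in feature space. This is only an illustration of the sparse-expansion idea, not the paper's SLMC algorithm (which constrains sparseness during training and optimizes the z_j by gradient descent); the toy data, the kernel ridge trainer, and the random choice of expansion points are all assumptions made here for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: two Gaussian blobs with labels -1 / +1.
n = 100
X = np.vstack([rng.normal(-1.5, 1.0, (n // 2, 2)),
               rng.normal(+1.5, 1.0, (n // 2, 2))])
y = np.hstack([-np.ones(n // 2), np.ones(n // 2)])

def rbf(A, B, gamma=0.5):
    """Gaussian RBF kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Dense kernel machine (kernel ridge, used here as a stand-in trainer):
# weight vector w = sum_i alpha_i phi(x_i), one coefficient per point.
lam = 1.0
K = rbf(X, X)
alpha = np.linalg.solve(K + lam * np.eye(n), y)

# Sparse approximation with Nz expansion vectors z_j: for fixed z, the
# minimizer of ||w - sum_j beta_j phi(z_j)||^2 has the closed form
# beta = Kzz^{-1} Kzx alpha (a small ridge added for stability).
Nz = 10
Z = X[rng.choice(n, Nz, replace=False)]   # crude choice of expansion points
Kzz = rbf(Z, Z)
Kzx = rbf(Z, X)
beta = np.linalg.solve(Kzz + 1e-8 * np.eye(Nz), Kzx @ alpha)

# Compare the decision signs of the dense and the sparse machine.
dense_pred = np.sign(K @ alpha)
sparse_pred = np.sign(rbf(X, Z) @ beta)
agreement = (dense_pred == sparse_pred).mean()
print(f"{Nz}/{n} expansion vectors, agreement with dense machine: {agreement:.2f}")
```

The closed-form `beta` only solves half of the problem the paper addresses: the direct method additionally treats the expansion vectors themselves as variables of the (constrained) optimization, rather than picking them from the training set as done above.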

Author(s): Wu, M. and Schölkopf, B. and Bakır, G.
Journal: Journal of Machine Learning Research
Volume: 7
Pages: 603-624
Year: 2006
Month: April

Department(s): Empirical Inference
Bibtex Type: Article (article)

Digital: 0
Language: en
Organization: Max-Planck-Gesellschaft
School: Biologische Kybernetik



@article{Wu2006,
  title = {A Direct Method for Building Sparse Kernel Learning Algorithms},
  author = {Wu, M. and Sch{\"o}lkopf, B. and Bak{\i}r, G.},
  journal = {Journal of Machine Learning Research},
  volume = {7},
  pages = {603--624},
  organization = {Max-Planck-Gesellschaft},
  school = {Biologische Kybernetik},
  month = apr,
  year = {2006}
}