Proceedings chapter
Open access

Margin and radius based multiple Kernel Learning

Published in: Machine Learning and Knowledge Discovery in Databases, Editors Buntine, Wray; Grobelnik, Marko; Mladenić, Dunja & Shawe-Taylor, John, p. 330-343
Presented at Bled (Slovenia), September 7–11, 2009
Publisher: Berlin, Heidelberg: Springer
  • Lecture Notes in Computer Science; 5781
Publication date: 2009

A serious drawback of kernel methods, and Support Vector Machines (SVM) in particular, is the difficulty in choosing a suitable kernel function for a given dataset. One of the approaches proposed to address this problem is Multiple Kernel Learning (MKL) in which several kernels are combined adaptively for a given dataset. Many of the existing MKL methods use the SVM objective function and try to find a linear combination of basic kernels such that the separating margin between the classes is maximized. However, these methods ignore the fact that the theoretical error bound depends not only on the margin, but also on the radius of the smallest sphere that contains all the training instances. We present a novel MKL algorithm that optimizes the error bound taking account of both the margin and the radius. The empirical results show that the proposed method compares favorably with other state-of-the-art MKL methods.
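The abstract refers to the classical radius-margin bound, in which the generalization error of an SVM is controlled by R²/γ² = R²·‖w‖², where γ is the separating margin and R is the radius of the smallest sphere enclosing the training points in feature space. The sketch below (a minimal illustration on hypothetical toy data, not the paper's algorithm) computes both quantities from a kernel matrix: ‖w‖² from the SVM dual coefficients, and an upper bound on R² via the sphere centred at the kernel-space mean.

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical two-class toy data (not from the paper).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (20, 2)), rng.normal(2, 1, (20, 2))])
y = np.array([-1] * 20 + [1] * 20)

K = X @ X.T  # linear kernel matrix; any PSD kernel (or MKL combination) works

# Margin term: train an SVM on the precomputed kernel.
# ||w||^2 = sum_ij (alpha_i y_i)(alpha_j y_j) k(x_i, x_j).
clf = SVC(kernel="precomputed", C=1.0).fit(K, y)
sv = clf.support_
ay = clf.dual_coef_.ravel()  # alpha_i * y_i for the support vectors
w_norm_sq = ay @ K[np.ix_(sv, sv)] @ ay

# Radius term: the sphere centred at the kernel-space mean upper-bounds
# the minimal enclosing sphere, so this R^2 is a (loose) upper bound.
# ||phi(x_i) - mean||^2 = k(x_i,x_i) - (2/n) sum_j k(x_i,x_j) + (1/n^2) sum_jl k(x_j,x_l)
R_sq = np.max(np.diag(K) - 2 * K.mean(axis=1) + K.mean())

# Radius-margin quantity R^2 * ||w||^2, proportional to R^2 / margin^2.
bound = R_sq * w_norm_sq
print(f"R^2 = {R_sq:.3f}, ||w||^2 = {w_norm_sq:.3f}, R^2*||w||^2 = {bound:.3f}")
```

Margin-only MKL methods minimize ‖w‖² over kernel combinations, but since rescaling a kernel changes both R and the margin, ignoring the R² factor can be misleading; the paper's approach optimizes the product.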

  • Learning Kernel Combination
  • Support Vector Machines
  • Convex optimization
Citation (ISO format)
DO, Thi Thanh Huyen et al. Margin and radius based multiple Kernel Learning. In: Machine Learning and Knowledge Discovery in Databases. Bled (Slovenia). Berlin, Heidelberg : Springer, 2009. p. 330–343. (Lecture Notes in Computer Science) doi: 10.1007/978-3-642-04180-8_39

Technical information

Creation: 06/18/2010 9:22:00 AM
First validation: 06/18/2010 9:22:00 AM
Update time: 03/14/2023 3:30:07 PM
Status update: 03/14/2023 3:30:07 PM
Last indexation: 10/18/2023 9:15:25 AM
All rights reserved by Archive ouverte UNIGE and the University of Geneva