An efficient multiple-kernel learning for pattern classification

Authors
Publisher
Elsevier Ltd
Volume
40
Issue
9
Identifiers
DOI: 10.1016/j.eswa.2012.12.057
Keywords
  • Support Vector Machines
  • Multiple-Kernel Learning
  • Semidefinite Programming
Disciplines
  • Computer Science

Abstract

Support vector machines (SVMs) have been broadly applied to classification problems. However, a successful application of SVMs depends heavily on choosing the right type of kernel function and suitable hyperparameter settings for it. Recently, multiple-kernel learning (MKL) algorithms have been developed to deal with these issues by combining different kernels, with the weight of each kernel in the combination obtained through learning. Lanckriet et al. proposed deriving the weights by transforming the learning into a semidefinite programming (SDP) problem under a transduction setting; however, the time and space this method requires are demanding. In this paper, we reformulate the SDP problem under an induction setting and, following comments in the Lanckriet et al. paper, incorporate two strategies to reduce the search complexity of the learning process. The primal and dual forms of the SDP are derived, and the computational complexity is discussed. Experimental results on synthetic and benchmark datasets show that the proposed method performs multiple-kernel learning efficiently.
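
The kernel-combination idea described above can be illustrated with a minimal sketch, assuming Python with NumPy and scikit-learn and a synthetic toy dataset. The sketch forms a convex combination of several base kernel matrices and trains an SVM on the precomputed result; the weights, kernel types, and hyperparameters are fixed arbitrary choices for illustration only, whereas in MKL (and in the SDP formulations discussed in the paper) the weights are learned from data.

    # Illustrative only: fixed kernel weights, not the SDP-learned weights
    # from the paper. Combines base kernels and trains an SVM on the result.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.metrics.pairwise import linear_kernel, polynomial_kernel, rbf_kernel
    from sklearn.svm import SVC

    # Toy data standing in for a benchmark dataset.
    X, y = make_classification(n_samples=200, n_features=10, random_state=0)

    # Candidate base kernels; types and hyperparameters are arbitrary choices.
    base_kernels = [
        linear_kernel(X),
        polynomial_kernel(X, degree=2),
        rbf_kernel(X, gamma=0.1),
    ]

    # Non-negative weights summing to one; MKL would learn these from data.
    mu = np.array([0.2, 0.3, 0.5])
    K = sum(w * Km for w, Km in zip(mu, base_kernels))

    clf = SVC(kernel="precomputed").fit(K, y)
    print("training accuracy:", clf.score(K, y))

With a precomputed combined kernel, classifying new points requires evaluating their kernel values against the training set using the same weights, which is where the distinction between the transduction and induction settings mentioned in the abstract becomes relevant.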
