
Fast kernel classifier construction using orthogonal forward selection to minimise leave-one-out misclassification rate

Subjects
  • Computer Science
  • Engineering
  • Mathematics

X. Hong1, S. Chen2, and C.J. Harris2
1 Department of Cybernetics, University of Reading, Reading RG6 6AY, U.K.
2 School of Electronics and Computer Science, University of Southampton, Southampton SO17 1BJ, U.K.
{sqc, cjh}

Abstract. We propose a simple yet computationally efficient construction algorithm for two-class kernel classifiers. To optimise the classifier's generalisation capability, an orthogonal forward selection procedure selects kernels one by one by directly minimising the leave-one-out (LOO) misclassification rate. It is shown that the computation of the LOO misclassification rate is very efficient owing to orthogonalisation. Examples demonstrate that the proposed algorithm is a viable alternative for constructing sparse two-class kernel classifiers in terms of both performance and computational efficiency.

1 Introduction

Two-class classification problems can be cast in a regression framework that solves for a separating hyperplane between the two classes, with the known class labels used as the desired outputs for supervised model training. Models are usually identified according to some objective criterion. Information-based criteria, such as the AIC [1], often include a penalty term to avoid an oversized model that may overfit the training data set. Parsimonious models are also preferable in engineering applications, since a model's computational cost scales with its complexity. Moreover, a parsimonious model is easier to interpret from the viewpoint of knowledge extraction. Consequently, a practical nonlinear modelling principle is to find the smallest model that generalises well. Model construction techniques that have been widely studied include the support vector machine (SVM), relevance vector machine (RVM), and orth
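The idea sketched above can be illustrated with a toy NumPy implementation: class labels of ±1 are treated as regression targets, kernel columns (one candidate centred on each training sample) are added greedily, and each candidate is scored by the LOO misclassification rate computed from the least-squares hat matrix. This is only an illustrative sketch, not the paper's algorithm: the kernel choice, the width parameter, and the naive refit of every candidate model from scratch are all assumptions here, whereas the paper achieves its efficiency through an orthogonal decomposition that updates the LOO criterion recursively.

```python
import numpy as np

def gaussian_kernel(X, centres, width=1.0):
    # Gaussian kernel matrix between samples and candidate centres
    # (width is an illustrative choice, not taken from the paper).
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def loo_misclassification(P, y):
    # LOO misclassification rate of the least-squares fit y ~ P @ w,
    # using the hat-matrix identity for leave-one-out predictions:
    #   y_loo_i = (y_hat_i - h_ii * y_i) / (1 - h_ii).
    w, *_ = np.linalg.lstsq(P, y, rcond=None)
    h = np.clip(np.diag(P @ np.linalg.pinv(P)), 0.0, 1.0 - 1e-9)
    y_loo = (P @ w - h * y) / (1.0 - h)
    # A sample is misclassified when the LOO prediction has the wrong sign.
    return np.mean(y * y_loo <= 0.0)

def forward_select(K, y, max_terms=5):
    # Greedy forward selection: at each step add the kernel column that
    # most reduces the LOO misclassification rate; stop when none helps.
    chosen, best = [], np.inf
    for _ in range(max_terms):
        cand = None
        for j in range(K.shape[1]):
            if j in chosen:
                continue
            rate = loo_misclassification(K[:, chosen + [j]], y)
            if rate < best:
                best, cand = rate, j
        if cand is None:
            break
        chosen.append(cand)
    return chosen, best
```

On two well-separated Gaussian blobs labelled ±1, a couple of selected kernels typically suffice to drive the LOO misclassification rate to near zero, which is the sparsity the abstract refers to. Note the cost contrast: each candidate here costs a full O(n³) refit, which is exactly the expense the paper's orthogonalisation avoids.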
