Optimizing 0/1 Loss for Perceptrons by Random Coordinate Descent

  • Li, Ling
  • Lin, Hsuan-Tien
Publication Date: Jan 01, 2007
Source: Caltech Authors
The 0/1 loss is an important cost function for perceptrons. Nevertheless, it cannot be easily minimized by most existing perceptron learning algorithms. In this paper, we propose a family of random coordinate descent algorithms that directly minimize the 0/1 loss for perceptrons, and we prove their convergence. Our algorithms are computationally efficient and usually achieve lower 0/1 loss than competing algorithms, which makes them favorable for nonseparable real-world problems. Experiments show that our algorithms are especially useful for ensemble learning, achieving the lowest test error on many complex data sets when coupled with AdaBoost.
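The idea described in the abstract can be illustrated with a minimal sketch: repeatedly pick a random search direction and perform an exact line search along it, exploiting the fact that the 0/1 loss is piecewise constant along any line, so it suffices to test points on either side of each threshold where an example's prediction flips. This is only an illustrative reconstruction under those assumptions, not the authors' exact algorithm; all names below are hypothetical.

```python
import numpy as np

def zero_one_loss(w, X, y):
    # Fraction of examples misclassified by sign(X @ w).
    return np.mean(np.sign(X @ w) != y)

def rcd_perceptron(X, y, epochs=100, seed=0):
    """Sketch of random coordinate (direction) descent on the 0/1 loss.

    Each epoch: draw a Gaussian direction u, enumerate the step sizes
    at which some example's prediction sign flips, and keep the step
    that most reduces the 0/1 loss. (Hypothetical reconstruction.)
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    w[0] = 1.0  # arbitrary nonzero starting weight vector
    best_loss = zero_one_loss(w, X, y)
    for _ in range(epochs):
        u = rng.standard_normal(d)          # random search direction
        proj_w, proj_u = X @ w, X @ u
        mask = np.abs(proj_u) > 1e-12
        # Steps a where sign(proj_w + a * proj_u) flips for some example.
        flips = -proj_w[mask] / proj_u[mask]
        if flips.size == 0:
            continue
        # The loss is piecewise constant in a, so sampling just on either
        # side of every flip point covers every constant segment.
        best_a = 0.0
        for a in np.concatenate([flips - 1e-8, flips + 1e-8]):
            loss = zero_one_loss(w + a * u, X, y)
            if loss < best_loss:
                best_a, best_loss = a, loss
        w = w + best_a * u
    return w, best_loss
```

Because each line search can only keep or lower the current 0/1 loss, the training loss is monotonically non-increasing over epochs; the randomness lies only in which direction is explored.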