Optimizing 0/1 Loss for Perceptrons by Random Coordinate Descent

Source: Caltech Authors

Abstract

The 0/1 loss is an important cost function for perceptrons. Nevertheless, it cannot be easily minimized by most existing perceptron learning algorithms. In this paper, we propose a family of random coordinate descent algorithms that directly minimize the 0/1 loss for perceptrons, and we prove their convergence. Our algorithms are computationally efficient and usually achieve the lowest 0/1 loss among competing algorithms. These advantages make them well suited to nonseparable real-world problems. Experiments show that our algorithms are especially useful for ensemble learning, achieving the lowest test error on many complex data sets when coupled with AdaBoost.
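The abstract only sketches the method, but the key observation behind any coordinate descent on the 0/1 loss is that the loss of w + a*v along a fixed direction v is piecewise constant in a: it can change only at the finitely many breakpoints where some example's activation (w + a*v)·x_i crosses zero, so the one-dimensional subproblem can be solved exactly by examining those breakpoints. Below is a minimal Python sketch built on that observation. It is not the paper's exact algorithm: the direction sampling, the line search, and all efficiency considerations are simplified, and every name here (random_coordinate_descent, zero_one_loss, the eps probe) is illustrative.

```python
import numpy as np

def zero_one_loss(w, X, y):
    """Count training examples misclassified by the perceptron sign(X @ w)."""
    return int(np.sum(np.sign(X @ w) != y))

def random_coordinate_descent(X, y, n_iters=100, seed=None):
    """Greedily reduce the 0/1 loss via line searches along random directions.

    X : (n, d) array of inputs (with a constant bias column appended).
    y : (n,) array of labels in {-1, +1}.
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    w = rng.standard_normal(d)              # random initial weight vector
    eps = 1e-8                              # probe just past each breakpoint
    for _ in range(n_iters):
        v = rng.standard_normal(d)          # random search direction
        proj = X @ v
        movable = np.abs(proj) > 1e-12      # examples whose sign can flip
        # Breakpoints: step sizes a where (w + a*v) . x_i crosses zero.
        bp = -(X[movable] @ w) / proj[movable]
        # The 0/1 loss is piecewise constant in a, so probing a point on
        # either side of every breakpoint (plus a = 0, i.e. "stay put")
        # covers every attainable loss value along this direction.
        candidates = np.concatenate(([0.0], bp - eps, bp + eps))
        losses = [zero_one_loss(w + a * v, X, y) for a in candidates]
        w = w + candidates[int(np.argmin(losses))] * v
    return w, zero_one_loss(w, X, y)
```

Because a = 0 is always among the candidates, the loss never increases from one iteration to the next. Each iteration above costs O(n^2), since every candidate loss is recomputed from scratch; sorting the breakpoints and sweeping across them once would reduce the line search to roughly O(n log n), which is presumably closer to what an efficient implementation would do.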
