Kernel sparse representation for time series classification

Journal: Information Sciences (ISSN 0020-0255)
Publisher: Elsevier
DOI: 10.1016/j.ins.2014.08.066
Keywords
  • Sparse Representation
  • Dictionary Learning
  • Time Series Classification
  • Kernel Method
Disciplines
  • Computer Science

Abstract

In recent years there has been growing interest in mining time series data. To overcome the adverse influence of time shift, a number of effective elastic matching approaches, such as dynamic time warping (DTW), edit distance with real penalty (ERP), and time warp edit distance (TWED), have been developed within the nearest neighbor classification (NNC) framework, where the distance d(x, Ci) between a test sample x and a specific class Ci is simply defined as the minimum distance between x and the training samples in that class. In many applications, the sparse representation classifier (SRC) has been applied instead, defining d(x, Ci) as the distance of x to a linear combination of the samples in class Ci, and it usually outperforms NNC in terms of classification accuracy. However, due to time shift, a linear combination of several time series is generally meaningless and may result in poor classification performance. In this paper, a family of Gaussian elastic matching kernels is introduced to deal with the problems of time shift and nonlinear representation, so that a linear combination of time series can be carried out in the implicit kernel space. A kernel sparse representation learning framework for time series classification is then proposed. To improve computational efficiency and classification performance, both unsupervised and supervised dictionary learning techniques are developed by extending the KSVD and label consistent KSVD algorithms. Experimental results show that the proposed methods generally outperform state-of-the-art methods in terms of classification accuracy.
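For concreteness, the class distances contrasted above can be written as follows. This is an illustrative sketch, not notation taken from the paper: A_i stacks the training samples of class C_i as columns, d_e denotes a generic elastic distance (DTW, ERP, or TWED), \lambda is a sparsity weight, and \sigma a kernel bandwidth.

d_{\mathrm{NNC}}(x, C_i) = \min_{y \in C_i} d_e(x, y)

d_{\mathrm{SRC}}(x, C_i) = \lVert x - A_i \hat{\alpha}_i \rVert_2, \qquad \hat{\alpha}_i = \arg\min_{\alpha} \lVert x - A_i \alpha \rVert_2^2 + \lambda \lVert \alpha \rVert_1

A Gaussian elastic matching kernel of the form

k(x, y) = \exp\left( -\frac{d_e(x, y)^2}{2\sigma^2} \right)

lets the combination A_i \alpha be formed over the implicitly mapped samples \varphi(x) rather than over the raw, time-shifted series.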
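The Python sketch below illustrates the kernel-space classification idea on toy data. It is a minimal sketch under stated assumptions, not the paper's algorithm: ridge (l2) regularization stands in for l1 sparse coding so that the kernel coefficients have a closed form, KSVD/LC-KSVD dictionary learning is omitted, and all function names are hypothetical.

# Sketch: Gaussian-DTW kernel plus a kernelized class-representation residual.
import numpy as np

def dtw(a, b):
    # Classic O(len(a) * len(b)) dynamic time warping distance.
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def gaussian_elastic_kernel(a, b, sigma=1.0):
    # Gaussian kernel on top of an elastic distance (DTW here; ERP or TWED
    # would slot in the same way). Not guaranteed positive semidefinite,
    # since DTW violates the triangle inequality.
    d = dtw(a, b)
    return np.exp(-d * d / (2.0 * sigma * sigma))

def class_residual(x, class_samples, sigma=1.0, lam=1e-3):
    # Squared feature-space distance from phi(x) to the regularized span of
    # the class's implicitly mapped samples:
    #   ||phi(x) - Phi*alpha||^2 = k(x,x) - 2*alpha.kx + alpha.K.alpha,
    # with alpha = (K + lam*I)^-1 kx; the ridge term also stabilizes the
    # solve when the kernel matrix is indefinite.
    K = np.array([[gaussian_elastic_kernel(u, v, sigma) for v in class_samples]
                  for u in class_samples])
    kx = np.array([gaussian_elastic_kernel(x, u, sigma) for u in class_samples])
    alpha = np.linalg.solve(K + lam * np.eye(len(class_samples)), kx)
    return 1.0 - 2.0 * (alpha @ kx) + alpha @ K @ alpha  # k(x,x) = 1 for Gaussian

# Toy usage: two classes of short 1-D series; classify by minimum residual.
rng = np.random.default_rng(0)
class_a = [np.sin(np.linspace(0, 2 * np.pi, 30) + s) for s in rng.uniform(0, 1, 5)]
class_b = [np.cos(np.linspace(0, 4 * np.pi, 30) + s) for s in rng.uniform(0, 1, 5)]
test = np.sin(np.linspace(0, 2 * np.pi, 30) + 0.5)
residuals = [class_residual(test, c) for c in (class_a, class_b)]
print("predicted class:", int(np.argmin(residuals)))  # expected: 0 (sine class)

Replacing NNC's minimum-distance rule with this minimum-residual rule over kernel coefficients mirrors the SRC-versus-NNC contrast in the abstract; the paper's framework additionally learns a compact dictionary in place of the full set of class samples.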
