On the relation between discriminant analysis and mutual information for supervised linear feature extraction

Authors
Journal: Pattern Recognition (ISSN: 0031-3203)
Publisher: Elsevier
Publication Date
Volume: 37
Issue: 5
Identifiers
DOI: 10.1016/j.patcog.2003.12.002
Keywords
  • Linear Feature Extraction
  • Linear Discriminant Analysis
  • Heteroscedastic Discriminant Analysis
  • Maximization Of Mutual Information
  • Bayes Error
  • Negentropy
Disciplines
  • Communication

Abstract

This paper provides a unifying view of three discriminant linear feature extraction methods: linear discriminant analysis, heteroscedastic discriminant analysis, and maximization of mutual information. We propose a model-independent reformulation of the criteria related to these three methods that stresses their similarities and elucidates their differences. Based on assumptions about the probability distribution of the classification data, we obtain sufficient conditions under which two or more of the above criteria coincide. It is shown that these conditions also suffice for Bayes optimality of the criteria. Our approach results in an information-theoretic derivation of linear discriminant analysis and heteroscedastic discriminant analysis. Finally, regarding linear discriminant analysis, we discuss its relation to multidimensional independent component analysis and derive suboptimality bounds based on information theory.
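For orientation, the sketch below illustrates the first of the three criteria the abstract names: classical Fisher linear discriminant analysis used as a supervised linear feature extractor, where projection directions come from the generalized eigenvalue problem on between-class and within-class scatter matrices. This is a minimal, hypothetical example assuming the textbook LDA criterion; it does not reproduce the paper's model-independent reformulation, its heteroscedastic extension, or its information-theoretic derivation, and all names are illustrative.

```python
import numpy as np

def lda_projection(X, y, n_components):
    """Fisher LDA: find directions maximizing the ratio of between-class
    to within-class scatter via the generalized eigenvalue problem."""
    classes = np.unique(y)
    n_features = X.shape[1]
    mean_total = X.mean(axis=0)

    S_w = np.zeros((n_features, n_features))  # within-class scatter
    S_b = np.zeros((n_features, n_features))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mean_c = Xc.mean(axis=0)
        S_w += (Xc - mean_c).T @ (Xc - mean_c)
        diff = (mean_c - mean_total).reshape(-1, 1)
        S_b += len(Xc) * (diff @ diff.T)

    # Solve S_w^{-1} S_b w = lambda w and keep the leading eigenvectors.
    eigvals, eigvecs = np.linalg.eig(np.linalg.solve(S_w, S_b))
    order = np.argsort(eigvals.real)[::-1]
    W = eigvecs[:, order[:n_components]].real
    return W  # columns are the linear feature extraction directions

# Usage on synthetic two-class data (hypothetical example).
rng = np.random.default_rng(0)
X0 = rng.normal(loc=[0.0, 0.0, 0.0], scale=1.0, size=(100, 3))
X1 = rng.normal(loc=[3.0, 1.0, 0.0], scale=1.0, size=(100, 3))
X = np.vstack([X0, X1])
y = np.array([0] * 100 + [1] * 100)
W = lda_projection(X, y, n_components=1)
Z = X @ W  # extracted one-dimensional discriminant feature
```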
