Issue No. 9, September 2006 (Vol. 28)
pp. 1385-1392
ABSTRACT
A classification system typically consists of both a feature extractor (preprocessor) and a classifier. These two components can be trained either independently or simultaneously. The former option has an implementation advantage, since the extractor need only be trained once for use with any classifier, whereas the latter has the advantage that it can minimize classification error directly. Certain criteria, such as Minimum Classification Error, are better suited to simultaneous training, whereas other criteria, such as Mutual Information, are amenable to training the feature extractor either independently or simultaneously. Herein, an information-theoretic criterion is introduced and evaluated for training the extractor independently of the classifier. The proposed method uses nonparametric estimation of Rényi's entropy to train the extractor by maximizing an approximation of the mutual information between the class labels and the output of the feature extractor. The evaluations show that the proposed method, even though it uses independent training, performs at least as well as three feature extraction methods that train the extractor and classifier simultaneously.
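For illustration, the following is a minimal Python sketch of this family of techniques, assuming a linear extractor z = Wx, a Gaussian Parzen (kernel) density estimate, and the Euclidean-distance form of quadratic mutual information, which, like Rényi's quadratic entropy, reduces to sums of pairwise Gaussian evaluations. The function names (pairwise_gauss_sum, quadratic_mi, train_extractor), the kernel size sigma2, and the use of a general-purpose SciPy optimizer with numerical gradients are conveniences of the sketch, not the paper's actual estimator or training procedure.

import numpy as np
from scipy.optimize import minimize

def pairwise_gauss_sum(A, B, sigma2):
    # Sum over all pairs of G(a_i - b_j; 2*sigma2*I), the "information
    # potential" that appears when Parzen density estimates are plugged
    # into quadratic (Renyi-style) information measures.
    d = A.shape[1]
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return (4.0 * np.pi * sigma2) ** (-d / 2.0) * np.exp(-sq_dists / (4.0 * sigma2)).sum()

def quadratic_mi(Z, y, sigma2):
    # Nonparametric estimate of the (Euclidean-distance) quadratic mutual
    # information between projected samples Z and discrete labels y.
    y = np.asarray(y)
    n = len(Z)
    v_all = pairwise_gauss_sum(Z, Z, sigma2) / n**2
    v_in = v_btw = prior_sq = 0.0
    for c in np.unique(y):
        Zc = Z[y == c]
        p_c = len(Zc) / n
        v_in += pairwise_gauss_sum(Zc, Zc, sigma2) / n**2
        v_btw += p_c * pairwise_gauss_sum(Zc, Z, sigma2) / n**2
        prior_sq += p_c**2
    return v_in - 2.0 * v_btw + prior_sq * v_all

def train_extractor(X, y, n_out, sigma2=0.25, seed=0):
    # Fit a linear feature extractor z = W x by maximizing the MI estimate;
    # the optimizer approximates gradients numerically for brevity.
    rng = np.random.default_rng(seed)
    n_in = X.shape[1]
    w0 = rng.standard_normal(n_out * n_in)

    def neg_mi(w_flat):
        W = w_flat.reshape(n_out, n_in)
        return -quadratic_mi(X @ W.T, y, sigma2)

    result = minimize(neg_mi, w0, method="BFGS")
    return result.x.reshape(n_out, n_in)

For example, W = train_extractor(X, y, n_out=2) would learn a two-dimensional projection X @ W.T that approximately maximizes the estimated mutual information with the labels y; the learned features can then be passed to any classifier, which is the implementation advantage of independent training noted above.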
INDEX TERMS
Feature extraction, information theory, classification, nonparametric statistics.
CITATION
Kenneth E. Hild, Deniz Erdogmus, Kari Torkkola, and Jose C. Principe, "Feature Extraction Using Information-Theoretic Learning," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 28, no. 9, pp. 1385-1392, September 2006, doi:10.1109/TPAMI.2006.186.