Issue No. 9 - September 2006 (vol. 28)
Kenneth E. Hild
Deniz Erdogmus
Jose C. Principe
Kari Torkkola
A classification system typically consists of both a feature extractor (preprocessor) and a classifier. These two components can be trained either independently or simultaneously. The former option has an implementation advantage, since the extractor need only be trained once for use with any classifier, whereas the latter has a performance advantage, since it can minimize classification error directly. Certain criteria, such as Minimum Classification Error, are better suited to simultaneous training, whereas other criteria, such as Mutual Information, are amenable to training the feature extractor either independently or simultaneously. Herein, an information-theoretic criterion is introduced and evaluated for training the extractor independently of the classifier. The proposed method uses nonparametric estimation of Rényi's entropy to train the extractor by maximizing an approximation of the mutual information between the class labels and the output of the feature extractor. The evaluations show that the proposed method, even though it uses independent training, performs at least as well as three feature extraction methods that train the extractor and classifier simultaneously.
Feature extraction, information theory, classification, nonparametric statistics.
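As a rough illustration of the idea behind the abstract, the sketch below estimates a quadratic (Euclidean-distance) mutual information between a one-dimensional projection and discrete class labels using Parzen windows with Gaussian kernels, then grid-searches the projection angle that maximizes it. This is a minimal sketch in the spirit of Parzen-based information-theoretic learning, not the paper's exact estimator or optimization procedure; the function names (`qmi_ed`, `gauss`), the kernel width `sigma`, and the toy data are illustrative assumptions.

```python
import numpy as np

def gauss(d2, sigma2):
    # 1-D Gaussian kernel evaluated at squared distances d2, variance sigma2
    return np.exp(-d2 / (2.0 * sigma2)) / np.sqrt(2.0 * np.pi * sigma2)

def qmi_ed(y, labels, sigma=0.5):
    """Plug-in estimate of the quadratic (Euclidean-distance) mutual
    information between a 1-D projection y and discrete class labels.
    Convolving two Parzen kernels doubles the variance, hence 2*sigma^2."""
    n = len(y)
    d2 = (y[:, None] - y[None, :]) ** 2
    k = gauss(d2, 2.0 * sigma ** 2)
    classes = np.unique(labels)
    priors = np.array([(labels == c).mean() for c in classes])
    # within-class, marginal, and cross information potentials
    v_in = sum(k[np.ix_(labels == c, labels == c)].sum() for c in classes) / n**2
    v_all = (priors ** 2).sum() * k.sum() / n**2
    v_btw = sum(p * k[labels == c, :].sum()
                for p, c in zip(priors, classes)) / n**2
    # equals the integrated squared difference between the joint and the
    # product of marginals, so it is nonnegative up to roundoff
    return v_in + v_all - 2.0 * v_btw

# toy two-class data, separated only along the first axis
rng = np.random.default_rng(0)
x = np.vstack([rng.normal([-2, 0], 1, (100, 2)),
               rng.normal([2, 0], 1, (100, 2))])
labels = np.array([0] * 100 + [1] * 100)

# grid-search the 1-D projection angle maximizing the mutual-information estimate
angles = np.linspace(0.0, np.pi, 64, endpoint=False)
scores = [qmi_ed(x @ np.array([np.cos(a), np.sin(a)]), labels) for a in angles]
best = angles[int(np.argmax(scores))]
```

On this toy set the maximizing angle lies near the discriminative (first) axis, matching the intuition that the extractor is trained to keep the directions most informative about the class labels; the paper itself optimizes the extractor by gradient ascent rather than grid search.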
K. E. Hild, D. Erdogmus, J. C. Principe, and K. Torkkola, "Feature Extraction Using Information-Theoretic Learning," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 28, no. 9, pp. 1385-1392, Sept. 2006.