Issue No. 9, September 2005 (Vol. 27), pp. 1417-1429
ABSTRACT
We present the Nearest Subclass Classifier (NSC), a classification algorithm that unifies the flexibility of the nearest neighbor classifier with the robustness of the nearest mean classifier. The algorithm is based on the Maximum Variance Cluster algorithm and, as such, belongs to the class of prototype-based classifiers. The variance constraint parameter of the cluster algorithm serves to regularize the classifier, that is, to prevent overfitting. With a low variance constraint value the classifier behaves like the nearest neighbor classifier, and with a high variance constraint value it becomes the nearest mean classifier, with the respective properties. In other words, the number of prototypes ranges from the whole training set down to one per class. In the experiments, we compared the NSC to several other prototype-based methods with respect to classification performance and data set compression ratio. On several data sets, the NSC performed similarly to the k-nearest neighbor classifier, which is a well-established classifier in many domains. The NSC also has favorable storage requirements and classification speed, so it offers a good compromise between classification performance and efficiency.
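The sketch below illustrates the general prototype-based decision rule the abstract describes; it is not the authors' Maximum Variance Cluster algorithm. As an assumption for illustration, subclass prototypes are obtained with a per-class k-means clustering, and the number of clusters per class stands in for the effect of the variance constraint: many small subclasses approach the nearest neighbor classifier, while a single subclass per class is exactly the nearest mean classifier.

# Minimal prototype-based (nearest subclass) classifier sketch.
# Assumption: subclass prototypes come from per-class KMeans, not the
# authors' Maximum Variance Cluster algorithm.
import numpy as np
from sklearn.cluster import KMeans

def fit_prototypes(X, y, n_subclasses=3, random_state=0):
    """Cluster each class separately and keep the cluster means as prototypes."""
    prototypes, proto_labels = [], []
    for c in np.unique(y):
        Xc = X[y == c]
        k = min(n_subclasses, len(Xc))
        km = KMeans(n_clusters=k, n_init=10, random_state=random_state).fit(Xc)
        prototypes.append(km.cluster_centers_)
        proto_labels.extend([c] * k)
    return np.vstack(prototypes), np.array(proto_labels)

def predict(X, prototypes, proto_labels):
    """Label each sample with the class of its nearest prototype (Euclidean distance)."""
    d = np.linalg.norm(X[:, None, :] - prototypes[None, :, :], axis=2)
    return proto_labels[np.argmin(d, axis=1)]

With n_subclasses=1 this reduces to the nearest mean classifier; letting the number of subclasses grow to the class size makes every training sample a prototype, so the rule coincides with the 1-nearest neighbor classifier. The compression ratio reported in the paper corresponds to the number of stored prototypes relative to the full training set.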
INDEX TERMS
Classification, regularization, cross-validation, prototype selection.
CITATION
Cor J. Veenman and Marcel J.T. Reinders, "The Nearest Subclass Classifier: A Compromise between the Nearest Mean and Nearest Neighbor Classifier," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, no. 9, pp. 1417-1429, September 2005, doi:10.1109/TPAMI.2005.187.