Benchmarking a Reduced Multivariate Polynomial Pattern Classifier
June 2004 (vol. 26 no. 6)
pp. 740-755

Abstract—A novel method using a reduced multivariate polynomial model has been developed for biometric decision fusion, where simplicity and ease of use can be a concern. However, much to our surprise, the reduced model was also found to have good classification accuracy on several commonly used data sets from the Web. In this paper, we extend the single-output model to a multiple-output model to handle multiclass problems. The method is particularly suitable for problems with a small number of features and a large number of examples. The basic component of this polynomial model boils down to constructing new pattern features as sums of the original features and combining these new and original features using power and product terms. A linear regularized least-squares predictor is then built on these constructed features. The number of constructed feature terms grows linearly with the order of the polynomial, rather than according to a power law as for the full multivariate polynomial. The method is simple, amounting to only a few lines of Matlab code. We perform extensive experiments on this reduced model using 42 data sets. Our results compare remarkably well with the best reported results of several commonly used algorithms from the literature. Both the classification accuracy and the efficiency of the reduced model are reported.
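
To make the construction concrete, the sketch below illustrates, in Python/NumPy rather than the paper's Matlab, the kind of reduced polynomial expansion the abstract describes: per-feature power terms, powers of the feature sum, and products of the original features with powers of that sum, followed by a regularized least-squares fit. The exact term layout, function names, and default parameters here are illustrative assumptions, not the authors' precise formulation; the point is that the number of columns grows linearly with the polynomial order for a fixed input dimension.

import numpy as np

def rm_features(X, order):
    """Reduced-polynomial feature expansion for an (n_samples, l) matrix X (illustrative term set)."""
    n, l = X.shape
    s = X.sum(axis=1, keepdims=True)            # sum of the original features
    cols = [np.ones((n, 1))]                    # bias term
    for k in range(1, order + 1):
        cols.append(X ** k)                     # per-feature power terms x_j^k
        cols.append(s ** k)                     # powers of the feature sum
    for k in range(2, order + 1):
        cols.append(X * s ** (k - 1))           # product terms x_j * (sum)^(k-1)
    return np.hstack(cols)                      # column count is linear in the order

def rm_train(X, y, order=3, reg=1e-4):
    """Regularized (ridge) least-squares estimate of the weight vector."""
    P = rm_features(X, order)
    A = P.T @ P + reg * np.eye(P.shape[1])
    return np.linalg.solve(A, P.T @ y)

def rm_predict(X, w, order=3):
    return rm_features(X, order) @ w

# Toy two-class usage: targets coded 0/1, decision threshold at 0.5.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 4)), rng.normal(1.5, 1.0, (50, 4))])
y = np.concatenate([np.zeros(50), np.ones(50)])
w = rm_train(X, y, order=3)
print("training accuracy:", ((rm_predict(X, w, order=3) > 0.5) == y).mean())

For multiclass problems, the same expansion can be reused with one column of 0/1 targets per class and an arg-max over the predicted outputs, in the spirit of the multiple-output extension the abstract mentions.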

[1] A.K. Jain, R.P.W. Duin, and J. Mao, "Statistical Pattern Recognition: A Review," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 22, no. 1, pp. 4-37, Jan. 2000.
[2] R.O. Duda, P.E. Hart, and D.G. Stork, Pattern Classification, second ed. New York: John Wiley & Sons, Inc., 2001.
[3] T. Poggio and F. Girosi, "Networks for Approximation and Learning," Proc. IEEE, vol. 78, pp. 1481-1497, 1990.
[4] C.M. Bishop, Neural Networks for Pattern Recognition. New York: Oxford Univ. Press Inc., 1995.
[5] Y. Shin and J. Ghosh, "Ridge Polynomial Networks," IEEE Trans. Neural Networks, vol. 6, no. 3, pp. 610-622, 1995.
[6] Y. Shin and J. Ghosh, "The Pi-Sigma Network: An Efficient Higher-Order Neural Network for Pattern Classification and Function Approximation," Proc. Int'l Joint Conf. Neural Networks, vol. 1, pp. 13-18, July 1991.
[7] W. Campbell, K. Torkkola, and S. Balakrishnan, "Dimension Reduction Techniques for Training Polynomial Networks," Proc. Int'l Conf. Machine Learning, June 2000.
[8] P. Chaudhuri, M.-C. Huang, W.-Y. Loh, and R. Yao, "Piecewise-Polynomial Regression Trees," Statistica Sinica, vol. 4, pp. 143-167, 1994.
[9] K.-A. Toh, W.-Y. Yau, and X. Jiang, "A Reduced Multivariate Polynomials Model for Multi-Modal Biometrics and Classifiers Fusion," IEEE Trans. Circuits and Systems for Video Technology, pending publication, 2004.
[10] C.L. Blake and C.J. Merz, UCI Repository of Machine Learning Databases, Dept. of Information and Computer Sciences, Univ. of Calif., Irvine, http://www.ics.uci.edu/~mlearn/MLRepository.html, 1998.
[11] W. Lam, C.-K. Keung, and D. Liu, "Discovering Useful Concept Prototypes for Classification Based on Filtering and Abstraction," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 24, no. 8, pp. 1075-1090, Aug. 2002.
[12] T.-S. Lim, W.-Y. Loh, and Y.-S. Shih, "A Comparison of Prediction Accuracy, Complexity, and Training Time of Thirty-Three Old and New Classification Algorithms," Machine Learning, vol. 40, no. 3, pp. 203-228, 2000.
[13] J. Neter, M.H. Kutner, C.J. Nachtsheim, and W. Wasserman, Applied Linear Regression Models, third ed. Chicago: Irwin, 1996.
[14] W.R. Wade, An Introduction to Analysis, second ed. Upper Saddle River, N.J.: Prentice Hall, 2000.
[15] B.E. Boser, I.M. Guyon, and V.N. Vapnik, "A Training Algorithm for Optimal Margin Classifiers," Proc. Fifth Ann. Workshop Computational Learning Theory, pp. 144-152, 1992.
[16] B. Schölkopf and A.J. Smola, Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond. Cambridge, Mass.: MIT Press, 2002.
[17] S. Raudys, "Evolution and Generalization of a Single Neurone: I. Single-Layer Perceptron as Seven Statistical Classifiers," Neural Networks, vol. 11, pp. 283-296, 1998.
[18] S. Raudys, "Evolution and Generalization of a Single Neurone: II. Complexity of Statistical Classifiers and Sample Size Considerations," Neural Networks, vol. 11, pp. 297-313, 1998.
[19] L.I. Kuncheva, "A Theoretical Study on Six Classifier Fusion Strategies," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 24, no. 2, pp. 281-286, Feb. 2002.
[20] K. Hornik, M. Stinchcombe, and H. White, "Multi-Layer Feedforward Networks Are Universal Approximators," Neural Networks, vol. 2, no. 5, pp. 359-366, 1989.
[21] G. Cybenko, "Approximations by Superpositions of a Sigmoidal Function," Math. Control, Signals, and Systems, vol. 2, pp. 303-314, 1989.
[22] P. Brazdil, Statlog Datasets, Inst. for Social Research at York Univ., http://www.liacc.up.pt/ML/statlog/datasets.html, 1999.
[23] L. Pederson and S. Bull, StatLib Case Studies in Biometry, Inst. for Social Research at York Univ., http://lib.stat.cmu.edu/datasets/csb/, 1988.
[24] D. Precup and P.E. Utgoff, "Classification Using Φ-Machines and Constructive Function Approximation," Machine Learning, 2003.
[25] A. Törn and A. Žilinskas, Global Optimization, Lecture Notes in Computer Science, Berlin: Springer-Verlag, 1989.
[26] K.-A. Toh, "Deterministic Global Optimization for FNN Training," IEEE Trans. Systems, Man, and Cybernetics, Part B, vol. 33, no. 6, pp. 977-983, 2003.
[27] The MathWorks, Matlab and Simulink, http://www.mathworks.com/, 2003.
[28] K.-A. Toh, "Global Optimization by Monotonic Transformation," Computational Optimization and Applications, vol. 23, pp. 77-99, Oct. 2002.

Index Terms:
Pattern classification, parameter estimation, pattern recognition, multivariate polynomials, machine learning.
Citation:
Kar-Ann Toh, Quoc-Long Tran, Dipti Srinivasan, "Benchmarking a Reduced Multivariate Polynomial Pattern Classifier," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 26, no. 6, pp. 740-755, June 2004, doi:10.1109/TPAMI.2004.3