IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 30, no. 4, April 2008



pp: 647-657

ABSTRACT

We present an algorithm which provides the one-dimensional subspace where the Bayes error is minimized for the C-class problem with homoscedastic Gaussian distributions. Our main result shows that the set of possible one-dimensional spaces v, for which the order of the projected class means is identical, defines a convex region with an associated convex Bayes error function g(v). This allows for the minimization of the error function using standard convex optimization algorithms. Our algorithm is then extended to the minimization of the Bayes error in the more general case of heteroscedastic distributions. This is done by means of an appropriate kernel mapping function. This result is further extended to obtain the d-dimensional solution for any given d, by iteratively applying our algorithm to the null space of the (d-1)-dimensional solution. We also show how this result can be used to improve upon the outcomes provided by existing algorithms, and derive a low-computational-cost linear approximation. Extensive experimental validations are provided to demonstrate the use of these algorithms in classification, data analysis, and visualization.
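To make the notion of a projected Bayes error concrete, the following is a minimal Python sketch for the special two-class, equal-prior, homoscedastic case: projecting the data onto a direction v turns each class into a one-dimensional Gaussian with mean m_i·v and variance vᵀΣv, and the Bayes error is Φ(-Δ/2), where Δ is the class separation in standard-deviation units. The helper name `projected_bayes_error` is illustrative; this is not the paper's C-class convex formulation, its kernel extension, or its iterative d-dimensional procedure.

```python
import math

def projected_bayes_error(v, m1, m2, Sigma):
    """Bayes error of two equal-prior homoscedastic Gaussian classes
    after projection onto direction v (illustrative helper, not the
    paper's general algorithm)."""
    n = len(v)
    # Projected mean separation: |(m2 - m1) . v|
    num = abs(sum((m2[i] - m1[i]) * v[i] for i in range(n)))
    # Variance of the projected data: v^T Sigma v
    var = sum(v[i] * Sigma[i][j] * v[j] for i in range(n) for j in range(n))
    delta = num / math.sqrt(var)  # separation in std-dev units
    # Phi(-delta/2) via the error function: Phi(x) = (1 + erf(x/sqrt(2)))/2
    return 0.5 * (1.0 + math.erf((-delta / 2.0) / math.sqrt(2.0)))

m1, m2 = [0.0, 0.0, 0.0], [2.0, 0.0, 0.0]
Sigma = [[1.0 if i == j else 0.0 for j in range(3)] for i in range(3)]

# With identity covariance, the error-minimizing direction is the mean
# difference itself (it coincides with the Fisher/LDA direction here).
err_opt = projected_bayes_error([2.0, 0.0, 0.0], m1, m2, Sigma)  # Phi(-1)
err_off = projected_bayes_error([1.0, 1.0, 1.0], m1, m2, Sigma)  # worse v
```

In this toy setting `err_opt` equals Φ(-1) ≈ 0.1587, and any direction not aligned with the mean difference, such as `[1, 1, 1]`, yields a strictly larger error.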

INDEX TERMS

Linear discriminant analysis, feature extraction, Bayes optimal, convex optimization, pattern recognition, data mining, data visualization

CITATION

Onur C. Hamsici, Aleix M. Martinez, "Bayes Optimality in Linear Discriminant Analysis," *IEEE Transactions on Pattern Analysis & Machine Intelligence*, vol. 30, no. 4, pp. 647-657, April 2008, doi: 10.1109/TPAMI.2007.70717