Bibliographic References  
Carlos E. Pedreira, "Learning Vector Quantization with Training Data Selection," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 28, no. 1, pp. 157-162, Jan. 2006, doi: 10.1109/TPAMI.2006.14.

Index Terms: learning vector quantization (LVQ), pattern classification, clustering, data selection, neural networks.
[1] A. Gersho, “Asymptotically Optimal Block Quantization,” IEEE Trans. Information Theory, vol. 25, pp. 373-380, 1979.
[2] P. Zador, “Asymptotic Quantization Error of Continuous Signals and the Quantization Dimension,” IEEE Trans. Information Theory, vol. 28, pp. 139-149, 1982.
[3] R.A. Johnson and D.W. Wichern, Applied Multivariate Statistical Analysis. Prentice Hall, 1998.
[4] T. Kohonen, “Self-Organized Formation of Topologically Correct Feature Maps,” Biological Cybernetics, vol. 43, pp. 59-69, 1982.
[5] T. Kohonen, Self-Organizing Maps, third ed. Springer, 2001.
[6] T. Kohonen, “An Introduction to Neural Computing,” Neural Networks, vol. 1, pp. 3-16, 1988.
[7] R.O. Duda, P.E. Hart, and D.G. Stork, Pattern Classification, second ed. John Wiley, 2001.
[8] L. Bottou, “Stochastic Learning,” Lecture Notes in Artificial Intelligence, vol. 3176, pp. 146-168, 2004.
[9] T. Kohonen, G. Barna, and R. Chrisley, “Statistical Pattern Recognition with Neural Networks: Benchmarking Studies,” Proc. IEEE Int'l Conf. Neural Networks, 1988.
[10] A.S. Sato and K. Yamada, “Generalized Learning Vector Quantization,” Advances in Neural Information Processing Systems, G. Tesauro, D. Touretzky, and T. Leen, eds., vol. 7, pp. 423-429, 1995.
[11] B. Hammer and T. Villmann, “Generalized Relevance Learning Vector Quantization,” Neural Networks, vol. 15, pp. 1059-1068, 2002.
[12] A.K. Qin and P.N. Suganthan, “Initialization Insensitive LVQ Algorithm Based on Cost-Function Adaptation,” Pattern Recognition, 2005.
[13] M.T. Vakil-Baghmisheh and N. Pavešić, “Premature Clustering Phenomenon and New Training Algorithms for LVQ,” Pattern Recognition, vol. 36, no. 8, pp. 1901-1912, 2003.
[14] S. Seo and K. Obermayer, “Soft Learning Vector Quantization,” Neural Computation, vol. 15, no. 7, pp. 1589-1604, 2003.
[15] H.H. Harman, Modern Factor Analysis. Univ. of Chicago Press, 1967.
[16] E. Oja, “Principal Component Analysis,” The Handbook of Brain Theory and Neural Networks, M. Arbib, ed., pp. 753-756, MIT Press, 1995.
[17] M. Girolami, Self-Organising Neural Networks: Independent Component Analysis and Blind Source Separation. Springer, 1999.
[18] L. Breiman, J.H. Friedman, R.A. Olshen, and C. Stone, Classification and Regression Trees, Belmont, Calif.: Wadsworth, 1984.
[19] R. Setiono and H. Liu, “Neural Network Feature Selector,” IEEE Trans. Neural Networks, vol. 8, pp. 654-661, 1997.
[20] R. Battiti, “Using Mutual Information for Selecting Features in Supervised Neural Net Learning,” IEEE Trans. Neural Networks, vol. 5, pp. 537-550, 1994.
[21] N. Kwak and C. Choi, “Input Feature Selection for Classification Problems,” IEEE Trans. Neural Networks, vol. 13, no. 1, pp. 143-159, 2002.
[22] I. Gath and A.B. Geva, “Unsupervised Optimal Fuzzy Clustering,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 11, no. 7, pp. 773-781, July 1989.
[23] V. Vapnik, The Nature of Statistical Learning Theory. Springer-Verlag, 1995.
[24] K. Crammer, R. Gilad-Bachrach, and A. Tishby, “Margin Analysis of the LVQ Algorithm,” Proc. 15th Ann. Conf. Neural Information Processing Systems, 2002.
[25] S. Haykin, Neural Networks: A Comprehensive Foundation. Prentice-Hall, 1998.
[26] R.A. Fisher, “The Use of Multiple Measurements in Taxonomic Problems,” Ann. Eugenics, vol. 7, part II, pp. 179-188, 1936.
[27] R. Detrano, A. Janosi, W. Steinbrunn, M. Pfisterer, J. Schmid, S. Sandhu, K. Guppy, S. Lee, and V. Froelicher, “International Application of a New Probability Algorithm for the Diagnosis of Coronary Artery Disease,” Am. J. Cardiology, pp. 304-310, 1989.
[28] W.H. Wolberg and O.L. Mangasarian, “Multisurface Method of Pattern Separation for Medical Diagnosis Applied to Breast Cytology,” Proc. Nat'l Academy of Sciences, USA, vol. 87, pp. 9193-9196, 1990.
[29] O.L. Mangasarian, “Multisurface Method of Pattern Separation,” IEEE Trans. Information Theory, vol. 14, no. 6, pp. 801-807, Nov. 1968.
[30] F. Berzal, J.C. Cubero, F. Cuenca, and M. Martín-Bautista, “On the Quest for Easy-to-Understand Splitting Rules,” Data and Knowledge Eng., vol. 44, no. 1, pp. 31-48, 2003.