
Bibliographic References  
R.T. Peres and C.E. Pedreira, “Generalized Risk Zone: Selecting Observations for Classification,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 31, no. 7, pp. 1331-1337, July 2009.
Keywords: classification, neural networks, observations selection, risk zone, support vector machine.
[1] C.E. Pedreira, “Learning Vector Quantization with Training Data Selection,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 28, no. 1, pp. 157-162, Jan. 2006.
[2] J.C. Principe, D. Xu, and J. Fisher, “Information Theoretic Learning,” Unsupervised Adaptive Filtering, S. Haykin, ed., Wiley, 2000.
[3] R.O. Duda, P.E. Hart, and D.G. Stork, Pattern Classification, second ed. Wiley, 2001.
[4] R. Jensen, “An Information Theoretic Approach to Machine Learning,” Doctor scientiarum dissertation, Faculty of Science, Dept. of Physics, Univ. of Tromso, 2005.
[5] R. Detrano, A. Janosi, W. Steinbrunn, M. Pfisterer, J. Schmid, S. Sandhu, K. Guppy, S. Lee, and V. Froelicher, “International Application of a New Probability Algorithm for the Diagnosis of Coronary Artery Disease,” Am. J. Cardiology, pp. 304-310, 1989.
[6] V.N. Vapnik, Statistical Learning Theory. Wiley, 1998.
[7] C.J.C. Burges, “A Tutorial on Support Vector Machines for Pattern Recognition,” Data Mining and Knowledge Discovery, vol. 2, no. 2, pp. 121-167, 1998.
[8] M. Plutowski and H. White, “Selecting Concise Training Sets from Clean Data,” IEEE Trans. Neural Networks, vol. 4, no. 2, pp. 305-318, Mar. 1993.
[9] J.N. Hwang, J.J. Choi, S. Oh, and R.J. Marks II, “Query-Based Learning Applied to Partially Trained Multi-Layer Perceptrons,” IEEE Trans. Neural Networks, vol. 2, no. 1, pp. 131-136, Jan. 1991.
[10] J.J. Faraway, “Sequential Design for the Nonparametric Regression of Curves and Surfaces,” Proc. 22nd Symp. Interface between Computing Science and Statistics, pp. 104-110, 1990.
[11] T. Kohonen, Self-Organizing Maps, third ed. Springer, 2001.
[12] C.E. Pedreira, L. Macrini, and E.S. Costa, “Input and Data Selection Applied to Heart Disease Diagnosis,” Proc. IEEE-Int'l Neural Network Soc.-European Neural Network Soc. Int'l Joint Conf. Neural Networks, 2005.
[13] P. Mitra and S.K. Pal, “A Probabilistic Active Support Vector Learning Algorithm,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 26, no. 3, pp. 413-418, Mar. 2004.
[14] M. Li and I.K. Sethi, “Confidence-Based Active Learning,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 28, no. 8, pp. 1251-1261, Aug. 2006.
[15] D. Xu, “Energy, Entropy and Information Potential for Neural Computation,” PhD thesis, Univ. of Florida, 1999.
[16] S. Vinga and J. Almeida, “Rényi Continuous Entropy of DNA Sequences,” J. Theoretical Biology, vol. 231, no. 3, pp. 377-388, 2004.
[17] N. Bouguila and D. Ziou, “A Hybrid SEM Algorithm for High-Dimensional Unsupervised Learning Using a Finite Generalized Dirichlet Mixture,” IEEE Trans. Image Processing, vol. 15, no. 9, pp. 2657-2668, 2006.
[18] K. Huang, H. Yang, I. King, M.R. Lyu, and L. Chan, “The Minimum Error Minimax Probability Machine,” J. Machine Learning Research, vol. 5, pp. 1253-1286, 2004.
[19] G. Tutz and H. Binder, “Localized Classification,” Statistics and Computing, vol. 15, pp. 155-166, 2005.
[20] B.W. Silverman, Density Estimation for Statistics and Data Analysis. Chapman and Hall, 1986.
[21] G.T. Toussaint, “Geometric Proximity Graphs for Improving Nearest Neighbor Methods in Instance-Based Learning and Data Mining,” Int'l J. Computational Geometry and Applications, vol. 15, no. 2, pp. 101-150, 2005.
[22] D.W. Aha, D. Kibler, and M. Albert, “Instance-Based Learning Algorithms,” Machine Learning, vol. 6, pp. 37-66, 1991.
[23] D.R. Wilson and T.R. Martinez, “Reduction Techniques for Instance-Based Learning Algorithms,” Machine Learning, vol. 38, pp. 257-286, 2000.
[24] Z.Q. Hong and J.Y. Yang, “Optimal Discriminant Plane for a Small Number of Samples and Design Method of Classifier on the Plane,” Pattern Recognition, vol. 24, no. 4, pp. 317-324, 1991.
[25] L. Breiman, J.H. Friedman, R.A. Olshen, and C.J. Stone, Classification and Regression Trees. Wadsworth, 1984.