
Bibliographic References  
Steven Salzberg, Arthur L. Delcher, David Heath, and Simon Kasif, "Best-Case Results for Nearest-Neighbor Learning," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 17, no. 6, pp. 599-608, June 1995.
[1] B.V. Dasarathy, Nearest Neighbor (NN) Norms: NN Pattern Classification Techniques. Los Alamitos, Calif.: IEEE CS Press, 1991.
[2] D.W. Aha, D. Kibler, and M.K. Albert, "Instance-Based Learning Algorithms," Machine Learning, vol. 6, pp. 37-66, 1991.
[3] S. Salzberg, Learning with Nested Generalized Exemplars. Norwell, Mass.: Kluwer Academic Publishers, 1990.
[4] S. Cost and S. Salzberg, "A Weighted Nearest Neighbor Algorithm for Learning with Symbolic Features," Machine Learning, vol. 10, no. 1, pp. 57-78, Jan. 1993.
[5] T.M. Cover and P. Hart, "Nearest Neighbor Pattern Classification," IEEE Trans. Information Theory, pp. 21-27, 1967.
[6] P.A. Devijver, "An Overview of Asymptotic Properties of NN Rules," Pattern Recognition in Practice. New York: Elsevier Science Publishers B.V., pp. 343-350, 1980.
[7] T. Cover, "Geometric and Statistical Properties of Systems of Linear Inequalities," IEEE Trans. Computers, vol. 14, pp. 326-334, 1965.
[8] M. Minsky and S. Papert, Perceptrons. Cambridge, Mass.: MIT Press, 1969.
[9] S. Salzberg, A. Delcher, D. Heath, and S. Kasif, "Learning with a Helpful Teacher," Tech. Rep. 90/14, Dept. of Computer Science, Johns Hopkins Univ., 1990 (revised 1992).
[10] S. Salzberg, A. Delcher, D. Heath, and S. Kasif, "Learning with a Helpful Teacher," Proc. 12th Int'l Joint Conf. Artificial Intelligence, pp. 705-711, Sydney, Australia, Aug. 1991, Morgan Kaufmann.
[11] E.F. Fix and J. Hodges, "Discriminatory Analysis: Small Sample Performance," Tech. Rep. Project 21-49-004, Report No. 11, USAF School of Aviation Medicine, Randolph Field, Tex., Aug. 1952.
[12] L. Devroye, "Automatic Pattern Recognition: A Study of the Probability of Error," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 10, no. 4, pp. 530-543, 1988.
[13] S. Salzberg, "A Nearest Hyperrectangle Learning Method," Machine Learning, vol. 6, pp. 251-276, 1991.
[14] D. Angluin, "Learning Regular Sets from Queries and Counterexamples," Information and Computation, vol. 75, pp. 87-106, 1987.
[15] S. Goldman and M. Kearns, "On the Complexity of Teaching," Proc. Fourth Ann. Workshop Computational Learning Theory, pp. 303-314, Santa Cruz, Calif., Aug. 1991, Morgan Kaufmann.
[16] K. Romanik and S. Salzberg, "Testing Orthogonal Shapes," Computational Geometry: Theory and Applications, vol. 5, pp. 33-49, 1995.
[17] P. Hart, "The Condensed Nearest Neighbor Rule," IEEE Trans. Information Theory, vol. 14, no. 3, May 1968.
[18] C. Swonger, "Sample Set Condensation for a Condensed Nearest Neighbor Decision Rule for Pattern Recognition," Frontiers of Pattern Recognition, S. Watanabe, ed. New York: Academic Press, pp. 511-519, 1972.
[19] D. Wilson, "Asymptotic Properties of Nearest Neighbor Rules Using Edited Data," IEEE Trans. Systems, Man, and Cybernetics, vol. 2, no. 3, pp. 408-421, July 1972.
[20] C.L. Chang, "Finding Prototypes for Nearest Neighbor Classifiers," IEEE Trans. Computers, vol. 23, no. 11, pp. 1,179-1,184, Nov. 1974.
[21] G.L. Ritter, H.B. Woodruff, and S.R. Lowry, "An Algorithm for a Selective Nearest Neighbor Decision Rule," IEEE Trans. Information Theory, vol. 21, no. 6, pp. 665-669, 1975.
[22] G. Toussaint, B. Bhattacharya, and R. Poulsen, "The Application of Voronoi Diagrams to Nonparametric Decision Rules," Computer Science and Statistics: Proc. 16th Symp. on the Interface, L. Billard, ed., pp. 97-108, New York: Elsevier Science Publishers, 1984.
[23] B. Bhattacharya, R. Poulsen, and G. Toussaint, "Application of Proximity Graphs to Editing Nearest Neighbor Decision Rules," Tech. Rep. SOCS 92.19, School of Computer Science, McGill University, Montreal, Dec. 1992.
[24] F.P. Preparata and M.I. Shamos, Computational Geometry. Springer-Verlag, 1985.
[25] B. Baker, E. Grosse, and C. Rafferty, "Nonobtuse Triangulation of Polygons," Discrete and Computational Geometry, vol. 3, pp. 147-168, 1988.
[26] M. Bern and D. Eppstein, "Polynomial-Size Nonobtuse Triangulation of Polygons," Proc. Seventh Ann. Symp. Computational Geometry, pp. 342-350, New York, 1991.
[27] M. Bern, S. Mitchell, and J. Ruppert, "Linear-Size Nonobtuse Triangulation of Polygons," Proc. 10th Ann. ACM Symp. Computational Geometry, pp. 221-230, Stony Brook, N.Y., 1994.
[28] D. Heath, A Geometric Framework for Machine Learning, PhD thesis, Johns Hopkins Univ., 1992.
[29] M. Bern, H. Edelsbrunner, D. Eppstein, S. Mitchell, and S. Tan, "Edge Insertion for Optimal Triangulations," LATIN '92: First Latin American Symp. Theoretical Informatics, I. Simon, ed., pp. 46-60, Berlin: Springer-Verlag, 1992.
[30] J.R. Quinlan, C4.5: Programs for Machine Learning. San Mateo, Calif.: Morgan Kaufmann, 1992.