
L. Saitta and F. Bergadano, "Pattern Recognition and Valiant's Learning Framework," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 15, no. 2, pp. 145-155, Feb. 1993. doi: 10.1109/34.192486.
The computational learning approach shows that concept descriptions acquired from examples are approximately correct with a probability that grows with the size of the training sample. The same problem has also been widely investigated in the field of pattern recognition under a variety of problem settings. Some of the results obtained in both fields are surveyed and compared, and the limits of their applicability are analyzed. Moreover, new and tighter bounds for the growth function of some classes of Boolean formulas are presented.
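The dependence of correctness on sample size that the abstract refers to can be made concrete with the standard PAC bound for a consistent learner over a finite hypothesis class (a textbook result in Valiant's framework, not a bound specific to this paper). The sketch below is illustrative only; the class size `3 ** n` for Boolean conjunctions is an assumption used as a worked example.

```python
import math

def pac_sample_bound(hypothesis_count: int, epsilon: float, delta: float) -> int:
    """Smallest sample size m such that a learner consistent with m examples
    drawn from a finite hypothesis class H is, with probability at least
    1 - delta, approximately correct (true error below epsilon):

        m >= (1 / epsilon) * (ln|H| + ln(1 / delta))
    """
    return math.ceil((math.log(hypothesis_count) + math.log(1.0 / delta)) / epsilon)

# Worked example: Boolean conjunctions over n variables. Each variable may
# appear positively, negatively, or not at all, so |H| = 3^n.
n = 10
m = pac_sample_bound(3 ** n, epsilon=0.1, delta=0.05)
```

As the bound shows, the required sample size grows only logarithmically in the size of the hypothesis class, which is why restricted classes of Boolean formulas (those with slowly growing growth functions) are learnable from modest samples.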