Parsing with Probabilistic Strictly Locally Testable Tree Languages
July 2005 (vol. 27 no. 7)
pp. 1040-1050
Probabilistic k-testable models (usually known as k-gram models in the case of strings) can be easily identified from samples and allow for smoothing techniques to deal with unseen events during pattern classification. In this paper, we introduce the family of stochastic k-testable tree languages and describe how these models can approximate any stochastic rational tree language. The model is applied to the task of learning a probabilistic k-testable model from a sample of parsed sentences. In particular, we present a parser for a natural language grammar that incorporates smoothing.
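As a point of reference for the string case mentioned in the abstract, the sketch below shows how a k-gram model can be estimated from a sample and combined with a simple smoothing scheme so that unseen events keep nonzero probability. It is a minimal illustrative sketch only: the function names, the choice of k = 2, the toy sample, and the add-alpha (Laplace) smoothing are assumptions made here for illustration, not the tree-language construction or the smoothing scheme used in the paper.

from collections import defaultdict

def train_kgram(sentences, k=2):
    # Count k-grams over padded sentences; contexts are the preceding (k-1)-grams.
    kgram_counts = defaultdict(int)
    context_counts = defaultdict(int)
    for words in sentences:
        padded = ["<s>"] * (k - 1) + list(words) + ["</s>"]
        for i in range(len(padded) - k + 1):
            kgram = tuple(padded[i:i + k])
            kgram_counts[kgram] += 1
            context_counts[kgram[:-1]] += 1
    return kgram_counts, context_counts

def smoothed_prob(kgram_counts, context_counts, kgram, vocab_size, alpha=1.0):
    # Add-alpha smoothing (an assumption for this sketch) reserves probability
    # mass for k-grams never observed in the training sample.
    numerator = kgram_counts[tuple(kgram)] + alpha
    denominator = context_counts[tuple(kgram[:-1])] + alpha * vocab_size
    return numerator / denominator

# Toy usage on a hypothetical sample of tokenized sentences.
sample = [["the", "dog", "barks"], ["the", "cat", "sleeps"]]
counts, contexts = train_kgram(sample, k=2)
vocab_size = len({w for s in sample for w in s} | {"</s>"})
print(smoothed_prob(counts, contexts, ("the", "dog"), vocab_size))   # seen bigram
print(smoothed_prob(counts, contexts, ("the", "bird"), vocab_size))  # unseen, still > 0

The stochastic k-testable tree languages introduced in the paper generalize this idea from strings to trees, with tree fragments of bounded depth playing the role of the k-gram contexts.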

[1] G. Riccardi, R. Pieraccini, and E. Bocchieri, “Stochastic Automata for Language Modeling,” Computer Speech and Language, vol. 10, no. 4, pp. 265-293, 1996.
[2] J. Hu, W. Turin, and M.K. Brown, “Language Modeling with Stochastic Automata,” Proc. Fourth Int'l Conf. Spoken Language Processing, vol. 1, pp. 406-409, 1996, http://www.asel.udel.edu/icslp/cdrom/vol1/996/a996.pdf.
[3] Y. Esteve, F. Bechet, A. Nasr, and R.D. Mori, “Stochastic Finite State Automata Language Model Triggered by Dialogue States,” Proc. Eurospeech, pp. 725-728, 2001.
[4] R. McNaughton and S. Papert, Counter-Free Automata. Cambridge, Mass.: MIT Press, 1971.
[5] P. García and E. Vidal, “Inference of k-Testable Languages in the Strict Sense and Application to Syntactic Pattern Recognition,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 12, no. 9, pp. 920-925, Sept. 1990.
[6] T. Yokomori, “On Polynomial-Time Learnability in the Limit of Strictly Deterministic Automata,” Machine Learning, vol. 19, no. 2, pp. 153-179, 1995.
[7] T. Yokomori, N. Ishida, and S. Kobayashi, “Learning Local Languages and Its Application to Protein Alpha-Chain Identification,” Proc. 27th Hawaii Int'l Conf. System Sciences, pp. 113-122, 1994.
[8] K.W. Church and W. Gale, “A Comparison of the Enhanced Good-Turing and Deleted Estimation Methods for Estimating Probabilities of English Bigrams,” Computer Speech and Language, vol. 5, pp. 19-54, 1991.
[9] H. Ney, U. Essen, and R. Kneser, “On the Estimation of Small Probabilities by Leaving-One-Out,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 17, no. 12, pp. 1202-1212, Dec. 1995.
[10] E. Charniak, Statistical Language Learning. MIT Press, 1993.
[11] K. Sima'an, R. Bod, S. Krauwer, and R. Scha, “Efficient Disambiguation by Means of Stochastic Tree Substitution Grammars,” Proc. Int'l Conf. New Methods in Language Processing, D.B. Jones and H.L. Somers, eds., pp. 50-58, Sept. 1994.
[12] A. Stolcke, “An Efficient Probabilistic Context-Free Parsing Algorithm that Computes Prefix Probabilities,” Computational Linguistics, vol. 21, no. 2, pp. 165-201, 1995.
[13] M. Thorup, “Disambiguating Grammars by Exclusion of Sub-Parse Trees,” Acta Informatica, vol. 33, no. 6, pp. 511-522, 1996.
[14] P. Prescod, “Formalizing XML and SGML Instances with Forest Automata Theory,” draft technical report, Dept. of Computer Science, Univ. of Waterloo, Waterloo, Ontario, Canada, 2002.
[15] M. Murata, “Transformation of Documents and Schemas by Patterns and Contextual Conditions,” Proc. Principles of Document Processing, Third Int'l Workshop, C.K. Nicholas and D. Wood, eds., vol. 1293, pp. 153-169, 1997.
[16] J. Hopcroft and J.D. Ullman, Introduction to Automata Theory, Languages, and Computation. Reading, Mass.: Addison-Wesley, 1979.
[17] Y. Sakakibara, M. Brown, R.C. Underwood, I.S. Mian, and D. Haussler, “Stochastic Context-Free Grammars for Modeling RNA,” Proc. 27th Ann. Hawaii Int'l Conf. System Sciences, Vol. 5: Biotechnology Computing, L. Hunter, ed., pp. 284-294, Jan. 1994.
[18] F. Gécseg and M. Steinby, Tree Automata. Budapest: Akademiai Kiado, 1984.
[19] Y. Sakakibara, “Efficient Learning of Context-Free Grammars from Positive Structural Examples,” Information and Computation, vol. 97, no. 1, pp. 23-60, Mar. 1992.
[20] M. Nivat and A. Podelski, “Minimal Ascending and Descending Tree Automata,” SIAM J. Computing, vol. 26, no. 1, pp. 39-58, 1997.
[21] T. Knuutila, “Inference of k-Testable Tree Languages,” Proc. Int'l Workshop Structural and Syntactic Pattern Recognition, Advances in Structural and Syntactic Pattern Recognition, H. Bunke, ed., 1993.
[22] E. Black, S. Abney, D. Flickinger, C. Gdaniec, R. Grishman, P. Harrison, D. Hindle, R. Ingria, F. Jelinek, J. Klavans, M. Liberman, M. Marcus, S. Roukos, B. Santorini, and T. Strzalkowski, “A Procedure for Quantitatively Comparing the Syntactic Coverage of English Grammars,” Proc. Speech and Natural Language Workshop 1991, pp. 306-311, 1991.
[23] C. Manning and H. Schütze, Foundations of Statistical Natural Language Processing. Cambridge, Mass.: MIT Press, 1999.
[24] A. Radford, M. Atkinson, D. Britain, H. Clahsen, and A. Spencer, Linguistics: An Introduction. Cambridge, U.K.: Cambridge Univ. Press, 1999.
[25] A.V. Aho, R. Sethi, and J.D. Ullman, Compilers: Principles, Techniques, and Tools. Addison-Wesley, 1986.
[26] W.J. Hutchins and H.L. Somers, An Introduction to Machine Translation. New York: Academic Press, 1992.
[27] T.M. Cover and J.A. Thomas, Elements of Information Theory. New York: John Wiley & Sons, 1991.
[28] C.S. Wetherell, “Probabilistic Languages: A Review and Some Open Questions,” ACM Computing Surveys, vol. 12, no. 4, pp. 361-379, Dec. 1980.
[29] E. Gold, “Language Identification in the Limit,” Information and Control, vol. 10, pp. 447-474, 1967.
[30] A. Stolcke and J. Segal, “Precise n-Gram Probabilities from Stochastic Context-Free Grammars,” Technical Report TR-94-007, Int'l Computer Science Inst., Berkeley, Calif., Jan. 1994.
[31] J. Calera-Rubio and R.C. Carrasco, “Computing the Relative Entropy between Regular Tree Languages,” Information Processing Letters, vol. 68, no. 6, pp. 283-289, 1998.
[32] L.R. Rabiner, “A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition,” Proc. IEEE, vol. 77, no. 2, pp. 257-285, Feb. 1989.
[33] R.C. Carrasco and J. Oncina, “Learning Deterministic Regular Grammars from Stochastic Samples in Polynomial Time,” RAIRO (Theoretical Informatics and Applications), vol. 33, no. 1, pp. 1-20, 1999.
[34] R.C. Carrasco, J. Oncina, and J. Calera-Rubio, “Stochastic Inference of Regular Tree Languages,” Machine Learning, vol. 44, nos. 1/2, pp. 185-197, 2001.
[35] L. Frazier and K. Rayner, “Making and Correcting Errors During Sentence Comprehension: Eye Movements in the Analysis of Structurally Ambiguous Sentences,” Cognitive Psychology, vol. 14, pp. 178-210, 1982.
[36] M.P. Marcus, B. Santorini, and M. Marcinkiewicz, “Building a Large Annotated Corpus of English: The Penn Treebank,” Computational Linguistics, vol. 19, pp. 313-330, 1993.
[37] E. Charniak, “Tree-Bank Grammars,” Proc. 13th Nat'l Conf. Artificial Intelligence and the Eighth Innovative Applications of Artificial Intelligence Conf., pp. 1031-1036, 1996.
[38] M. Johnson, “PCFG Models of Linguistic Tree Representations,” Computational Linguistics, vol. 24, no. 4, pp. 613-632, 1998.
[39] E. Black, F. Jelinek, J.D. Lafferty, D.M. Magerman, R.L. Mercer, and S. Roukos, “Towards History-Based Grammars: Using Richer Models for Probabilistic Parsing,” Proc. DARPA Speech and Natural Language Workshop, pp. 31-37, 1992.
[40] R. Bod and R. Scha, “Data-Oriented Language Processing: An Overview,” Technical Report LP-96-13, Dept. of Computational Linguistics, Univ. of Amsterdam, The Netherlands, 1996.
[41] J. Sánchez and J. Benedí, “Consistency of Stochastic Context-Free Grammars from Probabilistic Estimation Based on Growth Transformations,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 19, no. 9, pp. 1052-1055, Sept. 1997.
[42] Z. Chi and S. Geman, “Estimation of Probabilistic Context-Free Grammars,” Computational Linguistics, vol. 24, no. 2, pp. 299-305, 1998.
[43] J. Chappelier and M. Rajman, “A Generalized CYK Algorithm for Parsing Stochastic CFG,” Proc. Tabulation in Parsing and Deduction, pp. 133-137, Apr. 1998, ftp://ftp.inria.fr/INRIA/Projects/Atoll/TAPD98chappelier.ps.gz.
[44] J. Carroll, T. Briscoe, and A. Sanfilippo, “Parser Evaluation: A Survey and a New Proposal,” Proc. Int'l Conf. Language Resources and Evaluation, pp. 447-454, 1998.
[45] A. Krotov, R. Gaizauskas, M. Hepple, and Y. Wilks, “Compacting the Penn Treebank Grammar,” Proc. COLING-ACL'98 Joint Conf. (17th Int'l Conf. Computational Linguistics, and 36th Ann. Meeting of the Assoc. Computational Linguistics), pp. 699-703, 1998.
[46] F. Pereira and Y. Schabes, “Inside-Outside Re-Estimation from Partially Bracketed Corpora,” Proc. 30th Ann. Meeting of the ACL, pp. 128-135, 1992.

Index Terms:
Parsing with probabilistic grammars, stochastic learning, tree grammars.
Citation:
José Luis Verdú-Mas, Rafael C. Carrasco, Jorge Calera-Rubio, "Parsing with Probabilistic Strictly Locally Testable Tree Languages," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, no. 7, pp. 1040-1050, July 2005, doi:10.1109/TPAMI.2005.144