
Enrique Vidal, Franck Thollard, Colin de la Higuera, Francisco Casacuberta, Rafael C. Carrasco, "Probabilistic Finite-State Machines, Part I," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, no. 7, pp. 1013-1025, July 2005.
[1] A. Paz, Introduction to Probabilistic Automata. New York: Academic Press, 1971.
[2] L. Rabiner, “A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition,” Proc. IEEE, vol. 77, pp. 257-286, 1989.
[3] F. Jelinek, Statistical Methods for Speech Recognition. Cambridge, Mass.: MIT Press, 1998.
[4] R. Carrasco and J. Oncina, “Learning Stochastic Regular Grammars by Means of a State Merging Method,” Proc. Second Int'l Colloquium Grammatical Inference and Applications, pp. 139-152, 1994.
[5] L. Saul and F. Pereira, “Aggregate and Mixed-Order Markov Models for Statistical Language Processing,” Proc. Second Conf. Empirical Methods in Natural Language Processing, pp. 81-89, 1997.
[6] H. Ney, S. Martin, and F. Wessel, “Statistical Language Modeling Using Leaving-One-Out,” Corpus-Based Statistical Methods in Speech and Language Processing, S. Young and G. Bloothooft, eds., pp. 174-207, Kluwer Academic, 1997.
[7] D. Ron, Y. Singer, and N. Tishby, “Learning Probabilistic Automata with Variable Memory Length,” Proc. Seventh Ann. ACM Conf. Computational Learning Theory, pp. 35-46, 1994.
[8] M. Mohri, “Finite-State Transducers in Language and Speech Processing,” Computational Linguistics, vol. 23, no. 3, pp. 269-311, 1997.
[9] K.S. Fu, Syntactic Pattern Recognition and Applications. Prentice Hall, 1982.
[10] L. Miclet, Structural Methods in Pattern Recognition. Springer-Verlag, 1987.
[11] S. Lucas, E. Vidal, A. Amari, S. Hanlon, and J.C. Amengual, “A Comparison of Syntactic and Statistical Techniques for Off-Line OCR,” Proc. Second Int'l Colloquium on Grammatical Inference, pp. 168-179, 1994.
[12] D. Ron, Y. Singer, and N. Tishby, “On the Learnability and Usage of Acyclic Probabilistic Finite Automata,” Proc. Eighth Ann. Conf. Computational Learning Theory, pp. 31-40, 1995.
[13] H. Ney, “Stochastic Grammars and Pattern Recognition,” Proc. NATO Advanced Study Inst. “Speech Recognition and Understanding. Recent Advances, Trends, and Applications,” pp. 313-344, 1992.
[14] N. Abe and H. Mamitsuka, “Predicting Protein Secondary Structure Using Stochastic Tree Grammars,” Machine Learning, vol. 29, pp. 275-301, 1997.
[15] Y. Sakakibara, M. Brown, R. Hughley, I. Mian, K. Sjolander, R. Underwood, and D. Haussler, “Stochastic Context-Free Grammars for tRNA Modeling,” Nucleic Acids Research, vol. 22, pp. 5112-5120, 1994.
[16] R.B. Lyngsø, C.N.S. Pedersen, and H. Nielsen, “Metrics and Similarity Measures for Hidden Markov Models,” Proc. Intelligent Systems for Molecular Biology, 1999.
[17] R.B. Lyngsø and C.N.S. Pedersen, “Complexity of Comparing Hidden Markov Models,” Proc. 12th Ann. Int'l Symp. Algorithms and Computation, 2001.
[18] P. Cruz and E. Vidal, “Learning Regular Grammars to Model Musical Style: Comparing Different Coding Schemes,” Proc. Int'l Colloquium on Grammatical Inference, pp. 211-222, 1998.
[19] M.G. Thomason, “Regular Stochastic Syntax-Directed Translations,” Technical Report CS-76-17, Computer Science Dept., Univ. of Tennessee, Knoxville, 1976.
[20] M. Mohri, F. Pereira, and M. Riley, “The Design Principles of a Weighted Finite-State Transducer Library,” Theoretical Computer Science, vol. 231, pp. 17-32, 2000.
[21] H. Alshawi, S. Bangalore, and S. Douglas, “Learning Dependency Translation Models as Collections of Finite State Head Transducers,” Computational Linguistics, vol. 26, 2000.
[22] H. Alshawi, S. Bangalore, and S. Douglas, “Head Transducer Models for Speech Translation and Their Automatic Acquisition from Bilingual Data,” Machine Translation J., vol. 15, nos. 1-2, pp. 105-124, 2000.
[23] J.C. Amengual, J.M. Benedí, F. Casacuberta, A. Castaño, A. Castellanos, V.M. Jimenez, D. Llorens, A. Marzal, M. Pastor, F. Prat, E. Vidal, and J.M. Vilar, “The EuTrans-I Speech Translation System,” Machine Translation J., vol. 15, nos. 1-2, pp. 75-103, 2000.
[24] S. Bangalore and G. Riccardi, “Stochastic Finite-State Models for Spoken Language Machine Translation,” Proc. Workshop Embedded Machine Translation Systems, NAACL, pp. 52-59, May 2000.
[25] S. Bangalore and G. Riccardi, “A Finite-State Approach to Machine Translation,” Proc. North Am. Assoc. Computational Linguistics, May 2001.
[26] F. Casacuberta, H. Ney, F.J. Och, E. Vidal, J.M. Vilar, S. Barrachina, I. García-Varea, D. Llorens, C. Martinez, S. Molau, F. Nevado, M. Pastor, D. Picó, A. Sanchis, and C. Tillmann, “Some Approaches to Statistical and Finite-State Speech-to-Speech Translation,” Computer Speech and Language, 2003.
[27] L. Bréhélin, O. Gascuel, and G. Caraux, “Hidden Markov Models with Patterns to Learn Boolean Vector Sequences and Application to the Built-In Self-Test for Integrated Circuits,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 23, no. 9, pp. 997-1008, Sept. 2001.
[28] Y. Bengio, V.P. Lauzon, and R. Ducharme, “Experiments on the Application of IOHMMs to Model Financial Returns Series,” IEEE Trans. Neural Networks, vol. 12, no. 1, pp. 113-123, 2001.
[29] K.S. Fu, Syntactic Methods in Pattern Recognition. New York: Academic Press, 1974.
[30] J.J. Paradaens, “A General Definition of Stochastic Automata,” Computing, vol. 13, pp. 93-105, 1974.
[31] K.S. Fu and T.L. Booth, “Grammatical Inference: Introduction and Survey, Parts I and II,” IEEE Trans. Systems, Man, and Cybernetics, vol. 5, pp. 59-72 and 409-423, 1975.
[32] C.S. Wetherell, “Probabilistic Languages: A Review and Some Open Questions,” Computing Surveys, vol. 12, no. 4, 1980.
[33] F. Casacuberta, “Some Relations among Stochastic Finite State Networks Used in Automatic Speech Recognition,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 12, no. 7, pp. 691-695, July 1990.
[34] D. Angluin, “Identifying Languages from Stochastic Examples,” Technical Report YALEU/DCS/RR-614, Yale Univ., Mar. 1988.
[35] M. Kearns and L. Valiant, “Cryptographic Limitations on Learning Boolean Formulae and Finite Automata,” Proc. 21st ACM Symp. Theory of Computing, pp. 433-444, 1989.
[36] M. Kearns, Y. Mansour, D. Ron, R. Rubinfeld, R.E. Schapire, and L. Sellie, “On the Learnability of Discrete Distributions,” Proc. 25th Ann. ACM Symp. Theory of Computing, pp. 273-282, 1994.
[37] M. Kearns and U. Vazirani, An Introduction to Computational Learning Theory. MIT Press, 1994.
[38] N. Abe and M. Warmuth, “On the Computational Complexity of Approximating Distributions by Probabilistic Automata,” Proc. Third Workshop Computational Learning Theory, pp. 52-66, 1998.
[39] P. Dupont, F. Denis, and Y. Esposito, “Links between Probabilistic Automata and Hidden Markov Models: Probability Distributions, Learning Models and Induction Algorithms,” Pattern Recognition, 2004.
[40] E. Vidal, F. Thollard, C. de la Higuera, F. Casacuberta, and R.C. Carrasco, “Probabilistic Finite-State Machines, Part II,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 27, no. 7, pp. 1026-1039, July 2005.
[41] M.O. Rabin, “Probabilistic Automata,” Information and Control, vol. 6, no. 3, pp. 230-245, 1963.
[42] G.D. Forney, “The Viterbi Algorithm,” Proc. IEEE, vol. 61, no. 3, pp. 268-278, 1973.
[43] F. Casacuberta and C. de la Higuera, “Computational Complexity of Problems on Probabilistic Grammars and Transducers,” Proc. Fifth Int'l Colloquium on Grammatical Inference, pp. 15-24, 2000.
[44] R.C. Carrasco, “Accurate Computation of the Relative Entropy between Stochastic Regular Grammars,” RAIRO-Theoretical Informatics and Applications, vol. 31, no. 5, pp. 437-444, 1997.
[45] W.G. Tzeng, “A Polynomial-Time Algorithm for the Equivalence of Probabilistic Automata,” SIAM J. Computing, vol. 21, no. 2, pp. 216-227, 1992.
[46] A. Fred, “Computation of Substring Probabilities in Stochastic Grammars,” Proc. Fifth Int'l Colloquium Grammatical Inference: Algorithms and Applications, pp. 103-114, 2000.
[47] M. Young-Lai and F.W. Tompa, “Stochastic Grammatical Inference of Text Database Structure,” Machine Learning, vol. 40, no. 2, pp. 111-137, 2000.
[48] D. Ron and R. Rubinfeld, “Learning Fallible Deterministic Finite Automata,” Machine Learning, vol. 18, pp. 149-185, 1995.
[49] C. Cook and A. Rosenfeld, “Some Experiments in Grammatical Inference,” NATO ASI Computer Oriented Learning Processes, pp. 157-171, 1974.
[50] K. Knill and S. Young, “Hidden Markov Models in Speech and Language Processing,” Corpus-Based Statistical Methods in Speech and Language Processing, S. Young and G. Bloothooft, eds., Kluwer Academic, pp. 27-68, 1997.
[51] N. Merhav and Y. Ephraim, “Hidden Markov Modeling Using a Dominant State Sequence with Application to Speech Recognition,” Computer Speech and Language, vol. 5, pp. 327-339, 1991.
[52] N. Merhav and Y. Ephraim, “Maximum Likelihood Hidden Markov Modeling Using a Dominant Sequence of States,” IEEE Trans. Signal Processing, vol. 39, no. 9, pp. 2111-2115, 1991.
[53] R.G. Gallager, Discrete Stochastic Processes. Kluwer Academic, 1996.
[54] V.D. Blondel and V. Canterini, “Undecidable Problems for Probabilistic Automata of Fixed Dimension,” Theory of Computing Systems, vol. 36, no. 3, pp. 231-245, 2003.
[55] M.H. Harrison, Introduction to Formal Language Theory. Reading, Mass.: Addison-Wesley, 1978.
[56] C. de la Higuera, “Characteristic Sets for Polynomial Grammatical Inference,” Machine Learning, vol. 27, pp. 125-138, 1997.
[57] R. Carrasco and J. Oncina, “Learning Deterministic Regular Grammars from Stochastic Samples in Polynomial Time,” RAIRO-Theoretical Informatics and Applications, vol. 33, no. 1, pp. 1-20, 1999.
[58] C. de la Higuera, “Why ε-Transitions Are Not Necessary in Probabilistic Finite Automata,” Technical Report 0301, EURISE, Univ. of Saint-Etienne, 2003.
[59] T. Cover and J. Thomas, Elements of Information Theory. Wiley-Interscience, 1991.
[60] J. Goodman, “A Bit of Progress in Language Modeling,” technical report, Microsoft Research, 2001.
[61] R. Kneser and H. Ney, “Improved Clustering Techniques for Class-Based Language Modelling,” Proc. European Conf. Speech Comm. and Technology, pp. 973-976, 1993.
[62] P. Brown, V. Della Pietra, P. deSouza, J. Lai, and R. Mercer, “Class-Based N-Gram Models of Natural Language,” Computational Linguistics, vol. 18, no. 4, pp. 467-479, 1992.