L. Bréhélin, O. Gascuel, G. Caraux, "Hidden Markov Models with Patterns to Learn Boolean Vector Sequences and Application to the Built-In Self-Test for Integrated Circuits," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 23, no. 9, pp. 997-1008, September 2001. doi:10.1109/34.955112
Abstract: We present a new model, derived from the Hidden Markov Model (HMM), for learning Boolean vector sequences. Our Hidden Markov Model with Patterns (HMMP) is a simple, hybrid, and interpretable model that uses Boolean patterns to define the emission probability distributions attached to states: vectors consistent with a given pattern are equiprobable, while inconsistent ones have zero probability of being emitted. We define an efficient learning algorithm for this model, which relies on the maximum likelihood principle and proceeds by iteratively simplifying the structure and updating the parameters of an initial, specific HMMP that represents the learning sequences. Each simplification merges two states of the current HMMP while keeping the likelihood as high as possible, and the algorithm stops when the HMMP structure is sufficiently small. HMMPs and our learning algorithm are applied to the Built-In Self-Test (BIST) of integrated circuits, one of the key problems in microelectronics. An HMMP is learned from a set of test sequences (computed with a specific tool) that covers most of the potential faults of the circuit at hand; this HMMP is then used as a test sequence generator. Our experiments, carried out on classical microelectronic benchmark circuits, show that learned HMMPs achieve very high fault coverage. Furthermore, their small size and simplicity allow these models to be easily implemented on the circuits themselves for self-testing purposes.
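The pattern-based emission distribution described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a pattern is encoded as a tuple over {0, 1, None}, with None marking a "don't care" position, and the function names are ours.

```python
from itertools import product

def consistent(pattern, vector):
    """True if the Boolean vector matches the pattern at every fixed position."""
    return all(p is None or p == v for p, v in zip(pattern, vector))

def emission_probability(pattern, vector):
    """Uniform over consistent vectors (1 / 2**#don't-cares), zero otherwise."""
    if not consistent(pattern, vector):
        return 0.0
    wildcards = sum(1 for p in pattern if p is None)
    return 1.0 / (2 ** wildcards)

pattern = (1, None, 0)                            # matches 100 and 110
print(emission_probability(pattern, (1, 1, 0)))   # 0.5
print(emission_probability(pattern, (0, 1, 0)))   # 0.0
# The probabilities over all Boolean vectors of length 3 sum to 1:
print(sum(emission_probability(pattern, v) for v in product((0, 1), repeat=3)))
```

Since every consistent vector gets the same mass, a state's emission distribution is fully determined by its pattern, which is what makes the model interpretable and cheap to implement in hardware.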