Hidden Markov Models with Patterns to Learn Boolean Vector Sequences and Application to the Built-In Self-Test for Integrated Circuits
September 2001 (vol. 23 no. 9)
pp. 997-1008

Abstract: We present a new model, derived from the Hidden Markov Model (HMM), to learn Boolean vector sequences. Our Hidden Markov Model with Patterns (HMMP) is a simple, hybrid, and interpretable model that uses Boolean patterns to define the emission probability distributions attached to states. Vectors consistent with a given pattern are equiprobable, while inconsistent ones have zero probability of being emitted. We define an efficient learning algorithm for this model, which relies on the maximum likelihood principle and proceeds by iteratively simplifying the structure and updating the parameters of an initial, specific HMMP that represents the learning sequences. Each simplification merges two states of the current HMMP while keeping the likelihood as high as possible, and the algorithm stops when the HMMP has a sufficiently small structure. HMMPs and our learning algorithm are applied to the Built-In Self-Test (BIST) of integrated circuits, a key problem in microelectronics. An HMMP is learned from a set of test sequences (computed using a specific tool) that covers most of the potential faults of the circuit at hand. This HMMP is then used as a test sequence generator. Our experiments, carried out on classical microelectronic benchmark circuits, show that learned HMMPs achieve very high fault coverage. Furthermore, their small size combined with their simplicity allows these models to be easily implemented on the circuits for self-testing purposes.
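The abstract's emission model is easy to make concrete. The following is a minimal sketch (not the authors' implementation; the pattern alphabet {0, 1, *} and the function name are illustrative assumptions): a state's pattern assigns equal probability to every consistent Boolean vector and zero to all others, so the emission probability of a consistent vector is 1 / 2^k, where k is the number of wildcard positions.

```python
def emission_prob(pattern: str, vector: str) -> float:
    """Probability that a state carrying `pattern` emits `vector`.

    `pattern` is a string over {'0', '1', '*'}, where '*' matches either bit.
    Consistent vectors are equiprobable (1 / 2**number_of_wildcards);
    inconsistent vectors have zero emission probability.
    """
    if len(pattern) != len(vector):
        return 0.0
    for p, v in zip(pattern, vector):
        if p != '*' and p != v:
            return 0.0  # vector contradicts a fixed bit of the pattern
    return 1.0 / 2 ** pattern.count('*')

# The pattern '1*0' is consistent with exactly two vectors, '100' and '110',
# each emitted with probability 1/2; all other 3-bit vectors get 0.
print(emission_prob('1*0', '110'))  # 0.5
print(emission_prob('1*0', '010'))  # 0.0
```

Note that the probabilities over all 2^n Boolean vectors sum to 1 for any pattern, so each state carries a valid distribution parameterized by a single pattern string rather than a full probability table.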

[1] L.E. Baum, T. Petrie, G. Soules, and N. Weiss, “A Maximization Technique Occurring in the Statistical Analysis of Probabilistic Functions of Markov Chains,” The Annals of Math. Statistics, vol. 41, no. 1, pp. 164-171, 1970.
[2] F. Casacuberta, “Some Relations among Stochastic Finite State Networks Used in Automatic Speech Recognition,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 12, no. 7, pp. 691-695, July 1990.
[3] L.R. Bahl, F. Jelinek, and R.L. Mercer, “A Maximum Likelihood Approach to Continuous Speech Recognition,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 5, no. 3, pp. 179-190, Mar. 1983.
[4] L.E. Baum, “An Inequality and Associated Maximization Technique in Statistical Estimation for Probabilistic Functions of a Markov Process,” Inequalities, vol. 3, pp. 1-8, 1972.
[5] A.P. Dempster, N.M. Laird, and D.B. Rubin, “Maximum Likelihood from Incomplete Data via the EM Algorithm,” J. Royal Statistical Soc. B, vol. 39, pp. 1-38, 1977.
[6] N. Abe and M. Warmuth, “On the Computational Complexity of Approximating Distributions by Probabilistic Automata,” Machine Learning, vol. 9, nos. 2-3, pp. 205-260, 1992.
[7] J.E. Hopcroft and J.D. Ullman, Introduction to Automata Theory, Languages and Computation. Addison-Wesley, Apr. 1979.
[8] J. Oncina and P. Garcia, “Inferring Regular Languages in Polynomial Updated Time,” Pattern Recognition and Image Analysis, pp. 49-61, 1992.
[9] R.C. Carrasco and J. Oncina, “Learning Stochastic Regular Grammars by Means of a State Merging Method,” Proc. Second Int'l Colloquium Grammatical Inference and Applications (ICGI '94), vol. 862, pp. 139-152, 1994.
[10] A. Stolcke and S. Omohundro, “Inducing Probabilistic Grammars by Bayesian Model Merging,” Proc. Second Int'l Colloquium Grammatical Inference and Applications (ICGI '94), vol. 862, pp. 106-118, 1994.
[11] K.J. Lang, B.A. Pearlmutter, and R.A. Price, “Results of the Abbadingo One DFA Learning Competition and a New Evidence-Driven State Merging Algorithm,” Proc. Fourth Int'l Colloquium Grammatical Inference (ICGI '98), pp. 1-12, July 1998.
[12] L.R. Rabiner, “A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition,” Proc. IEEE, vol. 77, no. 2, pp. 257-285, 1989.
[13] A. Kondratyev, M. Kishinevsky, B. Lin, P. Vanbekbergen, and A. Yakovlev, “Basic Gate Implementation of Speed-Independent Circuits,” Proc. 31st Design Automation Conf., pp. 56-62, 1994.
[14] M.Y. Chen, A. Kundu, and J. Zhou, “Off-Line Handwritten Word Recognition Using a Hidden Markov Model Type Stochastic Network,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 16, no. 5, pp. 481-496, 1994.
[15] R. Durbin, S. Eddy, A. Krogh, and G. Mitchison, Biological Sequence Analysis: Probabilistic Models of Proteins and Nucleic Acids. Cambridge Univ. Press, 1998.
[16] B. Povlow and S. Dunn, “Texture Classification Using Noncausal Hidden Markov Models,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 17, no. 10, pp. 1010-1014, Oct. 1995.
[17] C. Raphael, “Automatic Segmentation of Acoustic Musical Signals Using Hidden Markov Models,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 21, no. 4, pp. 360-370, Apr. 1999.
[18] J. Rajski and J. Tyszer, Arithmetic Built-In Self-Test for Embedded Systems. Prentice Hall, 1997.
[19] O.H. Ibarra and S.K. Sahni, “Polynomially Complete Fault Detection Problems,” IEEE Trans. Computers, vol. 24, 1975.
[20] F. Corno, P. Prinetto, M. Rebaudengo, and M. Sonza Reorda, “A Parallel Genetic Algorithm for Automatic Generation of Test Sequences for Digital Circuits,” Lecture Notes in Computer Science, vol. 1067, pp. 454-459, 1996.
[21] A. Ghosh, S. Devadas, and A. Newton, Sequential Logic Testing and Verification. Kluwer, 1992.
[22] V.D. Agrawal, K.T. Cheng, and P. Agrawal, “A Directed Search Method for Test Generation Using a Concurrent Simulator,” IEEE Trans. Computer-Aided Design, vol. 8, pp. 131-138, Feb. 1989.
[23] L. Bréhélin, O. Gascuel, G. Caraux, P. Girard, and C. Landrault, “Hidden Markov and Independence Models with Patterns for Sequential BIST,” Proc. 18th IEEE VLSI Test Symp., pp. 359-367, 2000.
[24] G.D. Forney, “The Viterbi Algorithm,” Proc. IEEE, vol. 61, pp. 268-278, Mar. 1973.
[25] B.H. Juang and L.R. Rabiner, “The Segmental k-Means Algorithm for Estimating Parameters of Hidden Markov Models,” IEEE Trans. Acoustics, Speech, and Signal Processing, vol. 38, pp. 1639-1641, 1990.
[26] F. Brglez, D. Bryan, and K. Kozminski, “Combinational Profiles of Sequential Benchmark Circuits,” Proc. IEEE Int'l Symp. Circuits and Systems, pp. 1929-1934, 1989.
[27] O. Gascuel and the SYMENU group, “Twelve Numerical, Symbolic and Hybrid Supervised Classification Methods,” Int'l J. Pattern Recognition and Artificial Intelligence, pp. 517-572, 1998.

Citation:
L. Bréhélin, O. Gascuel, G. Caraux, "Hidden Markov Models with Patterns to Learn Boolean Vector Sequences and Application to the Built-In Self-Test for Integrated Circuits," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 23, no. 9, pp. 997-1008, Sept. 2001, doi:10.1109/34.955112