Issue No. 2, February 2011 (vol. 23), pp. 235-247
David C. Wilkins, Stanford University, Stanford
Ole J. Mengshoel, Carnegie Mellon University, Moffett Field
ABSTRACT
For hard computational problems, stochastic local search has proven to be a competitive approach to finding optimal or approximately optimal problem solutions. Two key research questions for stochastic local search algorithms are: Which algorithms are effective for initialization? When should the search process be restarted? In the present work, we investigate these research questions in the context of approximate computation of most probable explanations (MPEs) in Bayesian networks (BNs). We introduce a novel approach, based on the Viterbi algorithm, to explanation initialization in BNs. While the Viterbi algorithm works on sequences and trees, our approach works on BNs with arbitrary topologies. We also give a novel formalization of stochastic local search, with a focus on initialization and restart, using probability theory and mixture models. Experimentally, we apply our methods to the problem of MPE computation, using a stochastic local search algorithm known as Stochastic Greedy Search. By carefully optimizing both initialization and restart, we reduce the MPE search time for application BNs by several orders of magnitude compared to using uniform-at-random initialization without restart. On several BNs from applications, the performance of Stochastic Greedy Search is competitive with clique tree clustering, a state-of-the-art exact algorithm used for MPE computation in BNs.
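As context for the Viterbi-based initialization the abstract describes, the sketch below shows the classic Viterbi algorithm on a chain-structured BN (a hidden Markov model), where it computes the most probable explanation of the hidden states exactly. This is an illustrative sketch only, not the authors' implementation (which generalizes the idea to arbitrary BN topologies); all model names and probability values are hypothetical toy data.

```python
def viterbi(states, init_p, trans_p, emit_p, observations):
    """Return (probability, path) for the single most probable
    hidden-state sequence explaining the observations."""
    # delta[s] = probability of the best partial path ending in state s
    delta = {s: init_p[s] * emit_p[s][observations[0]] for s in states}
    backpointers = []
    for obs in observations[1:]:
        new_delta, bp = {}, {}
        for s in states:
            # Best predecessor state for s at this step
            prev, p = max(((r, delta[r] * trans_p[r][s]) for r in states),
                          key=lambda x: x[1])
            new_delta[s] = p * emit_p[s][obs]
            bp[s] = prev
        backpointers.append(bp)
        delta = new_delta
    # Trace back from the best final state
    last = max(delta, key=delta.get)
    path = [last]
    for bp in reversed(backpointers):
        path.append(bp[path[-1]])
    path.reverse()
    return delta[last], path

# Toy two-state weather model (hypothetical numbers)
states = ["rain", "sun"]
init_p = {"rain": 0.6, "sun": 0.4}
trans_p = {"rain": {"rain": 0.7, "sun": 0.3},
           "sun":  {"rain": 0.4, "sun": 0.6}}
emit_p = {"rain": {"umbrella": 0.9, "none": 0.1},
          "sun":  {"umbrella": 0.2, "none": 0.8}}

prob, path = viterbi(states, init_p, trans_p, emit_p,
                     ["umbrella", "umbrella", "none"])
# path is the MPE assignment of the hidden chain, e.g. ["rain", "rain", "sun"]
```

On a chain this dynamic program is exact in linear time; the paper's contribution is to use such Viterbi-style assignments as high-quality starting points for stochastic local search on networks where exact inference is intractable.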
INDEX TERMS
Stochastic local search, Bayesian networks, initialization, restart, finite mixture models.
CITATION
David C. Wilkins, Ole J. Mengshoel, "Initialization and Restart in Stochastic Local Search: Computing a Most Probable Explanation in Bayesian Networks", IEEE Transactions on Knowledge & Data Engineering, vol.23, no. 2, pp. 235-247, February 2011, doi:10.1109/TKDE.2010.98
REFERENCES
[1] J. Pearl, Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference. Morgan Kaufmann, 1988.
[2] F.R. Kschischang, B.J. Frey, and H.-A. Loeliger, "Factor Graphs and the Sum-Product Algorithm," IEEE Trans. Information Theory, vol. 47, no. 2, pp. 498-519, Feb. 2001.
[3] M. Wainwright, T. Jaakkola, and A. Willsky, "MAP Estimation via Agreement on (Hyper)Trees: Message-Passing and Linear Programming Approaches," IEEE Trans. Information Theory, vol. 51, no. 11, pp. 3697-3717, Nov. 2005.
[4] M.J. Wainwright, T.S. Jaakkola, and A.S. Willsky, "Tree-Based Reparameterization Framework for Analysis of Sum-Product and Related Algorithms," IEEE Trans. Information Theory, vol. 49, no. 5, pp. 1120-1146, May 2003.
[5] A. Darwiche, "A Differential Approach to Inference in Bayesian Networks," J. ACM, vol. 50, no. 3, pp. 280-305, 2003.
[6] S. Lauritzen and D.J. Spiegelhalter, "Local Computations with Probabilities on Graphical Structures and Their Application to Expert Systems (with Discussion)," J. Royal Statistical Soc. Series B, vol. 50, no. 2, pp. 157-224, 1988.
[7] S.K. Andersen, K.G. Olesen, F.V. Jensen, and F. Jensen, "HUGIN— A Shell for Building Bayesian Belief Universes for Expert Systems," Proc. 11th Int'l Joint Conf. Artificial Intelligence, vol. 2, pp. 1080-1085, Aug. 1989.
[8] C. Huang and A. Darwiche, "Inference in Belief Networks: A Procedural Guide," Int'l J. Approximate Reasoning, vol. 15, pp. 225-263, 1996.
[9] B. Selman, H. Levesque, and D. Mitchell, "A New Method for Solving Hard Satisfiability Problems," Proc. 10th Nat'l Conf. Artificial Intelligence (AAAI '92), pp. 440-446, 1992.
[10] B. Selman, H.A. Kautz, and B. Cohen, "Noise Strategies for Improving Local Search," Proc. 12th Nat'l Conf. Artificial Intelligence (AAAI '94), pp. 337-343, 1994.
[11] J. Gu, P.W. Purdom, J. Franco, and B.W. Wah, "Algorithms for the Satisfiability (SAT) Problem: A Survey," Satisfiability Problem: Theory and Applications, pp. 19-152. Am. Math. Soc., 1997.
[12] H.H. Hoos and T. Stützle, Stochastic Local Search: Foundations and Applications. Morgan Kaufmann, 2005.
[13] K. Kask and R. Dechter, "Stochastic Local Search for Bayesian Networks," Proc. Seventh Int'l Workshop Artificial Intelligence and Statistics, Jan. 1999.
[14] O.J. Mengshoel, "Efficient Bayesian Network Inference: Genetic Algorithms, Stochastic Local Search, and Abstraction," PhD dissertation, Dept. of Computer Science, Univ. of Illinois at Urbana-Champaign, Apr. 1999.
[15] O.J. Mengshoel, D. Roth, and D.C. Wilkins, "Stochastic Greedy Search: Computing the Most Probable Explanation in Bayesian Networks," Technical Report UIUCDCS-R-2000-2150, Dept. of Computer Science, Univ. of Illinois at Urbana-Champaign, Feb. 2000.
[16] F. Hutter, H.H. Hoos, and T. Stützle, "Efficient Stochastic Local Search for MPE Solving," Proc. 19th Int'l Joint Conf. Artificial Intelligence (IJCAI '05), pp. 169-174, 2005.
[17] O.J. Mengshoel, "Understanding the Role of Noise in Stochastic Local Search: Analysis and Experiments," Artificial Intelligence, vol. 172, nos. 8-9, pp. 955-990, 2008.
[18] O.J. Mengshoel, D. Roth, and D.C. Wilkins, "Portfolios in Stochastic Local Search: Efficiently Computing Most Probable Explanations in Bayesian Networks," J. Automated Reasoning, published online 14 Apr. 2010.
[19] J.D. Park and A. Darwiche, "Complexity Results and Approximation Strategies for MAP Explanations," J. Artificial Intelligence Research, vol. 21, pp. 101-133, 2004.
[20] B. Selman and H. Kautz, "Domain-Independent Extensions to GSAT: Solving Large Structured Satisfiability Problems," Proc. 13th Int'l Joint Conf. Artificial Intelligence (IJCAI '93), pp. 290-295, 1993.
[21] R. Lin, A. Galper, and R. Shachter, "Abductive Inference Using Probabilistic Networks: Randomized Search Techniques," Technical Report KSL-90-73, Knowledge Systems Laboratory, Nov. 1990.
[22] L.R. Rabiner, "A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition," Proc. IEEE, vol. 77, no. 2, pp. 257-286, Feb. 1989.
[23] H.H. Hoos and T. Stützle, "Towards a Characterisation of the Behaviour of Stochastic Local Search Algorithms for SAT," Artificial Intelligence, vol. 112, nos. 1-2, pp. 213-232, 1999.
[24] H.H. Hoos and T. Stützle, "Local Search Algorithms for SAT: An Empirical Evaluation," J. Automated Reasoning, vol. 24, no. 4, pp. 421-481, 2000.
[25] D. Mitchell, B. Selman, and H.J. Levesque, "Hard and Easy Distributions of SAT Problems," Proc. 10th Nat'l Conf. Artificial Intelligence (AAAI '92), pp. 459-465, 1992.
[26] H.H. Hoos, "A Mixture-Model for the Behaviour of SLS Algorithms for SAT," Proc. 18th Nat'l Conf. Artificial Intelligence (AAAI '02), pp. 661-667, 2002.
[27] I.P. Gent and T. Walsh, "Easy Problems are Sometimes Hard," Artificial Intelligence, vol. 70, nos. 1-2, pp. 335-345, 1994.
[28] A.J. Parkes and J.P. Walser, "Tuning Local Search for Satisfiability Testing," Proc. 13th Nat'l Conf. Artificial Intelligence (AAAI '96), pp. 356-362, 1996.
[29] C.P. Gomes, B. Selman, and H. Kautz, "Boosting Combinatorial Search through Randomization," Proc. 15th Nat'l Conf. Artificial Intelligence (AAAI '98), pp. 431-437, 1998.
[30] E. Horvitz, Y. Ruan, C. Gomes, H. Kautz, B. Selman, and D. Chickering, "A Bayesian Approach to Tackling Hard Computational Problems," Proc. 17th Ann. Conf. Uncertainty in Artificial Intelligence (UAI '01), pp. 235-244, 2001.
[31] Y. Ruan, E. Horvitz, and H. Kautz, "Restart Policies with Dependence among Runs: A Dynamic Programming Approach," Proc. Eighth Int'l Conf. Principles and Practice of Constraint Programming, pp. 573-586, 2002.
[32] Y. Ruan, E. Horvitz, and H. Kautz, "Hardness-Aware Restart Policies," Proc. 18th Int'l Joint Conf. Artificial Intelligence (IJCAI '03) Workshop Stochastic Search Algorithms, 2003.
[33] A.P. Dawid, "Applications of a General Propagation Algorithm for Probabilistic Expert Systems," Statistics and Computing, vol. 2, pp. 25-36, 1992.
[34] S.E. Shimony, "Finding MAPs for Belief Networks is NP-Hard," Artificial Intelligence, vol. 68, pp. 399-410, 1994.
[35] A.M. Abdelbar and S.M. Hedetniemi, "Approximating MAPs for Belief Networks is NP-Hard and Other Theorems," Artificial Intelligence, vol. 102, pp. 21-38, 1998.
[36] A. Viterbi, "Error Bounds for Convolutional Codes and an Asymptotically Optimum Decoding Algorithm," IEEE Trans. Information Theory, vol. 13, no. 2, pp. 260-269, Apr. 1967.
[37] M. Henrion, "Propagating Uncertainty in Bayesian Networks by Probabilistic Logic Sampling," Uncertainty in Artificial Intelligence, vol. 2, pp. 149-163, Elsevier, 1988.
[38] R. Dechter and I. Rish, "Mini-Buckets: A General Scheme for Bounded Inference," J. ACM, vol. 50, no. 2, pp. 107-153, 2003.
[39] R. Dechter, "Bucket Elimination: A Unifying Framework for Reasoning," Artificial Intelligence, vol. 113, nos. 1-2, pp. 41-85, 1999.
[40] D. Schuurmans and F. Southey, "Local Search Characteristics of Incomplete SAT Procedures," Artificial Intelligence, vol. 132, no. 2, pp. 121-150, 2001.
[41] J. Hooker, "Testing Heuristics: We Have It All Wrong," J. Heuristics, vol. 1, pp. 33-42, 1996.
[42] R. Motwani and P. Raghavan, Randomized Algorithms. Cambridge Univ. Press, 1995.
[43] B.A. Huberman, R.M. Lukose, and T. Hogg, "An Economics Approach to Hard Computational Problems," Science, vol. 275, no. 3, pp. 51-54, 1997.
[44] P. Jones, C. Hayes, D. Wilkins, R. Bargar, J. Sniezek, P. Asaro, O.J. Mengshoel, D. Kessler, M. Lucenti, I. Choi, N. Tu, and J. Schlabach, "CoRAVEN: Modeling and Design of a Multimedia Intelligent Infrastructure for Collaborative Intelligence Analysis," Proc. IEEE Int'l Conf. Systems, Man, and Cybernetics, pp. 914-919, Oct. 1998.
[45] I.H. Witten and E. Frank, Data Mining: Practical Machine Learning Tools and Techniques, second ed., Morgan Kaufmann, 2005.
[46] H.L. Bodlaender, "A Tourist Guide through Treewidth," Acta Cybernetica, vol. 11, pp. 1-21, 1993.