2012 IEEE 12th International Conference on Data Mining (2012)
Brussels, Belgium
Dec. 10, 2012 to Dec. 13, 2012
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/ICDM.2012.96
Markov logic is a knowledge-representation language that allows one to specify large graphical models. However, the resulting large graphical models can make inference for Markov logic a computationally challenging problem. Recently, dual decomposition (DD) has become a popular approach for scalable inference on graphical models. We study how to apply DD to scale up inference in Markov logic. A standard approach for DD first partitions a graphical model into multiple tree-structured subproblems. We apply this approach to Markov logic and show that DD can outperform prior inference approaches. Nevertheless, we observe that the standard approach for DD is suboptimal, as it does not exploit the rich structure often present in the Markov logic program. Thus, we describe a novel decomposition strategy that partitions a Markov logic program into parts based on its structure. A crucial advantage of our approach is that we can use specialized algorithms for portions of the input problem -- some of which have been studied for decades, e.g., coreference resolution. Empirically, we show that our program-level decomposition approach outperforms both non-decomposition and graphical model-based decomposition approaches to Markov logic inference on several data-mining tasks.
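To illustrate the dual decomposition idea the abstract refers to, here is a minimal toy sketch (an illustrative assumption, not the paper's system): an objective is split into two subproblems that share one binary variable, the shared variable is duplicated, and a Lagrange multiplier enforcing agreement between the copies is updated by projected subgradient.

```python
# Toy dual decomposition sketch (hypothetical example, not the paper's code).
# Two cost tables f1, f2 share one binary variable x. We duplicate x into
# (x1, x2), relax the constraint x1 == x2 with a multiplier lam, solve each
# subproblem independently, and take subgradient steps on lam until agreement.

def solve_sub(costs):
    """Return the argmin assignment (0 or 1) for a table of unary costs."""
    return min((0, 1), key=lambda x: costs[x])

def dual_decomposition(f1, f2, steps=100, lr=0.5):
    lam = 0.0
    for t in range(steps):
        # Each subproblem is solved independently, with lam folded into its costs.
        x1 = solve_sub({x: f1[x] + lam * x for x in (0, 1)})
        x2 = solve_sub({x: f2[x] - lam * x for x in (0, 1)})
        if x1 == x2:  # copies agree: a consistent joint solution
            return x1, lam
        lam += lr / (t + 1) * (x1 - x2)  # diminishing-step subgradient update
    return x1, lam

# Toy instance: f1 mildly prefers x=1, f2 strongly prefers x=0;
# the joint optimum of f1[x] + f2[x] is x=0.
f1 = {0: 1.0, 1: 0.0}
f2 = {0: 0.0, 1: 3.0}
x, lam = dual_decomposition(f1, f2)
```

In this sketch both subproblems are trivially solvable in isolation; the paper's point is that in Markov logic the subproblems can be full tasks (e.g., coreference resolution) with specialized, decades-old algorithms of their own.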
F. Niu, C. Zhang, C. Re and J. Shavlik, "Scaling Inference for Markov Logic via Dual Decomposition," 2012 IEEE 12th International Conference on Data Mining (ICDM), Brussels, Belgium, 2012, pp. 1032-1037.