The Community for Technology Leaders
Quantitative Evaluation of Systems, International Conference on (2012)
London, United Kingdom
Sept. 17, 2012 to Sept. 20, 2012
ISBN: 978-1-4673-2346-8
pp: 84-93
ABSTRACT
Statistical Model Checking (SMC) is a computationally very efficient verification technique based on selective system sampling. One well-identified shortcoming of SMC is that, unlike probabilistic model checking, it cannot be applied to systems featuring nondeterminism, such as Markov Decision Processes (MDPs). We address this limitation by developing an algorithm that resolves nondeterminism probabilistically, and then uses multiple rounds of sampling and Reinforcement Learning to provably improve resolutions of nondeterminism with respect to satisfying a Bounded Linear Temporal Logic (BLTL) property. Our algorithm thus reduces an MDP to a fully probabilistic Markov chain, on which SMC may be applied to give an approximate solution to the problem of checking the probabilistic BLTL property. We integrate our algorithm into a parallelised modification of the PRISM simulation framework. Extensive validation with both new and PRISM benchmarks demonstrates that the approach scales very well in scenarios where symbolic algorithms fail to do so.
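To illustrate the general idea described in the abstract, the following is a minimal sketch in Python, not the authors' implementation: a toy MDP, a randomized (probabilistic) scheduler that resolves nondeterminism, a reinforcement-learning-style update that reweights actions appearing on traces satisfying a bounded reachability property, and a final Monte Carlo estimation pass on the induced Markov chain. The MDP, the update rule, and all constants (rounds, sample sizes, learning rate) are hypothetical choices for illustration only.

```python
import random
from collections import defaultdict

# Hypothetical toy MDP: states 0..3, goal state 3; each (state, action)
# maps to a probability distribution over successor states.
MDP = {
    0: {"a": [(0.5, 1), (0.5, 0)], "b": [(0.9, 2), (0.1, 0)]},
    1: {"a": [(0.7, 3), (0.3, 0)], "b": [(1.0, 1)]},
    2: {"a": [(0.2, 3), (0.8, 2)], "b": [(0.5, 1), (0.5, 2)]},
    3: {"a": [(1.0, 3)], "b": [(1.0, 3)]},          # goal is absorbing
}
GOAL, HORIZON = 3, 6                                 # bounded property: reach goal within 6 steps

def sample_successor(dist):
    r, acc = random.random(), 0.0
    for p, s in dist:
        acc += p
        if r < acc:
            return s
    return dist[-1][1]

def simulate(scheduler):
    """Sample one bounded trace under a randomized scheduler;
    return (property satisfied?, list of visited (state, action) pairs)."""
    state, path = 0, []
    for _ in range(HORIZON):
        if state == GOAL:
            return True, path
        actions, weights = zip(*scheduler[state].items())
        action = random.choices(actions, weights=weights)[0]
        path.append((state, action))
        state = sample_successor(MDP[state][action])
    return state == GOAL, path

# Start from the uniform randomized scheduler.
scheduler = {s: {a: 1.0 for a in acts} for s, acts in MDP.items()}

# Reinforcement-learning-style improvement rounds: reward (state, action)
# pairs that appear on satisfying traces, then shift the scheduler's
# action weights toward them.
for _ in range(20):                                  # improvement rounds
    counts = defaultdict(float)
    for _ in range(500):                             # samples per round
        ok, path = simulate(scheduler)
        if ok:
            for sa in path:
                counts[sa] += 1.0
    for s, acts in scheduler.items():
        total = sum(counts[(s, a)] for a in acts)
        if total > 0:
            for a in acts:
                acts[a] = 0.9 * acts[a] + 0.1 * (counts[(s, a)] / total)

# SMC phase: the fixed scheduler induces a fully probabilistic Markov
# chain; estimate the satisfaction probability by sampling it.
N = 10_000
estimate = sum(simulate(scheduler)[0] for _ in range(N)) / N
print(f"Estimated probability of reaching the goal within {HORIZON} steps: {estimate:.3f}")
```

The actual algorithm in the paper works with BLTL properties, provides provable improvement guarantees, and uses statistical tests rather than a plain sample mean; the sketch only conveys the two-phase structure of scheduler improvement followed by SMC on the resulting Markov chain.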
INDEX TERMS
Markov processes, Probabilistic logic, Optimization, Probability, Heuristic algorithms, Computational modeling, Bounded Linear Temporal Logic, Statistical Model Checking, Markov Decision Processes, Approximate Planning
CITATION

D. Henriques, J. G. Martins, P. Zuliani, A. Platzer and E. M. Clarke, "Statistical Model Checking for Markov Decision Processes," International Conference on Quantitative Evaluation of Systems (QEST), London, United Kingdom, 2012, pp. 84-93.
doi:10.1109/QEST.2012.19