Issue No. 09 - September (1996 vol. 18)
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/34.537344
<p><b>Abstract</b>—Given a Bayesian network of discrete random variables with a hyper-Dirichlet prior, a method is proposed for assigning Dirichlet priors to the conditional probabilities of structurally different networks. The method defines a distance measure between <it>priors</it> that is minimized during the assignment process. Intuitively, one would expect that if two models' priors are to qualify as 'close' in some sense, then their posteriors should also remain close after an observation. However, one does not know in advance what will be observed next. Thus we are led to propose an expectation of Kullback-Leibler distances over all possible next observations to define a measure of distance between priors. In conjunction with the additional assumptions of <it>global</it> and <it>local</it> independence of the parameters [<ref rid="bibi090115" type="bib">15</ref>], a number of theorems emerge which are usually taken as reasonable assumptions in the Bayesian network literature. The method is compared to the 'expansion and contraction' algorithm of [<ref rid="bibi090114" type="bib">14</ref>], and is also contrasted with the results obtained in [<ref rid="bibi09017" type="bib">7</ref>], which employs the additional assumption of likelihood equivalence, an assumption not made here. A simple example illustrates the technique.</p>
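The abstract's core quantity can be sketched for the simplest case of a single discrete variable: the expectation, over the predictive distribution of the next observation, of the Kullback-Leibler distance between the two updated posteriors. This is a minimal illustrative sketch only, not the paper's full network-level construction; the function names and the restriction to one variable are assumptions made here for illustration.

```python
import numpy as np
from scipy.special import gammaln, digamma

def kl_dirichlet(a, b):
    """KL divergence KL(Dir(a) || Dir(b)) between two Dirichlet densities."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    a0, b0 = a.sum(), b.sum()
    return (gammaln(a0) - gammaln(a).sum()
            - gammaln(b0) + gammaln(b).sum()
            + np.dot(a - b, digamma(a) - digamma(a0)))

def expected_posterior_kl(a, b):
    """Expected KL between posteriors after one observation.

    The next outcome k is drawn from the predictive distribution under
    prior a (i.e. a_k / sum(a)); observing k adds one count to component
    k of both priors, and the KLs of the resulting posteriors are averaged.
    """
    a, b = np.asarray(a, float), np.asarray(b, float)
    pred = a / a.sum()                        # predictive prob. of each outcome
    total = 0.0
    for k, pk in enumerate(pred):
        e = np.zeros_like(a)
        e[k] = 1.0                            # one new count on outcome k
        total += pk * kl_dirichlet(a + e, b + e)
    return total
```

Under this sketch, the compatible prior for a second model would be the `b` minimizing `expected_posterior_kl(a, b)`; when the two priors coincide the measure is zero, as a distance should be.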
Bayesian networks, Dirichlet priors, Kullback-Leibler distance, local independence, global independence.
Robert G. Cowell, "On Compatible Priors for Bayesian Networks", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 18, no. 9, pp. 901-911, September 1996, doi:10.1109/34.537344