Part of Advances in Neural Information Processing Systems 11 (NIPS 1998)
Michael Kearns, Lawrence Saul
We study probabilistic inference in large, layered Bayesian networks represented as directed acyclic graphs. We show that the intractability of exact inference in such networks does not preclude their effective use. We give algorithms for approximate probabilistic inference that exploit averaging phenomena occurring at nodes with large numbers of parents. We show that these algorithms compute rigorous lower and upper bounds on marginal probabilities of interest, prove that these bounds become exact in the limit of large networks, and provide rates of convergence.
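The abstract describes the approach only at a high level. The sketch below is an illustrative example of the general averaging idea, not the paper's exact algorithm: for a sigmoid node with many weakly weighted parents, the weighted sum of parent values concentrates around its mean, so evaluating the sigmoid at the mean plus or minus a deviation term, combined with a standard Hoeffding tail bound, yields rigorous lower and upper bounds on the node's marginal probability. The network setup, the function and parameter names, and the specific use of Hoeffding's inequality here are assumptions made for illustration.

```python
import numpy as np

def marginal_bounds(p_parents, weights, bias, delta=None):
    """Illustrative sketch (assumed setup, not the paper's construction):
    bound P(child = 1) for a sigmoid node whose input is a weighted sum
    of many independent binary parents with known marginals."""
    p = np.asarray(p_parents, dtype=float)   # P(parent_i = 1)
    w = np.asarray(weights, dtype=float)     # incoming weights
    n = len(w)
    mean_input = bias + np.dot(w, p)         # expected weighted input
    sum_sq = np.sum(w ** 2)                  # each term w_i*parent_i has range |w_i|

    if delta is None:
        # deviation scale ~ sqrt(sum w_i^2 * log n), so the tail mass is O(1/n^2)
        delta = np.sqrt(sum_sq * np.log(max(n, 2)))

    sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

    # Hoeffding: P(|input - mean| >= delta) <= 2 exp(-2 delta^2 / sum w_i^2)
    tail = min(2.0 * np.exp(-2.0 * delta ** 2 / sum_sq), 1.0)

    # On the high-probability event the input lies in [mean - delta, mean + delta];
    # assume the worst (sigmoid = 0 or 1) on the remaining tail event.
    lower = (1.0 - tail) * sigmoid(mean_input - delta)
    upper = (1.0 - tail) * sigmoid(mean_input + delta) + tail
    return max(lower, 0.0), min(upper, 1.0)

# Example: 500 parents with small random weights, so no single parent dominates
rng = np.random.default_rng(0)
n = 500
lo, hi = marginal_bounds(p_parents=rng.uniform(0.2, 0.8, n),
                         weights=rng.normal(0.0, 1.0 / np.sqrt(n), n),
                         bias=0.0)
print("bounds on P(child = 1):", lo, hi)
```

As the number of parents grows (with individual weights shrinking accordingly), the deviation term and the tail mass both vanish, so the lower and upper bounds pinch together, mirroring the abstract's claim that the bounds become exact in the limit of large networks.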