Book contents
- Frontmatter
- Contents
- Preface
- 1 Introduction
- 2 Propositional Logic
- 3 Probability Calculus
- 4 Bayesian Networks
- 5 Building Bayesian Networks
- 6 Inference by Variable Elimination
- 7 Inference by Factor Elimination
- 8 Inference by Conditioning
- 9 Models for Graph Decomposition
- 10 Most Likely Instantiations
- 11 The Complexity of Probabilistic Inference
- 12 Compiling Bayesian Networks
- 13 Inference with Local Structure
- 14 Approximate Inference by Belief Propagation
- 15 Approximate Inference by Stochastic Sampling
- 16 Sensitivity Analysis
- 17 Learning: The Maximum Likelihood Approach
- 18 Learning: The Bayesian Approach
- A Notation
- B Concepts from Information Theory
- C Fixed Point Iterative Methods
- D Constrained Optimization
- Bibliography
- Index
13 - Inference with Local Structure
Published online by Cambridge University Press: 23 February 2011
Summary
We discuss in this chapter computational techniques for exploiting certain properties of network parameters, allowing one to perform inference efficiently in some situations where the network treewidth can be quite large.
Introduction
We discussed in Chapters 6–8 two paradigms for probabilistic inference based on elimination and conditioning, showing how they lead to algorithms whose time and space complexity are exponential in the network treewidth. These algorithms are often called structure-based since their performance is driven by the network structure and is independent of the specific values attained by network parameters. We also presented in Chapter 11 some CNF encodings of Bayesian networks, allowing us to reduce probabilistic inference to some well-known CNF tasks. The resulting CNFs were likewise independent of the specific values of network parameters and are therefore structure-based as well.
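To make the model-counting view behind such reductions concrete, here is a minimal Python sketch (an illustrative toy, not the book's actual encoding): for a two-variable network A → B with hypothetical parameter values, the probability of evidence equals the sum, over all complete instantiations consistent with the evidence, of the product of CPT parameters each instantiation selects.

```python
from itertools import product

# A tiny network A -> B with binary variables (hypothetical parameters).
# theta_a[a] = Pr(A=a); theta_b[a][b] = Pr(B=b | A=a)
theta_a = {0: 0.6, 1: 0.4}
theta_b = {0: {0: 0.9, 1: 0.1},
           1: {0: 0.2, 1: 0.8}}

def weight(a, b):
    # Weight of a complete instantiation: the product of the CPT
    # parameters it selects (the weighted-model-counting view).
    return theta_a[a] * theta_b[a][b]

def prob(evidence):
    # Pr(evidence) = sum of weights of all complete instantiations
    # ("models") consistent with the evidence.
    total = 0.0
    for a, b in product([0, 1], repeat=2):
        inst = {"A": a, "B": b}
        if all(inst[var] == val for var, val in evidence.items()):
            total += weight(a, b)
    return total

print(prob({"B": 1}))  # Pr(B=1) = 0.6*0.1 + 0.4*0.8 = 0.38
```

The brute-force enumeration here is exponential and serves only to illustrate the reduction; the encodings discussed in the book instead produce a CNF (typically with indicator and parameter variables) that is handed to a model counter, whose performance can then benefit from the parameter values themselves.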
However, the performance of inference algorithms can be enhanced considerably if one exploits the specific values of network parameters. The properties of network parameters that lend themselves to such exploitation are known as parametric or local structure. This type of structure typically manifests in networks involving logical constraints, context-specific independence, or local models of interaction, such as the noisy-or model discussed in Chapter 5.
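As a concrete instance of local structure, the following sketch (illustrative Python with hypothetical names, assuming the common leaky noisy-or parameterization) builds a noisy-or CPT: the full table over n parents has 2^n rows, yet it is induced by only n + 1 parameters, which is exactly the kind of regularity the techniques in this chapter exploit.

```python
from itertools import product

def noisy_or_cpt(fail_probs, leak=0.0):
    """Build the CPT Pr(Y=1 | parents) for a noisy-or model.

    fail_probs[i] is the probability that an active parent i *fails*
    to cause Y; `leak` is the probability that Y is caused by something
    outside the model. The full CPT over n parents (2**n rows) is
    determined by only n + 1 parameters -- a form of local structure.
    """
    n = len(fail_probs)
    cpt = {}
    for parents in product([0, 1], repeat=n):
        fail = 1.0 - leak
        for active, q in zip(parents, fail_probs):
            if active:
                fail *= q
        cpt[parents] = 1.0 - fail  # Pr(Y=1 | this parent instantiation)
    return cpt

# Three causes with illustrative failure probabilities 0.1, 0.2, 0.3.
cpt = noisy_or_cpt([0.1, 0.2, 0.3], leak=0.05)
print(cpt[(1, 0, 1)])  # Pr(Y=1 | first and third causes active) = 0.9715
```

A structure-based algorithm would treat this CPT as an arbitrary table of 2^n numbers; algorithms that exploit local structure can instead work with the compact parameterization directly.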
In this chapter, we present a number of computational techniques for exploiting local structure that can be viewed as extensions of inference algorithms discussed in earlier chapters. We start in Section 13.2 with an overview of local structure and the impact it can have on the complexity of inference.
Modeling and Reasoning with Bayesian Networks, pp. 313–339. Cambridge University Press. Print publication year: 2009.