Book contents
- Frontmatter
- Contents
- Preface
- 1 Introduction
- 2 Propositional Logic
- 3 Probability Calculus
- 4 Bayesian Networks
- 5 Building Bayesian Networks
- 6 Inference by Variable Elimination
- 7 Inference by Factor Elimination
- 8 Inference by Conditioning
- 9 Models for Graph Decomposition
- 10 Most Likely Instantiations
- 11 The Complexity of Probabilistic Inference
- 12 Compiling Bayesian Networks
- 13 Inference with Local Structure
- 14 Approximate Inference by Belief Propagation
- 15 Approximate Inference by Stochastic Sampling
- 16 Sensitivity Analysis
- 17 Learning: The Maximum Likelihood Approach
- 18 Learning: The Bayesian Approach
- A Notation
- B Concepts from Information Theory
- C Fixed Point Iterative Methods
- D Constrained Optimization
- Bibliography
- Index
6 - Inference by Variable Elimination
Published online by Cambridge University Press: 23 February 2011
Summary
We present in this chapter one of the simplest methods for general inference in Bayesian networks, which is based on the principle of variable elimination: a process by which we successively remove variables from a Bayesian network while maintaining its ability to answer queries of interest.
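To make the principle concrete, consider a small chain network A → B → C (an illustrative example, not one taken from the chapter). Eliminating variable B amounts to replacing the factors that mention B by their product summed over B, leaving a representation over A and C that still answers queries about A and C:

$$
\Pr(a, c) \;=\; \sum_{b} \Pr(a)\,\Pr(b \mid a)\,\Pr(c \mid b)
\;=\; \Pr(a)\,\sum_{b} \Pr(b \mid a)\,\Pr(c \mid b).
$$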
Introduction
We saw in Chapter 5 how a number of real-world problems can be solved by posing queries with respect to Bayesian networks. We also identified four types of queries: probability of evidence, prior and posterior marginals, most probable explanation (MPE), and maximum a posteriori hypothesis (MAP). We present in this chapter one of the simplest inference algorithms for answering these types of queries, which is based on the principle of variable elimination. Our interest here will be restricted to computing the probability of evidence and marginal distributions, leaving the discussion of MPE and MAP queries to Chapter 10.
We start in Section 6.2 by introducing the process of eliminating a variable. This process relies on some basic operations on a class of functions known as factors, which we discuss in Section 6.3. We then introduce the variable elimination algorithm in Section 6.4 and see how it can be used to compute prior marginals in Section 6.5. The performance of variable elimination will critically depend on the order in which we eliminate variables. We discuss this issue in Section 6.6, where we also provide some heuristics for choosing good elimination orders.
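As a rough sketch of what these factor operations look like (a minimal illustration under assumed data structures; the names `Factor`, `multiply`, `sum_out`, `eliminate`, and the example network are hypothetical and not the chapter's own notation or pseudocode), a factor can be represented as a table mapping variable assignments to numbers, with multiplication and summing out as the two basic operations:

```python
# Minimal sketch of variable elimination for computing a prior marginal.
# All names and the example network are illustrative assumptions.
from itertools import product


class Factor:
    def __init__(self, variables, domains, values):
        # variables: tuple of variable names, e.g. ("A", "B")
        # domains: dict mapping each variable to its list of values
        # values: dict mapping assignment tuples (in `variables` order) to numbers
        self.variables = tuple(variables)
        self.domains = domains
        self.values = values


def multiply(f, g):
    """Return the product factor over the union of f's and g's variables."""
    variables = f.variables + tuple(v for v in g.variables if v not in f.variables)
    domains = {**f.domains, **g.domains}
    values = {}
    for assignment in product(*(domains[v] for v in variables)):
        row = dict(zip(variables, assignment))
        fv = f.values[tuple(row[v] for v in f.variables)]
        gv = g.values[tuple(row[v] for v in g.variables)]
        values[assignment] = fv * gv
    return Factor(variables, domains, values)


def sum_out(f, var):
    """Return the factor obtained by summing variable var out of f."""
    variables = tuple(v for v in f.variables if v != var)
    domains = {v: f.domains[v] for v in variables}
    values = {}
    for assignment, p in f.values.items():
        row = dict(zip(f.variables, assignment))
        key = tuple(row[v] for v in variables)
        values[key] = values.get(key, 0.0) + p
    return Factor(variables, domains, values)


def eliminate(factors, order):
    """Eliminate variables in the given order; return the product of what remains."""
    factors = list(factors)
    for var in order:
        mentioning = [f for f in factors if var in f.variables]
        if not mentioning:
            continue
        rest = [f for f in factors if var not in f.variables]
        prod = mentioning[0]
        for f in mentioning[1:]:
            prod = multiply(prod, f)
        factors = rest + [sum_out(prod, var)]
    result = factors[0]
    for f in factors[1:]:
        result = multiply(result, f)
    return result


# Example: a two-variable network A -> B with CPTs P(A) and P(B | A);
# eliminating A yields the prior marginal P(B).
dom = {"A": [True, False], "B": [True, False]}
pA = Factor(("A",), {"A": dom["A"]}, {(True,): 0.6, (False,): 0.4})
pBgA = Factor(("A", "B"), dom, {(True, True): 0.9, (True, False): 0.1,
                                (False, True): 0.2, (False, False): 0.8})
pB = eliminate([pA, pBgA], order=["A"])
print(pB.values)  # approximately {(True,): 0.62, (False,): 0.38}, up to rounding
```

Note that eliminating a variable only multiplies the factors that mention it, which is why the elimination order matters: a poor order forces the multiplication of many factors at once, creating large intermediate tables.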
- Type: Chapter
- Information: Modeling and Reasoning with Bayesian Networks, pp. 126-151
- Publisher: Cambridge University Press
- Print publication year: 2009