Book contents
- Frontmatter
- Contents
- Preface
- 1 Events and Probability
- 2 Discrete Random Variables and Expectation
- 3 Moments and Deviations
- 4 Chernoff Bounds
- 5 Balls, Bins, and Random Graphs
- 6 The Probabilistic Method
- 7 Markov Chains and Random Walks
- 8 Continuous Distributions and the Poisson Process
- 9 Entropy, Randomness, and Information
- 10 The Monte Carlo Method
- 11 Coupling of Markov Chains
- 12 Martingales
- 13 Pairwise Independence and Universal Hash Functions
- 14 Balanced Allocations
- Further Reading
- Index
2 - Discrete Random Variables and Expectation
Summary
In this chapter, we introduce the concepts of discrete random variables and expectation and then develop basic techniques for analyzing the expected performance of algorithms. We apply these techniques to computing the expected running time of the well-known Quicksort algorithm. In analyzing two versions of Quicksort, we demonstrate the distinction between the analysis of randomized algorithms, where the probability space is defined by the random choices made by the algorithm, and the probabilistic analysis of deterministic algorithms, where the probability space is defined by some probability distribution on the inputs.
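The randomized version of Quicksort can be sketched as follows. This is a minimal Python illustration, not the book's pseudocode; the list-based partitioning is chosen for clarity over in-place efficiency. The key point is that the pivot is chosen by the algorithm's own coin flips, so the expected O(n log n) comparison bound holds for every input.

```python
import random

def quicksort(items):
    """Sort a list by recursively partitioning around a uniformly random pivot.

    Because the pivot is random, the probability space is defined by the
    algorithm's choices, not by any distribution over the inputs.
    """
    if len(items) <= 1:
        return list(items)
    pivot = random.choice(items)
    smaller = [x for x in items if x < pivot]
    equal   = [x for x in items if x == pivot]
    larger  = [x for x in items if x > pivot]
    return quicksort(smaller) + equal + quicksort(larger)
```

A deterministic variant that always picks, say, the first element as the pivot sorts a *random* input in expected O(n log n) time but degrades to Θ(n²) on already-sorted inputs; the randomized version above avoids any such bad input.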
Along the way we define the Bernoulli, binomial, and geometric random variables, study the expected size of a simple branching process, and analyze the expectation of the coupon collector's problem – a probabilistic paradigm that reappears throughout the book.
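The coupon collector's expectation can be checked empirically. The sketch below (illustrative code, not from the book) simulates drawing coupons uniformly at random until all n types have been seen; the expected number of draws is n·H_n, where H_n = 1 + 1/2 + … + 1/n is the n-th harmonic number.

```python
import random

def coupons_needed(n, rng):
    """Draw coupons uniformly from n types until every type has been seen;
    return the total number of draws."""
    seen = set()
    draws = 0
    while len(seen) < n:
        seen.add(rng.randrange(n))
        draws += 1
    return draws

rng = random.Random(42)
n = 10
harmonic = sum(1.0 / k for k in range(1, n + 1))
expected = n * harmonic          # n * H_n; about 29.3 for n = 10
trials = 5000
average = sum(coupons_needed(n, rng) for _ in range(trials)) / trials
```

With a few thousand trials the sample average lands close to n·H_n, matching the analysis via a sum of geometric random variables.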
Random Variables and Expectation
When studying a random event, we are often interested in some value associated with the random event rather than in the event itself. For example, in tossing two dice we are often interested in the sum of the two dice rather than the separate value of each die. The sample space in tossing two dice consists of 36 events of equal probability, given by the ordered pairs of numbers {(1, 1), (1, 2), …, (6, 5), (6, 6)}. If the quantity we are interested in is the sum of the two dice, then we are interested in 11 events (of unequal probability): the 11 possible outcomes of the sum.
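The dice example can be made concrete by enumerating the sample space. The short Python sketch below tallies all 36 equally likely ordered pairs and derives the (unequal) probabilities of the 11 possible sums.

```python
from collections import Counter
from fractions import Fraction

# Enumerate the 36 equally likely ordered pairs and tally the sum of the dice.
counts = Counter(a + b for a in range(1, 7) for b in range(1, 7))

# Each ordered pair has probability 1/36, so P(sum = s) = counts[s] / 36.
probabilities = {s: Fraction(c, 36) for s, c in counts.items()}
# 11 possible sums (2 through 12); e.g. P(sum = 7) = 6/36 = 1/6,
# while P(sum = 2) = P(sum = 12) = 1/36.
```

This is exactly the random-variable view: the function mapping each of the 36 outcomes to its sum induces a new, non-uniform distribution on the 11 values.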
- Type: Chapter
- Information: Probability and Computing: Randomized Algorithms and Probabilistic Analysis, pp. 20–43. Publisher: Cambridge University Press. Print publication year: 2005.