Book contents
- Frontmatter
- Dedication
- Contents
- Preface to the Second Edition
- Preface to the First Edition
- Acknowledgments for the Second Edition
- Acknowledgments for the First Edition
- 1 Some Essential Notation
- 2 Signals, Integrals, and Sets of Measure Zero
- 3 The Inner Product
- 4 The Space L2 of Energy-Limited Signals
- 5 Convolutions and Filters
- 6 The Frequency Response of Filters and Bandlimited Signals
- 7 Passband Signals and Their Representation
- 8 Complete Orthonormal Systems and the Sampling Theorem
- 9 Sampling Real Passband Signals
- 10 Mapping Bits to Waveforms
- 11 Nyquist's Criterion
- 12 Stochastic Processes: Definition
- 13 Stationary Discrete-Time Stochastic Processes
- 14 Energy and Power in PAM
- 15 Operational Power Spectral Density
- 16 Quadrature Amplitude Modulation
- 17 Complex Random Variables and Processes
- 18 Energy, Power, and PSD in QAM
- 19 The Univariate Gaussian Distribution
- 20 Binary Hypothesis Testing
- 21 Multi-Hypothesis Testing
- 22 Sufficient Statistics
- 23 The Multivariate Gaussian Distribution
- 24 Complex Gaussians and Circular Symmetry
- 25 Continuous-Time Stochastic Processes
- 26 Detection in White Gaussian Noise
- 27 Noncoherent Detection and Nuisance Parameters
- 28 Detecting PAM and QAM Signals in White Gaussian Noise
- 29 Linear Binary Block Codes with Antipodal Signaling
- 30 The Radar Problem
- 31 A Glimpse at Discrete-Time Signal Processing
- 32 Intersymbol Interference
- A On the Fourier Series
- B On the Discrete-Time Fourier Transform
- C Positive Definite Functions
- D The Baseband Representation of Passband Stochastic Processes
- Bibliography
- Theorems Referenced by Name
- Abbreviations
- List of Symbols
- Index
21 - Multi-Hypothesis Testing
Published online by Cambridge University Press: 02 March 2017
Introduction
In Chapter 20 we discussed how to guess the outcome of a binary random variable. We now extend the discussion to random variables that take on more than two (but still finitely many) values. Statisticians call this problem "multi-hypothesis testing" to indicate that there may be more than two hypotheses. Rather than H, we now denote the random variable whose outcome we wish to guess by M. (In Chapter 20 we used H for "hypothesis"; now we use M for "message.") We denote the number of possible values that M can take by M and assume that M ≥ 2. (The case M = 2 corresponds to binary hypothesis testing.) As before, the "labels" are not important, and there is no loss of generality in assuming that M takes values in the set M = {1, …, M}. (In the binary case we used the traditional labels 0 and 1, but now we prefer 1, 2, …, M.)
The Setup
A random variable M takes values in the set M = {1, …, M}, where M ≥ 2, according to the prior

π_m = Pr[M = m], m ∈ M,

where

π_1, …, π_M ≥ 0,

and where

π_1 + ⋯ + π_M = 1.

We say that the prior is nondegenerate if

π_1, …, π_M > 0,

i.e., with the inequalities above being strict, so M can take on any value in M with positive probability. We say that the prior is uniform if

π_m = 1/M, m ∈ M.
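The properties above are easy to check numerically. The following is a minimal sketch (the function names are our own, not the book's) that represents a prior as a list `pi` with `pi[m-1] = Pr[M = m]` and tests validity, nondegeneracy, and uniformity:

```python
def is_valid_prior(pi, tol=1e-12):
    """Entries are nonnegative and sum to one."""
    return all(p >= 0 for p in pi) and abs(sum(pi) - 1.0) <= tol

def is_nondegenerate(pi):
    """Strict inequalities: every message has positive probability."""
    return all(p > 0 for p in pi)

def is_uniform(pi, tol=1e-12):
    """Every message equally likely: pi_m = 1/M."""
    M = len(pi)
    return all(abs(p - 1.0 / M) <= tol for p in pi)

prior = [0.5, 0.25, 0.25]   # a nondegenerate, nonuniform prior with M = 3
print(is_valid_prior(prior), is_nondegenerate(prior), is_uniform(prior))
# → True True False
```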
The observation is a random vector Y taking values in R^d. We assume that for each m ∈ M the distribution of Y conditional on M = m has the density

f_{Y|M=m}(·),

where f_{Y|M=m}(·) is a nonnegative Borel measurable function that integrates to one over R^d.
A guessing rule is a Borel measurable function φ: R^d → M from the space of possible observations R^d to the set of possible messages M. We think of φ(y_obs) as the guess we form after observing that Y = y_obs. The error probability associated with the guessing rule φ is given by

Pr[φ(Y) ≠ M].
Note that two sources of randomness determine whether we err or not: the realization of M and the generation of Y conditional on that realization. A guessing rule is said to be optimal if no other guessing rule achieves a lower probability of error.
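The setup can be made concrete with a small simulation. The sketch below is illustrative only (the densities and numbers are our own choices, not the book's): a scalar observation Y that, conditional on M = m, is Gaussian with mean mu[m-1] and unit variance. The guessing rule used is the maximum a posteriori (MAP) rule, which picks the m maximizing the product of the prior and the conditional density; this rule is optimal for the setup described here. The Monte Carlo loop exhibits both sources of randomness: M is drawn from the prior, and Y is then generated conditional on that realization.

```python
import math
import random

prior = [0.5, 0.25, 0.25]   # pi_m for m = 1, 2, 3
mu = [-2.0, 0.0, 2.0]       # conditional means (illustrative choice)

def density(y, m):
    """f_{Y|M=m}(y): unit-variance Gaussian centered at mu[m-1]."""
    return math.exp(-0.5 * (y - mu[m - 1]) ** 2) / math.sqrt(2 * math.pi)

def guess(y):
    """MAP guessing rule: the m maximizing pi_m * f_{Y|M=m}(y)."""
    return max(range(1, 4), key=lambda m: prior[m - 1] * density(y, m))

# Monte Carlo estimate of the error probability Pr[guess(Y) != M].
random.seed(0)
trials = 100_000
errors = 0
for _ in range(trials):
    m = random.choices([1, 2, 3], weights=prior)[0]  # draw M from the prior
    y = random.gauss(mu[m - 1], 1.0)                 # draw Y given M = m
    errors += (guess(y) != m)
print(f"estimated error probability: {errors / trials:.3f}")
```

Because the conditional means are well separated relative to the noise, the estimated error probability comes out well below the 1 − max_m π_m = 0.5 achievable by always guessing the most likely message without observing Y.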
- Type: Chapter
- Information: A Foundation in Digital Communication, pp. 441–467. Publisher: Cambridge University Press. Print publication year: 2017.