Book contents
- Frontmatter
- Contents
- Introduction
- 1 Hidden Markov processes in the context of symbolic dynamics
- 2 On the preservation of Gibbsianness under symbol amalgamation
- 3 A note on a complex Hilbert metric with application to domain of analyticity for entropy rate of hidden Markov processes
- 4 Bounds on the entropy rate of binary hidden Markov processes
- 5 Entropy rate for hidden Markov chains with rare transitions
- 6 The capacity of finite-state channels in the high-noise regime
- 7 Computing entropy rates for hidden Markov processes
- 8 Factors of Gibbs measures for full shifts
- 9 Thermodynamics of hidden Markov processes
6 - The capacity of finite-state channels in the high-noise regime
Published online by Cambridge University Press: 05 June 2011
Summary
Abstract. This article considers the derivative of the entropy rate of a hidden Markov process with respect to the observation probabilities. The main result is a compact formula for the derivative that can be evaluated easily by Monte Carlo methods. The formula is applied to the problem of computing the capacity of a finite-state channel (FSC); in the high-noise regime it admits a simple closed-form expression that enables a series expansion of the capacity of an FSC. This expansion is evaluated for a binary symmetric channel under a (0, 1) run-length-limited constraint and for an intersymbol-interference channel with Gaussian noise.
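The entropy rate that the abstract refers to can itself be estimated by Monte Carlo: by the Shannon–McMillan–Breiman theorem, −(1/n) log₂ P(y₁…yₙ) converges to the entropy rate, and the likelihood is computable with the standard forward recursion. The following is a minimal sketch, not the chapter's method; the two-state transition matrix and crossover probability are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed example: a two-state Markov chain observed through a
# binary symmetric channel (BSC) with crossover probability eps.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])      # hidden-chain transition matrix (assumed)
eps = 0.3                       # BSC crossover probability (assumed)
B = np.array([[1 - eps, eps],   # B[s, y] = P(observe y | hidden state s)
              [eps, 1 - eps]])
pi = np.array([2/3, 1/3])       # stationary distribution of P

def sample_hmp(n):
    """Sample n observations of the hidden Markov process."""
    s = rng.choice(2, p=pi)
    ys = np.empty(n, dtype=int)
    for t in range(n):
        ys[t] = rng.random() < (eps if s == 0 else 1 - eps)
        s = rng.choice(2, p=P[s])
    return ys

def entropy_rate_estimate(ys):
    """-(1/n) log2 P(y_1..y_n) via the scaled forward recursion."""
    loglik = 0.0
    alpha = pi * B[:, ys[0]]            # alpha(s) ∝ P(y_1, S_1 = s)
    loglik += np.log2(alpha.sum())
    alpha /= alpha.sum()
    for y in ys[1:]:
        alpha = (alpha @ P) * B[:, y]   # predict, then weight by emission
        loglik += np.log2(alpha.sum())  # accumulate log of the normalizer
        alpha /= alpha.sum()
    return -loglik / len(ys)

ys = sample_hmp(100_000)
h = entropy_rate_estimate(ys)           # converges to the entropy rate a.s.
print(f"estimated entropy rate: {h:.3f} bits/symbol")
```

Note that for this binary process the estimate is squeezed between H(eps) ≈ 0.881 (perfect state knowledge) and 1 bit per symbol.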
Introduction
The hidden Markov process
A hidden Markov process (HMP) is a discrete-time finite-state Markov chain (FSMC) observed through a memoryless channel. The HMP has become ubiquitous in statistics, computer science, and electrical engineering because it approximates many processes well using a dependency structure that admits efficient algorithms. While the roots of the HMP lie in the “grouped Markov chains” of Harris [20] and the “functions of a finite-state Markov chain” of Blackwell [8], the HMP first appears in full generality as the output process of a finite-state channel (FSC) [9]. It was the statistical inference algorithm of Baum and Petrie [5], however, that cemented the HMP's place in history and is responsible for great advances in fields such as speech recognition and biological sequence analysis [22, 24]. The survey of HMPs by Ephraim and Merhav gives an excellent summary of what is known in this area [12].
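The definition above can be sketched directly in code: a hidden chain X_t evolves by a transition matrix, and the observed process Y_t is X_t passed through a memoryless channel. The parameters below are assumed for illustration, not taken from the chapter; the channel is a binary symmetric one that flips each symbol independently.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed two-state hidden chain and memoryless binary symmetric channel.
P = np.array([[0.9, 0.1],   # hidden-chain transition matrix (assumed)
              [0.2, 0.8]])
eps = 0.1                   # channel flips each symbol w.p. eps (assumed)

def sample_hmp(n):
    """Return (hidden states X, observations Y) of length n."""
    xs = np.empty(n, dtype=int)
    ys = np.empty(n, dtype=int)
    x = rng.integers(2)
    for t in range(n):
        xs[t] = x
        ys[t] = x ^ (rng.random() < eps)   # memoryless channel: flip w.p. eps
        x = rng.choice(2, p=P[x])          # Markov transition
    return xs, ys

xs, ys = sample_hmp(10_000)
print(f"empirical flip rate: {(xs != ys).mean():.3f}")  # ≈ eps
```

The key structural point is visible in the sampler: Y_t depends on the past only through the current hidden state X_t, which is exactly the dependency structure that makes forward–backward-style algorithms efficient.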
- Type: Chapter
- In: Entropy of Hidden Markov Processes and Connections to Dynamical Systems: Papers from the Banff International Research Station Workshop, pp. 179–222
- Publisher: Cambridge University Press
- Print publication year: 2011