Book contents
- Frontmatter
- Contents
- List of figures
- List of tables
- Preface
- Foreword
- Acknowledgments
- 1 Introduction and motivation
- 2 Stochastic resonance: its definition, history, and debates
- 3 Stochastic quantization
- 4 Suprathreshold stochastic resonance: encoding
- 5 Suprathreshold stochastic resonance: large N encoding
- 6 Suprathreshold stochastic resonance: decoding
- 7 Suprathreshold stochastic resonance: large N decoding
- 8 Optimal stochastic quantization
- 9 SSR, neural coding, and performance tradeoffs
- 10 Stochastic resonance in the auditory system
- 11 The future of stochastic resonance and suprathreshold stochastic resonance
- Appendix 1 Suprathreshold stochastic resonance
- Appendix 2 Large N suprathreshold stochastic resonance
- Appendix 3 Suprathreshold stochastic resonance decoding
- References
- List of abbreviations
- Index
- Biographies
5 - Suprathreshold stochastic resonance: large N encoding
Published online by Cambridge University Press: 23 October 2009
Summary
This chapter discusses the behaviour of the mutual information and channel capacity in the suprathreshold stochastic resonance model as the number of threshold elements, N, becomes large. The results in Chapter 4 indicate that the mutual information and channel capacity might converge to simple functions of N for large N. The current chapter finds that accurate approximations do indeed exist in the large N limit. Using a relationship between mutual information and Fisher information, it is shown that capacity is achieved either (i) when the signal distribution is the Jeffreys prior, a distribution that depends entirely on the noise distribution, or (ii) when the noise distribution depends on the signal distribution via a cosine relationship. These results provide theoretical verification and justification for previous work in both computational neuroscience and electronics.
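As orientation for the result stated above, the following is a sketch of the standard large-N relationship between mutual information and Fisher information on which this kind of argument rests; the notation here (P_x for the signal density, J(x) for the Fisher information of the N-device output about the input x) is chosen for this sketch and need not match the book's symbols.

\begin{align}
  I(x,y) &\approx H(x) - \frac{1}{2}\int P_x(x)\,
           \log_2\!\left(\frac{2\pi e}{J(x)}\right)\mathrm{d}x ,\\
  P_x^{\mathrm{opt}}(x) &= \frac{\sqrt{J(x)}}{\int\sqrt{J(s)}\,\mathrm{d}s}
           \quad\text{(the Jeffreys prior)},\\
  C &\approx \log_2\!\left(\int\!\sqrt{\frac{J(x)}{2\pi e}}\;\mathrm{d}x\right).
\end{align}

Because the Fisher information of N independent, identically noisy threshold devices scales linearly with N, the capacity expression above grows as (1/2) log2 N plus a constant, which is the sense in which capacity can only grow logarithmically with the number of devices.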
Introduction
Section 4.4 of Chapter 4 presents results for the mutual information and channel capacity of the suprathreshold stochastic resonance (SSR) model shown in Fig. 4.1. Recall that σ is the ratio of the noise standard deviation to the signal standard deviation. For matched signal and noise distributions and a large number of threshold devices, N, the optimal value of σ, that is, the value of σ that maximizes the mutual information and achieves channel capacity, appears to approach a constant asymptotically as N increases. This suggests that, for large N, analytical expressions might exist for both the optimal noise intensity and the channel capacity.
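To make the setup concrete, here is a minimal numerical sketch, not taken from the book, of the SSR model with all thresholds at the signal mean (zero), a unit-variance Gaussian signal, and i.i.d. zero-mean Gaussian noise of standard deviation σ. The function name ssr_mutual_info, the grid limits, and the example array sizes are illustrative choices. It estimates the mutual information between the input and the output count for a sweep of σ, so that the location of the peak can be compared across N.

```python
import numpy as np
from scipy.stats import norm, binom

def plogp(p):
    """Return p * log2(p) elementwise, with the convention 0 * log2(0) = 0."""
    out = np.zeros_like(p)
    mask = p > 0
    out[mask] = p[mask] * np.log2(p[mask])
    return out

def ssr_mutual_info(N, sigma, n_grid=2001):
    """Mutual information I(x, y) in bits for the SSR model: N threshold
    devices with thresholds at the signal mean (zero), unit-variance Gaussian
    signal, i.i.d. zero-mean Gaussian noise of standard deviation sigma, and
    output y = number of devices whose noisy input exceeds its threshold."""
    # Discretise the signal density on a grid wide enough to hold its mass.
    x = np.linspace(-6.0, 6.0, n_grid)
    px = norm.pdf(x)
    px /= np.trapz(px, x)                        # renormalise after truncation

    # P(single device fires | x) = P(x + noise > 0) = Phi(x / sigma).
    p1 = norm.cdf(x / sigma)

    # Conditional output distribution P(y = n | x): binomial in n for each x.
    n = np.arange(N + 1)
    pyx = binom.pmf(n[:, None], N, p1[None, :])  # shape (N + 1, n_grid)

    # Output entropy minus conditional output entropy gives I(x, y).
    py = np.trapz(pyx * px[None, :], x, axis=1)
    H_y = -plogp(py).sum()
    H_y_given_x = np.trapz(px * (-plogp(pyx).sum(axis=0)), x)
    return H_y - H_y_given_x

if __name__ == "__main__":
    # The sigma that maximises I(x, y) settles towards a constant as N grows.
    sigmas = np.linspace(0.05, 2.0, 40)
    for N in (15, 63, 255):
        info = [ssr_mutual_info(N, s) for s in sigmas]
        k = int(np.argmax(info))
        print(f"N={N:4d}: peak I = {info[k]:.3f} bits at sigma = {sigmas[k]:.2f}")
```

The grid-based integration is only an approximation; a finer grid or wider limits can be used to check convergence before trusting the position of the peak for any particular N.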
- Type: Chapter
- Information: Stochastic Resonance: From Suprathreshold Stochastic Resonance to Stochastic Signal Quantization, pp. 120-166
- Publisher: Cambridge University Press
- Print publication year: 2008