Book contents
- Frontmatter
- Contents
- Preface
- Introduction
- PART I Entropy in ergodic theory
- PART II Entropy in topological dynamics
- PART III Entropy theory for operators
- 11 Measure-theoretic entropy of stochastic operators
- 12 Topological entropy of a Markov operator
- 13 Open problems in operator entropy
- Appendix A Toolbox
- Appendix B Conditional S–M–B
- List of symbols
- References
- Index
13 - Open problems in operator entropy
from PART III - Entropy theory for operators
Published online by Cambridge University Press: 07 October 2011
Summary
There are clearly some gaps in the theory of entropy for operators, at least as long as we seek similarities with the analogous theory for dynamical systems. One can hope that these similarities reach further than we currently know.
Questions on doubly stochastic operators
In the entropy theory of doubly stochastic operators, the fundamental missing ingredient is a relevant information theory. The notion of operator entropy is created without reference to any reasonable notion of an information function. Clearly, it would be most desirable that such a function depend on the family of functions J and be defined directly on the phase space X; however, a compromise solution, with the function defined on the product X × [0, 1], also seems acceptable. In any case, the static entropy should be the integral of the information function with respect to the appropriate measure (µ or µ × λ, respectively). Needless to say, the notion should coincide with the classical one for a family of characteristic functions of a partition. Of course, the best justification of such a notion would be an analog (generalization) of the Shannon–McMillan–Breiman Theorem. Let us verbalize the problem:
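In the classical case the relationship to be generalized is concrete: for a finite partition P, the information function is I_P(x) = −log µ(P(x)), where P(x) denotes the cell containing x, and the static entropy satisfies H(P) = ∫ I_P dµ = −Σ_A µ(A) log µ(A). The following is a minimal numerical sketch of this identity on a finite phase space; the measure `mu` and the partition are illustrative choices, not taken from the text.

```python
import math

# Hypothetical finite phase space X = {0, ..., 4} with probability measure mu.
mu = [0.4, 0.25, 0.15, 0.1, 0.1]

# A partition P of X, given by its cells.
partition = [{0, 1}, {2, 3}, {4}]

def cell_measure(cell):
    """mu(A) for a cell A of the partition."""
    return sum(mu[x] for x in cell)

def information(x):
    """Classical information function: I_P(x) = -log mu(P(x))."""
    cell = next(c for c in partition if x in c)
    return -math.log(cell_measure(cell))

# Static entropy computed as the integral of the information function:
H_via_information = sum(mu[x] * information(x) for x in range(len(mu)))

# Classical Shannon entropy of the partition, computed cell by cell:
H_classical = -sum(cell_measure(c) * math.log(cell_measure(c))
                   for c in partition)

# The two quantities agree, illustrating H(P) = integral of I_P dmu.
```

Question 13.1.1 below asks whether an analogous information function exists when the characteristic functions of a partition are replaced by a general family of functions J.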
Question 13.1.1 Is there a meaningful notion of an information function with respect to a family of functions, such that the static entropy is its integral? Does a generalization of the Shannon–McMillan–Breiman Theorem hold for doubly stochastic operators?
- Entropy in Dynamical Systems, pp. 344–346. Publisher: Cambridge University Press. Print publication year: 2011.