Book contents
- Frontmatter
- Contents
- Foreword
- Introduction
- Part I Steps to a deterministic interpretation of chaotic signals
- Part II The ergodic theory of chaos
- 7 Invariant probability measures
- 8 Physical measures
- 9 Characteristic exponents
- 10 Invariant manifolds
- 11 Axiom A and structural stability
- 12 Entropy
- 13 Dimensions
- 14 Resonances
- 15 Conclusions
- References
- Bibliography
- Index
Summary
From previous discussions we know that there exist deterministic dynamical systems in which trajectories emerging from nearby initial conditions diverge exponentially. Because of this sensitivity, any uncertainty about the seemingly insignificant digits in the sequence of numbers defining an initial condition spreads with time towards the significant digits, leading to chaotic behavior. The information we have about the state of the system therefore changes with time. This change can be thought of as a creation of information: two initial conditions that are different but indistinguishable within a certain precision evolve into distinguishable states after a finite time.
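A standard illustration of this mechanism, not spelled out in the summary itself, is the doubling map on the unit interval, where each iteration shifts the binary expansion of the initial condition one place to the left:

$$x_{n+1} = 2 x_n \pmod 1, \qquad x_0 = 0.b_1 b_2 b_3 \ldots \ (\text{binary}) \;\Longrightarrow\; x_n = 0.b_{n+1} b_{n+2} b_{n+3} \ldots$$

After $n$ iterations the leading digit of $x_n$ is the digit $b_{n+1}$ of the initial condition, so two initial conditions agreeing in their first $n$ binary digits, hence indistinguishable at precision $2^{-n}$, typically become distinguishable after about $n$ iterations.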
If f is a transformation preserving a measure ρ, then the Kolmogorov-Sinai invariant, or entropy, denoted by h(ρ), measures the asymptotic rate of creation of information obtained by iterating f.
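For reference, the standard construction behind this statement (not reproduced in the summary) starts from a finite measurable partition $\mathcal{A}$ of the phase space and sets

$$h(\rho, \mathcal{A}) = \lim_{n \to \infty} \frac{1}{n}\, H\!\left(\mathcal{A} \vee f^{-1}\mathcal{A} \vee \cdots \vee f^{-(n-1)}\mathcal{A}\right), \qquad h(\rho) = \sup_{\mathcal{A}} h(\rho, \mathcal{A}),$$

where $H(\mathcal{B}) = -\sum_{B \in \mathcal{B}} \rho(B) \log \rho(B)$ and $\vee$ denotes the common refinement of partitions.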
This concept was introduced by Kolmogorov in 1958, in the search for a quantity invariant under isomorphism. The problem of deciding when two measure-preserving transformations are equivalent, or isomorphic, is one of the fundamental problems of abstract ergodic theory. For instance, the question of whether the two Bernoulli shifts {½, ½} and {⅓, ⅓, ⅓} are isomorphic remained unsolved for many years, until it was shown that they have different entropies and hence are nonisomorphic.
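The entropy computation behind this resolution is short: for the Bernoulli shift with weights $(p_1, \ldots, p_m)$ the Kolmogorov-Sinai entropy is $-\sum_i p_i \log p_i$, so

$$h\!\left(\tfrac12, \tfrac12\right) = \log 2 \qquad \text{and} \qquad h\!\left(\tfrac13, \tfrac13, \tfrac13\right) = \log 3,$$

and since entropy is preserved by isomorphism, the two shifts cannot be isomorphic.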
A notion of entropy (which can be applied to shifts) was first introduced by Shannon in 1948, in a famous work that originated information theory. Suppose we perform an experiment with m possible outcomes, for example rolling a die with m faces, and let $p_1, p_2, \ldots, p_m$ be the probabilities of the different outcomes.
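The quantity Shannon attached to such an experiment (not given explicitly in this summary) is the entropy

$$H(p_1, \ldots, p_m) = -\sum_{i=1}^{m} p_i \log p_i,$$

which for a fair die ($p_i = 1/m$ for all $i$) takes its maximal value $H = \log m$.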
- Type: Chapter
- Chaotic Evolution and Strange Attractors, pp. 69-77. Publisher: Cambridge University Press. Print publication year: 1989.