1 - Introductory Information Theory and the Brain
Published online by Cambridge University Press: 04 May 2010
Summary
Introduction
Learning and using a new technique always takes time. Even if the question initially seems very straightforward, technicalities inevitably intrude. Therefore, before a researcher decides to use the methods information theory provides, it is worth finding out whether this set of tools is appropriate for the task in hand.
In this chapter I will therefore provide only a few important formulae and no rigorous mathematical proofs (Cover and Thomas (1991) is excellent in this respect). Neither will I provide simple “how to” recipes (for the psychologist, even after nearly 40 years, Attneave (1959) is still a good introduction). Instead, I hope to provide a non-mathematical introduction to the basic concepts and, using examples from the literature, to show the kinds of questions information theory can be used to address. If, after reading this and the following chapters, the reader decides that the methods are inappropriate, he will have saved time. If, on the other hand, the methods seem potentially useful, it is hoped that this chapter provides a simple overview that will alleviate the growing pains.
What Is Information Theory?
Information theory was invented by Claude Shannon and introduced in his classic book The Mathematical Theory of Communication (Shannon and Weaver, 1949). What then is information theory? To quote three previous authors in historical order:
The “amount of information” is exactly the same concept that we talked about for years under the name “variance”. [Miller, 1956]
The technical meaning of “information” is not radically different from the everyday meaning; it is merely more precise.
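Miller's equation of information with variance can be made concrete: for a Gaussian signal, Shannon's differential entropy is a monotonic function of the variance, H = ½ log₂(2πeσ²) bits, so more variance literally means more information. A minimal sketch (the helper name and the specific variances are illustrative, not from the chapter):

```python
import math

def gaussian_entropy_bits(variance):
    """Differential entropy of a Gaussian source, H = 0.5 * log2(2*pi*e*sigma^2)."""
    return 0.5 * math.log2(2 * math.pi * math.e * variance)

# Entropy grows monotonically with variance; doubling the variance
# adds exactly half a bit, since 0.5 * log2(2) = 0.5.
h1 = gaussian_entropy_bits(1.0)
h2 = gaussian_entropy_bits(2.0)
print(h2 - h1)  # approximately 0.5 bits
```

This is only meant to illustrate the correspondence Miller points to; the chapter itself deliberately avoids such formulae.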
Information Theory and the Brain, pp. 1–20. Publisher: Cambridge University Press. Print publication year: 2000.