Book contents
- Frontmatter
- Contents
- List of Contributors
- Preface
- 1 Introductory Information Theory and the Brain
- Part One Biological Networks
- Part Two Information Theory and Artificial Networks
- 5 Experiments with Low-Entropy Neural Networks
- 6 The Emergence of Dominance Stripes and Orientation Maps in a Network of Firing Neurons
- 7 Dynamic Changes in Receptive Fields Induced by Cortical Reorganization
- 8 Time to Learn About Objects
- 9 Principles of Cortical Processing Applied to and Motivated by Artificial Object Recognition
- 10 Performance Measurement Based on Usable Information
- Part Three Information Theory and Psychology
- Part Four Formal Analysis
- Bibliography
- Index
10 - Performance Measurement Based on Usable Information
from Part Two - Information Theory and Artificial Networks
Published online by Cambridge University Press: 04 May 2010
Summary
Qualitative measures show that an existing artificial neural network can perform invariant object recognition. Quantifying the performance of individual cells within this network, however, proves problematic.
In line with contemporary neurophysiological analyses (e.g. Optican and Richmond, 1987; Tovee et al., 1993), a simplistic form of Shannon's information theory was applied to this performance-measurement task. The results, however, are shown not to be useful: the perfect reliability of artificial cell responses exposes the unlimited decoding power implicitly assumed by pure Shannon information theory.
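This invariance is easy to demonstrate. The following is a minimal sketch, not taken from the chapter: the toy stimuli and the plug-in `mutual_information` estimator are assumptions for illustration. A perfectly reliable cell transmits the full stimulus entropy under any injective recoding of its responses, so the raw Shannon measure cannot distinguish an easily decoded cell from an arbitrarily scrambled one.

```python
import numpy as np
from collections import Counter

def mutual_information(stimuli, responses):
    """Plug-in estimate of the discrete mutual information I(S; R) in bits."""
    n = len(stimuli)
    p_s = Counter(stimuli)                   # stimulus counts
    p_r = Counter(responses)                 # response counts
    p_sr = Counter(zip(stimuli, responses))  # joint counts
    return sum((c / n) * np.log2(c * n / (p_s[s] * p_r[r]))
               for (s, r), c in p_sr.items())

# Eight equiprobable stimuli, so H(S) = 3 bits.
stimuli = list(range(8)) * 100

# A "transparent" cell: the response is the stimulus label itself.
transparent = stimuli

# A perfectly reliable but scrambled cell: an arbitrary relabelling of the
# stimuli. Shannon information is invariant under any injective recoding,
# however hard the resulting code is to decode in practice.
perm = np.random.default_rng(0).permutation(8)
scrambled = [int(perm[s]) for s in stimuli]

print(mutual_information(stimuli, transparent))  # 3.0 bits
print(mutual_information(stimuli, scrambled))    # 3.0 bits -- identical
```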
Refining the definition of cell performance in terms of usable Shannon information (Shannon information available in a "useful" form) leads to two novel performance measures. First, a cell's "information trajectory" quantifies standard information-theoretic performance across a range of decoders of increasing complexity: information made available by simple decoding is weighted more strongly than information available only through more complex decoding. Second, the nature of the application (the task the network attempts to solve) is used to design a decoder of appropriate complexity, leading to an exceptionally simple and reliable information-theoretic measure. Comparison of the various measures in the original problem domain shows the superiority of the second novel measure.
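The decoder family used in the chapter is not reproduced here, but the idea behind the information trajectory can be sketched under assumptions. The helpers below (`info_from_confusion`, `nearest_mean`, `one_nearest_neighbour`, `information_trajectory`) are hypothetical stand-ins: two decoders of increasing complexity are applied to the same responses, and the information each recovers is read off its confusion matrix.

```python
import numpy as np

def info_from_confusion(conf):
    """Information (bits) between true and decoded labels, taking the
    confusion matrix as an estimate of their joint distribution."""
    p = conf / conf.sum()
    marg = p.sum(axis=1, keepdims=True) * p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log2(p[nz] / marg[nz])).sum())

def nearest_mean(train_x, train_y, test_x):
    """Simple decoder: assign each response to the nearest class mean."""
    classes = np.unique(train_y)
    means = np.stack([train_x[train_y == c].mean(axis=0) for c in classes])
    d = ((test_x[:, None, :] - means[None, :, :]) ** 2).sum(axis=-1)
    return classes[d.argmin(axis=1)]

def one_nearest_neighbour(train_x, train_y, test_x):
    """More complex decoder: look up every stored training response."""
    d = ((test_x[:, None, :] - train_x[None, :, :]) ** 2).sum(axis=-1)
    return train_y[d.argmin(axis=1)]

def information_trajectory(train, test, decoders):
    """Information recovered by each decoder, ordered simple -> complex."""
    train_x, train_y = train
    test_x, test_y = test
    k = int(test_y.max()) + 1
    trajectory = []
    for decode in decoders:
        pred = decode(train_x, train_y, test_x)
        conf = np.zeros((k, k))
        for t, q in zip(test_y, pred):
            conf[int(t), int(q)] += 1
        trajectory.append(info_from_confusion(conf))
    return trajectory

# Toy responses: an XOR-style layout in which the two class means coincide,
# so the simple decoder fails while the complex one succeeds.
rng = np.random.default_rng(1)
centres = np.array([[0.0, 0.0], [4.0, 4.0], [0.0, 4.0], [4.0, 0.0]])
labels = np.array([0, 0, 1, 1])
x = np.concatenate([c + 0.3 * rng.standard_normal((50, 2)) for c in centres])
y = np.repeat(labels, 50)
order = rng.permutation(len(y))
tr, te = order[:100], order[100:]

print(information_trajectory((x[tr], y[tr]), (x[te], y[te]),
                             [nearest_mean, one_nearest_neighbour]))
# e.g. [~0 bits from the simple decoder, ~1 bit from the complex one]
```

Reading information off the decoder's confusion matrix is one standard way to lower-bound the information actually usable at a given level of decoding complexity; in the toy example the simple decoder recovers almost nothing while the complex one recovers nearly the full bit, so the trajectory separates transparent codes from codes that are measurable but hard to use.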
The chapter concludes with the observation that reliable application of Shannon's information theory requires close consideration of the form in which signals may be decoded – in short, not all measurable information may be usable information.
Introduction
This chapter presents an approach to performance measurement using information theory, in the context of a model of invariant object recognition. Each of these terms is considered in turn in the following sections.
- Type: Chapter
- Information Theory and the Brain, pp. 180-200. Publisher: Cambridge University Press. Print publication year: 2000