Book contents
- Frontmatter
- Contents
- Preface
- Dedication
- 1 Introduction
- 2 The Basic Attractor Neural Network
- 3 General Ideas Concerning Dynamics
- 4 Symmetric Neural Networks at Low Memory Loading
- 5 Storage and Retrieval of Temporal Sequences
- 6 Storage Capacity of ANN's
- 7 Robustness - Getting Closer to Biology
- 8 Memory Data Structures
- 9 Learning
- 10 Hardware Implementations of Neural Networks
- Glossary
- Index
6 - Storage Capacity of ANN's
Summary
Motivation and general considerations
Different measures of storage capacity
The properties of ANN's described in the preceding chapters should make them interesting candidates both as models of some brain functions and for technical applications in certain areas of computer development or artificial intelligence. In either case, one of the first questions that comes to mind concerns the storage capacity of such systems, namely the quantity of information that can be stored in and effectively retrieved from the network. It is of primary interest to know, for example, how the number of possible memories, whether single patterns or sequences, varies with the number of elements, neurons and synapses, in the network.
The storage capacity of a network can be quantified in a number of ways. In every case it must be expressed per network element. Here we mention a few such possible measures:
- The number of stored bits per neuron.
- The number of stored bits per synapse.
- The number of stored patterns per neuron.
- The number of stored patterns per synapse.
- The number of stored bits per coded synaptic bit.
Any one of the items in this list must be supplemented by informational qualifications. It should be realized that the usefulness of any of these quantifications depends strongly on the level of correlation between the stored bits.
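To make the relations between these measures concrete, here is a minimal sketch, not taken from the book, of the naive accounting for a fully connected network of N neurons storing p = alpha * N uncorrelated, unbiased random patterns. The loading value alpha = 0.14, of the order of the critical capacity of the Hopfield model, and the function name capacity_measures are illustrative assumptions.

```python
# A minimal sketch (assumed, not from the text) of the naive capacity
# accounting for a fully connected network of N neurons storing
# p = alpha * N uncorrelated random binary patterns.

def capacity_measures(N: int, alpha: float = 0.14) -> dict:
    """Naive bookkeeping: each of the p random patterns carries N bits,
    stored in the N*(N-1) directed synapses of the network."""
    p = alpha * N                   # number of stored patterns
    n_synapses = N * (N - 1)        # fully connected, no self-coupling
    stored_bits = p * N             # N bits per uncorrelated random pattern
    return {
        "patterns": p,
        "bits_per_neuron": stored_bits / N,            # equals p
        "bits_per_synapse": stored_bits / n_synapses,  # tends to alpha
        "patterns_per_neuron": p / N,                  # equals alpha
        "patterns_per_synapse": p / n_synapses,
    }

if __name__ == "__main__":
    for N in (100, 1000, 10000):
        m = capacity_measures(N)
        print(N, {k: round(v, 4) for k, v in m.items()})
```

Note that counting N bits per pattern is valid only for uncorrelated, unbiased patterns; correlations between the stored bits reduce the genuine information content below this naive count, which is precisely the qualification raised above.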
Modeling Brain Function: The World of Attractor Neural Networks, pp. 271-344. Cambridge University Press, 1989.