Book contents
- Frontmatter
- Contents
- List of contributors
- Preface
- Neurons and neural networks: general principles
- Synaptic plasticity, topological and temporal features, and higher cortical processing
- Spin glass models and cellular automata
- 13 Neural networks: learning and forgetting
- 14 Learning by error corrections in spin glass models of neural networks
- 15 Random complex automata: analogy with spin glasses
- 16 The evolution of data processing abilities in competing automata
- 17 The inverse problem for neural nets and cellular automata
- Cyclic phenomena and chaos in neural networks
- The cerebellum and the hippocampus
- Olfaction, vision and cognition
- Applications to experiment, communication and control
- Author index
- Subject index
13 - Neural networks: learning and forgetting
from Spin glass models and cellular automata
Published online by Cambridge University Press: 05 February 2012
Summary
Introduction
Networks of formal neurons provide simple models for distributed, content-addressable, fault-tolerant memory. Many numerical and analytical results have been obtained recently, especially on the Hopfield model (Hopfield, 1982). In this model, a large network of fully connected neurons with symmetric interactions has a memory capacity that grows linearly with the size of the network, that is, with the connectivity. When the total number of stored patterns exceeds this capacity, a catastrophic deterioration occurs and total confusion sets in. Alternative schemes have been proposed (Hopfield, 1982; Nadal et al., 1986; Parisi, 1986) that avoid overloading: new patterns can always be learned, at the expense of older ones, which get erased; for this reason these schemes have been called palimpsest schemes. Numerical and analytical results (Nadal et al., 1986; Mézard et al., 1986) detail the behavior of these models, which show striking analogies with the behavior of human short-term memory. We will review the main results and point out their possible relevance for human working memory.
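To make the contrast concrete, the following short simulation is a minimal sketch, not code from the chapter: it assumes ±1 random patterns, zero-temperature asynchronous dynamics, and illustrative values for the synaptic increment and bound. It compares the standard Hebbian (Hopfield) rule, which deteriorates catastrophically once overloaded, with a bounded-synapse rule in the spirit of the palimpsest schemes cited above (e.g. Parisi, 1986), in which only the most recently stored patterns remain retrievable.

```python
# Minimal sketch (not code from the chapter): Hopfield rule vs. a bounded-synapse
# palimpsest rule, with +/-1 patterns and zero-temperature asynchronous dynamics.
import numpy as np

rng = np.random.default_rng(0)


def hebb_increment(J, xi, N):
    """Standard Hopfield (Hebbian) rule: J_ij += xi_i * xi_j / N, no self-coupling."""
    J += np.outer(xi, xi) / N
    np.fill_diagonal(J, 0.0)


def palimpsest_increment(J, xi, eps=0.3, bound=1.0):
    """Bounded-synapse ('palimpsest') variant: the same Hebbian increment, but each
    weight is clipped to [-bound, bound], so new patterns gradually overwrite old
    ones instead of causing an overload catastrophe. eps and bound are illustrative
    choices, not values taken from the text."""
    J += eps * np.outer(xi, xi)
    np.clip(J, -bound, bound, out=J)
    np.fill_diagonal(J, 0.0)


def final_overlap(J, xi, sweeps=20):
    """Zero-temperature asynchronous updates started at the stored pattern itself;
    returns the final overlap m = (1/N) sum_i xi_i S_i (m near 1 means retrieved)."""
    S = xi.copy()
    N = len(S)
    for _ in range(sweeps):
        for i in rng.permutation(N):
            S[i] = 1 if J[i] @ S >= 0 else -1
    return (xi @ S) / N


N, P = 200, 60                                # P well above the ~0.14*N Hopfield capacity
patterns = rng.choice([-1, 1], size=(P, N))

J_hebb = np.zeros((N, N))
J_pal = np.zeros((N, N))
for xi in patterns:
    hebb_increment(J_hebb, xi, N)
    palimpsest_increment(J_pal, xi)

# Expected qualitative behaviour: the overloaded Hebbian network retrieves nothing well
# (catastrophic deterioration), while the palimpsest network still retrieves the most
# recently stored patterns and has forgotten the earliest ones.
print("Hebb, last pattern:       ", final_overlap(J_hebb, patterns[-1]))
print("Palimpsest, last pattern: ", final_overlap(J_pal, patterns[-1]))
print("Palimpsest, first pattern:", final_overlap(J_pal, patterns[0]))
```

With the bound in place, retention is limited to a sliding window of recently stored patterns, which is the short-term memory behavior taken up in Sections 13.3 and 13.4.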
In Section 13.2 the origin of the catastrophic deterioration in the standard scheme (Hopfield model) is explained in simple terms. In Section 13.3 simple modifications are shown to lead to short-term memory effects. In Section 13.4 properties of these networks are presented in close analogy with known data from experimental psychology.
- Type: Chapter
- Information: Computer Simulation in Brain Science, pp. 221-231
- Publisher: Cambridge University Press
- Print publication year: 1988