Book contents
- Frontmatter
- Contents
- Acknowledgements
- Conventions on notation
- Tour d'Horizon
- Part I Distributional networks
- Part II Artificial neural networks
- 10 Models and learning
- 11 Some particular nets
- 12 Oscillatory operation
- Part III Processing networks
- Part IV Communication networks
- Appendix 1 Spatial integrals for the telephone problem
- Appendix 2 Bandit and tax processes
- Appendix 3 Random graphs and polymer models
- References
- Index
12 - Oscillatory operation
Published online by Cambridge University Press: 23 November 2009
Summary
In Sections 11.1 to 11.3 we developed the idea of an associative memory as a device that is intrinsically dynamic, although this aspect was not emphasised in the later sections. If one is to keep the biological archetype in mind then one should also formulate a more realistic dynamic model of the neuron. In a sequence of publications W. Freeman (see the reference list) has developed a model which, while simple, captures the biological essentials with remarkable fidelity. It represents the pulse input to the neuron as driving a linear second-order differential equation, whose output constitutes the ‘membrane potential’. The law by which this potential discharges and stimulates a pulse output implies, ultimately, a nonlinear activation function of the sigmoid form.
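To fix ideas, here is a minimal sketch of the generic form just described; the rate constants $a, b > 0$ and the particular logistic sigmoid are illustrative assumptions rather than the specific choices made in the chapter. The membrane potential $v(t)$ responds to the aggregated pulse input $u(t)$ through

$$
\ddot v(t) + (a+b)\,\dot v(t) + ab\,v(t) = ab\,u(t),
$$

and the pulse output is a saturating function of the potential,

$$
q(t) = f\bigl(v(t)\bigr), \qquad f(v) = \frac{q_{\max}}{1 + e^{-\beta(v-\theta)}},
$$

with $q_{\max}$ a maximal pulse rate, $\beta$ a gain and $\theta$ a threshold.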
This dynamic view is related to the second point made at the close of Chapter 11: that it is unrealistic to assume that the absolute level of activation or pulse rate can convey any real information in a biological context. These variables are too fuzzy, even when aggregated, and a variable scaling is being applied all the time to keep activity at an optimal level. Neural signals are typically oscillatory, in an irregular kind of way, and it is the pattern of this oscillation which conveys information – by binding the relevant neural units into a joint dynamic state. This is a line of thought which Freeman in particular has developed, and which is combined with the ideas of Chapter 11 in Whittle (1998).
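As a toy illustration of such a joint dynamic state (this sketch is not from the book; the rate constants, coupling gains and the tanh nonlinearity are assumptions made purely for illustration), one can couple two second-order units of the above form into a negative-feedback loop and watch a brief input pulse set the pair into sustained oscillation:

```python
# Toy sketch (not from the book): two second-order units in a negative-feedback
# loop with sigmoid outputs.  Rate constants, gains and the tanh nonlinearity
# are illustrative assumptions.
import numpy as np

a, b = 220.0, 720.0          # illustrative membrane rate constants (1/s)
k_ei, k_ie = 4.0, 4.0        # illustrative coupling gains (E -> I and I -> E)
dt, T = 1e-4, 0.5            # Euler time step and total duration (s)

def sigmoid(v, q_max=5.0):
    """A saturating pulse-output nonlinearity of the general sigmoid kind."""
    return q_max * np.tanh(v / q_max)

def step(state, drive):
    """One Euler step of  v'' + (a+b) v' + a b v = a b * drive."""
    v, dv = state
    ddv = a * b * (drive - v) - (a + b) * dv
    return np.array([v + dt * dv, dv + dt * ddv])

e = np.zeros(2)              # excitatory unit: (potential, derivative)
i = np.zeros(2)              # inhibitory unit: (potential, derivative)
trace = []
for n in range(int(T / dt)):
    pulse = 1.0 if n * dt < 0.01 else 0.0          # brief external input pulse
    e = step(e, pulse - k_ie * sigmoid(i[0]))      # E is inhibited by I
    i = step(i, k_ei * sigmoid(e[0]))              # I is excited by E
    trace.append(e[0])

# The excitatory potential keeps swinging long after the input pulse has ended.
print("late-time swing of E potential:",
      min(trace[-1000:]), "to", max(trace[-1000:]))
```

With these gains the linearised loop is unstable about the resting state, and the saturation of the sigmoid bounds the response, so the pair settles into a sustained oscillation; with weaker gains the same circuit merely rings and decays.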
These are important network aspects which appear nowhere else in the text, and whose consequences we shall review in this chapter.
- Type: Chapter
- Information: Networks: Optimisation and Evolution, pp. 158–166
- Publisher: Cambridge University Press
- Print publication year: 2007