Explaining Neural Transitions through Resource Constraints
Published online by Cambridge University Press: 24 May 2022
Abstract
One challenge in explaining neural evolution is the formal equivalence of different computational architectures. If a simple architecture suffices, why should more complex neural architectures evolve? The answer must involve the intense competition for resources under which brains operate. I show how recurrent neural networks can be favored when increased complexity allows for more efficient use of existing resources. Although resource constraints alone can drive a change, recurrence shifts the landscape of what is later evolvable. Hence organisms on either side of a transition boundary may have similar cognitive capacities but very different potential for evolving new capacities.
Type: Symposia Paper
Copyright: © The Author(s), 2022. Published by Cambridge University Press on behalf of the Philosophy of Science Association