Book contents
- Frontmatter
- Contents
- Contributors
- Preface
- 1 The Modern Mathematics of Deep Learning
- 2 Generalization in Deep Learning
- 3 Expressivity of Deep Neural Networks
- 4 Optimization Landscape of Neural Networks
- 5 Explaining the Decisions of Convolutional and Recurrent Neural Networks
- 6 Stochastic Feedforward Neural Networks: Universal Approximation
- 7 Deep Learning as Sparsity-Enforcing Algorithms
- 8 The Scattering Transform
- 9 Deep Generative Models and Inverse Problems
- 10 Dynamical Systems and Optimal Control Approach to Deep Learning
- 11 Bridging Many-Body Quantum Physics and Deep Learning via Tensor Networks
3 - Expressivity of Deep Neural Networks
Published online by Cambridge University Press: 29 November 2022
Summary
In this chapter, we give a comprehensive overview of the large variety of approximation results for neural networks. We discuss approximation rates for classical function spaces, as well as the benefits of deep neural networks over shallow ones for specially structured function classes. While the main body of existing results concerns general feedforward architectures, we also review approximation results for convolutional, residual, and recurrent neural networks.
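As a concrete illustration of the depth-over-shallow benefits the summary alludes to, the classical sawtooth construction shows how composition buys expressivity: a ReLU "hat" (tent) map composed with itself d times produces a sawtooth with 2^d linear pieces using only O(d) units, whereas a shallow ReLU network needs a number of units proportional to the number of pieces. The sketch below (function names are ours, not from the chapter) demonstrates this in NumPy:

```python
import numpy as np

def relu(t):
    return np.maximum(t, 0.0)

def hat(x):
    # Tent map on [0, 1], exactly representable with two ReLU units:
    # h(x) = 2*relu(x) - 4*relu(x - 0.5)
    # rises linearly from 0 to 1 on [0, 0.5], falls back to 0 on [0.5, 1].
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5)

def sawtooth(x, depth):
    # Composing the tent map `depth` times yields a sawtooth with
    # 2**depth linear pieces on [0, 1], built from only 2*depth ReLU
    # units -- a depth-2 network would need exponentially many units
    # to represent the same function exactly.
    y = x
    for _ in range(depth):
        y = hat(y)
    return y

# Dyadic points are computed exactly in floating point:
# sawtooth(0.25, 1) -> 0.5, sawtooth(0.25, 2) -> 1.0
```

The key design point is that expressivity here comes from *composition*, not width: each extra layer doubles the number of oscillations while adding only a constant number of units.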
- Type: Chapter
- Information: Mathematical Aspects of Deep Learning, pp. 149-199
- Publisher: Cambridge University Press
- Print publication year: 2022