Book contents
- Frontmatter
- Contents
- Preface
- 1 Introduction
- 2 Foundations of Smooth Optimization
- 3 Descent Methods
- 4 Gradient Methods Using Momentum
- 5 Stochastic Gradient
- 6 Coordinate Descent
- 7 First-Order Methods for Constrained Optimization
- 8 Nonsmooth Functions and Subgradients
- 9 Nonsmooth Optimization Methods
- 10 Duality and Algorithms
- 11 Differentiation and Adjoints
- Appendix
- Bibliography
- Index
8 - Nonsmooth Functions and Subgradients
Published online by Cambridge University Press: 31 March 2022
Summary
Here, we define subgradients and subdifferentials of nonsmooth functions. These generalize the concept of gradients for smooth functions and can be used as the basis of algorithms. We relate subgradients to directional derivatives and to the normal cones associated with convex sets. We introduce composite nonsmooth functions that arise in regularized optimization formulations of data analysis problems, and we describe optimality conditions for minimizers of these functions. Finally, we describe proximal operators and the Moreau envelope: objects associated with nonsmooth functions that form the basis of the algorithms for nonsmooth optimization described in the next chapter.
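To make these definitions concrete: a vector g is a subgradient of a convex function f at x if f(y) ≥ f(x) + gᵀ(y − x) for all y, and the subdifferential ∂f(x) is the set of all such g. The minimal Python sketch below (not code from the book; the helper names soft_threshold and moreau_envelope_abs are illustrative) works through the canonical nonsmooth example f(x) = |x|, whose subdifferential at 0 is the interval [−1, 1], whose proximal operator is the soft-thresholding map, and whose Moreau envelope is the Huber function.

```python
import numpy as np

def subgradient_abs(x):
    # One valid subgradient of f(u) = |u|: sign(x) when x != 0.
    # At x = 0 the subdifferential is the whole interval [-1, 1];
    # np.sign(0) = 0 is one admissible choice from that set.
    return np.sign(x)

def soft_threshold(x, lam):
    # Proximal operator of lam * |.|:
    #   prox(x) = argmin_u  lam*|u| + 0.5*(u - x)**2,
    # whose closed form is soft-thresholding.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def moreau_envelope_abs(x, lam):
    # Moreau envelope of |.| with parameter lam:
    #   M(x) = min_u  |u| + (u - x)**2 / (2*lam),
    # evaluated at its minimizer u = soft_threshold(x, lam).
    u = soft_threshold(x, lam)
    return np.abs(u) + (u - x) ** 2 / (2.0 * lam)

# Check the subgradient inequality f(y) >= f(x) + g*(y - x) at x = 1.5.
x, g = 1.5, subgradient_abs(1.5)
ys = np.linspace(-3.0, 3.0, 13)
assert np.all(np.abs(ys) >= np.abs(x) + g * (ys - x) - 1e-12)

# The Moreau envelope of |.| is the Huber function: with lam = 0.5 it
# equals x**2 for |x| <= 0.5 and |x| - 0.25 for |x| > 0.5.
xs = np.array([-2.0, -0.25, 0.0, 0.25, 2.0])
print(soft_threshold(xs, 0.5))       # -> [-1.5  0.   0.   0.   1.5]
print(moreau_envelope_abs(xs, 0.5))  # -> [1.75  0.0625  0.  0.0625  1.75]
```

Note that both the prox and the envelope of this simple nonsmooth function have closed forms; closed-form (or cheaply computable) proximal operators are what make the prox-based algorithms described in the next chapter practical.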
Type: Chapter
Information: Optimization for Data Analysis, pp. 132–152
Publisher: Cambridge University Press
Print publication year: 2022