Book contents
- Frontmatter
- Contents
- Preface
- 1 Introduction
- 2 Foundations of Smooth Optimization
- 3 Descent Methods
- 4 Gradient Methods Using Momentum
- 5 Stochastic Gradient
- 6 Coordinate Descent
- 7 First-Order Methods for Constrained Optimization
- 8 Nonsmooth Functions and Subgradients
- 9 Nonsmooth Optimization Methods
- 10 Duality and Algorithms
- 11 Differentiation and Adjoints
- Appendix
- Bibliography
- Index
9 - Nonsmooth Optimization Methods
Published online by Cambridge University Press: 31 March 2022
Summary
Here, we describe algorithms for minimizing nonsmooth functions and composite nonsmooth functions, which are the sum of a smooth function and a (usually elementary) nonsmooth function. We start with the subgradient descent method, whose search direction is the minimum-norm element of the subdifferential. We then discuss the subgradient method, which steps along an arbitrary direction drawn from the subdifferential. Next, we describe proximal-gradient algorithms for nonsmooth composite optimization, which make use of the gradient of the smooth part of the function and the proximal operator associated with the nonsmooth part. Finally, we describe the proximal point method, an optimization framework that is valuable both as a fundamental method in its own right and as a building block for the augmented Lagrangian approach described in the next chapter.
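To illustrate the proximal-gradient idea mentioned in the summary, the sketch below applies the iteration x_{k+1} = prox_{αh}(x_k − α∇g(x_k)) to a composite objective g(x) + h(x). It is a minimal illustration, not code from the book: it assumes the lasso setting g(x) = ½‖Ax − b‖² and h(x) = λ‖x‖₁, whose proximal operator is soft-thresholding, and the function names and problem data are invented for the example.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_gradient(grad_g, prox_h, x0, step, n_iters=500):
    """Proximal-gradient iteration for min_x g(x) + h(x):
    x_{k+1} = prox_{step*h}(x_k - step * grad_g(x_k))."""
    x = x0
    for _ in range(n_iters):
        x = prox_h(x - step * grad_g(x), step)
    return x

# Hypothetical example: lasso with g(x) = 0.5*||Ax - b||^2, h(x) = lam*||x||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20))
b = rng.standard_normal(40)
lam = 0.1

grad_g = lambda x: A.T @ (A @ x - b)
prox_h = lambda v, t: soft_threshold(v, t * lam)
step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, with L the Lipschitz constant of grad_g

x_hat = proximal_gradient(grad_g, prox_h, np.zeros(20), step)
```

With a step length no larger than 1/L, where L is the Lipschitz constant of the smooth part's gradient, this iteration decreases the composite objective at each step; accelerated and line-search variants follow the same template.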
- Type: Chapter
- Information: Optimization for Data Analysis, pp. 153–169
- Publisher: Cambridge University Press
- Print publication year: 2022