Book contents
- Frontmatter
- Contents
- List of illustrations
- Preface
- 1 Introduction
- 2 Problems, algorithms, and solutions
- 3 Transformation of problems
- Part I Linear simultaneous equations
- Part II Non-linear simultaneous equations
- Part III Unconstrained optimization
- 9 Case studies
- 10 Algorithms
- 11 Solution of the case studies
- Part IV Equality-constrained optimization
- Part V Inequality-constrained optimization
- References
- Index
11 - Solution of the case studies
Published online by Cambridge University Press: 03 December 2009
Summary
In this chapter, we apply algorithms from Chapter 10 to the two case studies from Chapter 9. We consider the multi-variate linear regression case study in Section 11.1 and the power system state estimation case study in Section 11.2. Both case studies will be transformed into least-squares problems. Unconstrained optimization algorithms that exploit the special characteristics of least-squares problems are described in [45, section 4.7] and [84, chapter 13]; however, we first apply our basic unconstrained optimization algorithm to these problems, because in later chapters we will need to solve more general unconstrained problems.
In practice, a special-purpose algorithm for least-squares problems can be expected to outperform a general-purpose algorithm for unconstrained problems on this class of problems. That is, as discussed previously in Section 2.3.2, we should in practice always seek the most specifically applicable algorithm for a problem [84, section 13.1]. We will consider such specific algorithms for least-squares problems using further transformations.
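The contrast between a general-purpose unconstrained algorithm and a special-purpose least-squares routine can be sketched as follows. This is a minimal illustration, not the book's algorithm: steepest descent with a fixed step size stands in for a basic unconstrained method, and NumPy's `lstsq` stands in for a dedicated least-squares solver; the problem data are randomly generated for the example.

```python
import numpy as np

# Hypothetical small least-squares problem: minimize f(x) = 0.5 * ||Ax - b||^2.
# (The data here are random placeholders, not the case-study data.)
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 3))
b = rng.standard_normal(20)

def steepest_descent(A, b, steps=5000, alpha=1e-2):
    """General-purpose approach: steepest descent with a fixed step size."""
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        grad = A.T @ (A @ x - b)  # gradient of 0.5 * ||Ax - b||^2
        x -= alpha * grad
    return x

x_general = steepest_descent(A, b)

# Special-purpose approach: a dedicated least-squares routine,
# which exploits the structure of the problem directly.
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)

print(x_general)
print(x_lstsq)
```

Both approaches reach the same minimizer on this well-conditioned example, but the dedicated routine does so in a single factorization rather than thousands of iterations, which is the performance gap the text alludes to.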
Multi-variate linear regression
In Section 11.1.1 we transform the objective of Problem (9.7), and in Section 11.1.2 we compare the transformed and original problems. In Sections 11.1.3 and 11.1.4 we calculate the derivatives of the transformed objective and present the optimality conditions. In Section 11.1.5, we transform the problem further to avoid numerical ill-conditioning. Then, in Section 11.1.6, we relate the optimality conditions to linear regression.
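For a least-squares objective, the first-order optimality conditions reduce to a linear system, the normal equations. The sketch below, using placeholder data rather than the data of Problem (9.7), verifies that the gradient vanishes at the solution of the normal equations, and also solves via a QR factorization, which avoids forming \(A^{\top}A\) and so sidesteps the squaring of the condition number that motivates the further transformation in Section 11.1.5.

```python
import numpy as np

# Hypothetical regression data (Problem (9.7)'s actual data are not shown here).
rng = np.random.default_rng(1)
A = rng.standard_normal((30, 4))  # coefficient (design) matrix
b = rng.standard_normal(30)       # observations

# First-order optimality for f(x) = 0.5 * ||Ax - b||^2:
# grad f(x*) = A^T (A x* - b) = 0, i.e. the normal equations A^T A x* = A^T b.
x_star = np.linalg.solve(A.T @ A, A.T @ b)

# The gradient vanishes at x*: the residual A x* - b is orthogonal
# to the columns of A.
gradient = A.T @ (A @ x_star - b)

# Better-conditioned alternative: QR factorization of A, so that
# R x = Q^T b is solved without ever forming A^T A.
Q, R = np.linalg.qr(A)
x_qr = np.linalg.solve(R, Q.T @ b)

print(gradient)
```

On well-conditioned data the two solutions coincide; the QR route matters when the columns of `A` are nearly dependent, which is the ill-conditioning issue the section addresses.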
- Type: Chapter
- Information: Applied Optimization: Formulation and Algorithms for Engineering Systems, pp. 425-444
- Publisher: Cambridge University Press
- Print publication year: 2006