Book contents
- Frontmatter
- Contents
- Preface
- Acknowledgments
- 1 Introduction
- Part I Fundamental concepts
- Part II Code verification
- Part III Solution verification
- Part IV Model validation and prediction
- Part V Planning, management, and implementation issues
- 14 Planning and prioritization in modeling and simulation
- 15 Maturity assessment of modeling and simulation
- 16 Development and responsibilities for verification, validation and uncertainty quantification
- Appendix Programming practices
- Index
- Plate Section
- References
15 - Maturity assessment of modeling and simulation
from Part V - Planning, management, and implementation issues
Published online by Cambridge University Press: 05 March 2013
Summary
In Chapter 1, Introduction, we briefly discussed how credibility is built in modeling and simulation (M&S). The four elements mentioned in that chapter were: the quality of the analysts conducting the analysis, the quality of the physics modeling, verification and validation activities, and uncertainty quantification and sensitivity analysis. The latter three are technical elements whose completeness, or maturity, can be assessed. Maturity assessment is important to the staff conducting the M&S effort, but it is critically important for project managers and decision makers who use computational results as an element in their decision making. It is also important for internal and external review committees asked to provide recommendations on the credibility and soundness of computational analyses. This chapter reviews methods that have been developed for assessing similar activities and then presents a newly developed technique reported in Oberkampf et al. (2007), from which this chapter is taken in large part.
Survey of maturity assessment procedures
Over the last decade, a number of researchers have investigated how to measure the maturity and credibility of software and hardware development processes and products. Probably the best-known procedure for measuring the maturity of software product development and business processes is the Capability Maturity Model Integration (CMMI). The CMMI is a successor to the Capability Maturity Model (CMM). Development of the CMM was initiated in 1987 to improve software quality. For an extensive discussion of the framework and methods for the CMMI, see West (2004); Ahern et al. (2005); Garcia and Turner (2006); and Chrissis et al. (2007).
- Verification and Validation in Scientific Computing, pp. 696-727. Publisher: Cambridge University Press. Print publication year: 2010.