Sensitivity, calibration, validation, verification

Publisher: Elsevier
Publication Type: Chapter
Citation: Encyclopedia of Ecology, 2018, pp. 172-177
Issue Date: 2018-01-01
© 2019 Elsevier B.V. Formulating the model equations and getting them to run is only the beginning of the modeling process. First, the model output needs to match observed data as closely as possible, which is achieved in part by adjusting the model parameters; this is the process of model calibration. We then need to check that the model really does what it was designed to do. This testing may involve various procedures and stages, some of which are called validation and verification. For example, we may want to double-check that the model is based on correct assumptions, that the code has no bugs, and that the output is properly presented and interpreted; this is the model verification stage. Or we may want to run the model on an independent set of input data and see how it performs; in some cases this is called validation. There is still some confusion about terminology, and the words validation and verification are sometimes used interchangeably. In any case, these are extremely important stages of model analysis, required to demonstrate the quality of the model. However, no formal method of model analysis should be overrated in determining the model's usability: in the end, a model is good as long as it helps achieve the goals of the project, and overall model performance matters more than how well the model did on individual tests and comparisons.
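As a minimal illustration of these two steps (not taken from the chapter itself), the sketch below calibrates a toy logistic growth model by tuning its parameters to fit observed data, and then validates the fitted model against an independent data set. The model choice, parameter names (r, K), and all numbers are hypothetical.

```python
# Illustrative sketch: calibration and validation of a toy logistic growth model.
# All data and parameter values are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def logistic_growth(t, r, K):
    """Logistic growth: biomass at time t, given growth rate r and carrying capacity K."""
    x0 = 1.0  # assumed initial biomass
    return K / (1 + (K / x0 - 1) * np.exp(-r * t))

# Calibration data (hypothetical observations)
t_cal = np.array([0, 1, 2, 3, 4, 5], dtype=float)
y_cal = np.array([1.0, 2.6, 6.1, 12.8, 20.3, 26.9])

# Calibration: adjust the parameters (r, K) so the model output matches the data
(r_hat, K_hat), _ = curve_fit(logistic_growth, t_cal, y_cal, p0=[1.0, 30.0])

# Validation: run the calibrated model on an independent data set and
# measure how well it performs (here, root-mean-square error).
t_val = np.array([6, 7, 8], dtype=float)
y_val = np.array([29.5, 30.4, 30.8])
rmse = np.sqrt(np.mean((logistic_growth(t_val, r_hat, K_hat) - y_val) ** 2))
print(f"calibrated r={r_hat:.2f}, K={K_hat:.2f}, validation RMSE={rmse:.2f}")
```

The key design point is that the validation data are never used during calibration, so the reported error reflects how the model performs on input it has not been tuned to.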