Guidance Document on the Development, Evaluation and Application of Environmental Models


Experience Required: None
Time Needed for Application: Hours
Data Needs: None
Support available: Yes
Software requirements: None
Cost to purchase: Free


The EPA CREM (Council for Regulatory Environmental Modeling) developed this document to provide a simplified, comprehensive resource for modelers across the Agency on best modeling practices. When adhered to, the guidelines will help ensure the quality, utility, and regulatory relevance of the models that EPA develops and applies, as well as the transparency of modeling analyses and model-based decisions.


Download Guidance Document Here.


Executive Summary

In pursuing its mission to protect human health and to safeguard the natural environment, the U.S. Environmental Protection Agency often relies on environmental models. In this guidance, a model is defined as a “simplification of reality that is constructed to gain insights into select attributes of a particular physical, biological, economic, or social system.” This guidance provides recommendations for the effective development, evaluation, and use of models in environmental decision making once an environmental issue has been identified. These recommendations are drawn from Agency white papers, EPA Science Advisory Board reports, the National Research Council’s Models in Environmental Regulatory Decision Making, and peer-reviewed literature. For organizational simplicity, the recommendations are categorized into three sections: model development, model evaluation, and model application.

Model development can be viewed as a process with three main steps:
(a) specify the environmental problem (or set of issues) the model is intended to address and develop the conceptual model,

(b) evaluate or develop the model framework (develop the mathematical model), and

(c) parameterize the model to develop the application tool.
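
As a minimal sketch of these three steps, consider a hypothetical first-order pollutant decay model. This example does not appear in the guidance; the model form is a textbook plug-flow decay relation, and every parameter value is assumed purely for illustration:

    import math

    # Step (a): conceptual model -- a pollutant discharged to a stream
    # decays as it is carried downstream at a roughly constant velocity.

    # Step (b): model framework -- first-order decay under plug flow gives
    # C(x) = C0 * exp(-k * x / u), where x is distance downstream.
    def concentration(x_m, c0_mg_l, k_per_day, u_m_per_day):
        """Concentration (mg/L) at distance x_m downstream."""
        return c0_mg_l * math.exp(-k_per_day * x_m / u_m_per_day)

    # Step (c): parameterization -- site-specific values (assumed here)
    # turn the general framework into an application tool.
    c0 = 10.0       # discharge concentration, mg/L (assumed)
    k = 0.5         # decay rate, 1/day (assumed)
    u = 10_000.0    # stream velocity, m/day (assumed)

    for x in (0, 5_000, 10_000, 20_000):
        print(f"{x:>6} m: {concentration(x, c0, k, u):.2f} mg/L")
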

Model evaluation is the process for generating information over the life cycle of the project that helps determine whether a model and its analytical results are of sufficient quality to serve as the basis for a decision. Model quality is an attribute that is meaningful only within the context of a specific model application. In simple terms, model evaluation provides information to help answer the following questions:
(a) How have the principles of sound science been addressed during model development?

(b) How is the choice of model supported by the quantity and quality of available data?

(c) How closely does the model approximate the real system of interest?

(d) How well does the model perform the specified task while meeting the objectives set by quality assurance project planning?
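
Question (c) is often addressed quantitatively with goodness-of-fit statistics that compare model output against observations. The sketch below uses hypothetical observed and simulated values; the two metrics shown, root-mean-square error and Nash-Sutcliffe efficiency, are common choices in watershed modeling practice, not prescriptions from the guidance:

    import math

    def rmse(obs, sim):
        """Root-mean-square error between observed and simulated values."""
        return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

    def nash_sutcliffe(obs, sim):
        """Nash-Sutcliffe efficiency: 1 is a perfect fit; values <= 0 mean
        the model predicts no better than the mean of the observations."""
        mean_obs = sum(obs) / len(obs)
        ss_res = sum((o - s) ** 2 for o, s in zip(obs, sim))
        ss_tot = sum((o - mean_obs) ** 2 for o in obs)
        return 1.0 - ss_res / ss_tot

    # Hypothetical paired observations and model predictions (mg/L).
    observed  = [9.8, 7.4, 5.9, 4.6, 3.5]
    simulated = [10.0, 7.8, 6.1, 4.7, 3.7]

    print(f"RMSE = {rmse(observed, simulated):.3f}")
    print(f"NSE  = {nash_sutcliffe(observed, simulated):.3f}")
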

Model application (i.e., model-based decision making) is strengthened when the science underlying the model is transparent. The elements of transparency emphasized in this guidance are

(a) comprehensive documentation of all aspects of a modeling project (suggested as a list of elements relevant to any modeling project) and

(b) effective communication between modelers, analysts, and decision makers. This approach ensures that there is a clear rationale for using a model for a specific regulatory application.

This guidance recommends best practices to help determine when a model, despite its uncertainties, can be appropriately used to inform a decision. Specifically, it recommends that model developers and users:
(a) subject their model to credible, objective peer review;
(b) assess the quality of the data they use;
(c) corroborate their model by evaluating the degree to which it corresponds to the system being modeled; and
(d) perform sensitivity and uncertainty analyses. Sensitivity analysis evaluates the effect of changes in input values or assumptions on a model's results. Uncertainty analysis investigates the effects of lack of knowledge and other potential sources of error in the model (e.g., the “uncertainty” associated with model parameter values).

When conducted in combination, sensitivity and uncertainty analyses give model users a better-informed sense of the confidence that can be placed in model results. A model's suitability for supporting a decision becomes better understood when information is available to assess these factors.
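
A minimal sketch of both analyses, reusing the illustrative decay model from the development example above. The one-at-a-time perturbation size and the uniform distribution on the decay rate are assumptions made purely for demonstration, not methods specified by the guidance:

    import math
    import random
    import statistics

    # Same illustrative first-order decay model; all values are assumed.
    def concentration(c0, k, u, x=10_000.0):
        return c0 * math.exp(-k * x / u)

    base = {"c0": 10.0, "k": 0.5, "u": 10_000.0}

    # Sensitivity analysis: vary one input at a time by +/-10% and
    # observe the effect on the model result at x = 10 km.
    for name in base:
        lo = concentration(**{**base, name: base[name] * 0.9})
        hi = concentration(**{**base, name: base[name] * 1.1})
        print(f"{name}: -10% -> {lo:.2f} mg/L, +10% -> {hi:.2f} mg/L")

    # Uncertainty analysis: treat the decay rate as uncertain and
    # propagate that uncertainty through the model by Monte Carlo sampling.
    random.seed(1)
    results = sorted(
        concentration(base["c0"], random.uniform(0.3, 0.7), base["u"])
        for _ in range(10_000)
    )
    print(f"median = {statistics.median(results):.2f} mg/L; "
          f"90% interval = [{results[500]:.2f}, {results[9500]:.2f}] mg/L")

The sensitivity loop identifies which inputs the result responds to most strongly, while the Monte Carlo step translates uncertainty in a single parameter into an interval around the model result, which is the kind of information the guidance recommends providing to decision makers.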