Forecast skill


Forecast skill (also skill score[1] or prediction skill) is a generic term for the accuracy and/or degree of association of a prediction with an observation or estimate of the actual value of the predictand (i.e., what is being predicted). The term can be used both quantitatively and qualitatively. In the former case, skill equals a statistic describing forecast performance, such as the correlation of the forecast with observations. In the latter case, it refers either to forecast performance according to a single metric or to overall forecast performance based on multiple metrics. Forecast skill for single-value forecasts is commonly represented by metrics such as correlation, root mean squared error, mean absolute error, relative mean absolute error, bias, and the Brier score, among others. Probabilistic forecast skill may be measured with metrics such as the Ranked Probability Skill Score (RPSS) or the Continuous RPSS (CRPSS), among others. Categorical skill metrics such as the False Alarm Ratio (FAR), the Probability of Detection (POD), the Critical Success Index (CSI), and the Equitable Threat Score (ETS) are also relevant for some forecasting applications. Skill is often, but not exclusively, expressed relative to the performance of a reference or benchmark prediction; this formulation is called a 'skill score'.
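As a minimal sketch of the categorical metrics named above (the counts are illustrative, not from the article), POD, FAR, and CSI can all be computed from the cells of a 2×2 contingency table of forecast events versus observed events:

```python
# Illustrative sketch: standard definitions of three categorical skill
# metrics, computed from hits, misses, and false alarms. The sample
# counts below are hypothetical.
def categorical_scores(hits, misses, false_alarms):
    pod = hits / (hits + misses)                # Probability of Detection
    far = false_alarms / (hits + false_alarms)  # False Alarm Ratio
    csi = hits / (hits + misses + false_alarms) # Critical Success Index
    return pod, far, csi

pod, far, csi = categorical_scores(hits=80, misses=20, false_alarms=40)
# pod = 0.8, far ≈ 0.333, csi ≈ 0.571
```

A perfect set of forecasts gives POD = 1, FAR = 0, and CSI = 1; CSI penalizes both missed events and false alarms in a single number.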

Forecast skill metric and score calculations should be made over a large enough sample of forecast-observation pairs to be statistically robust. A sample of predictions for a single predictand (e.g., temperature at one location, or a single stock value) typically includes forecasts made on a number of different dates. A sample could also pool forecast-observation pairs across space, for a prediction made on a single date, as in the forecast of a weather event that is verified at many locations.

An example of a skill calculation which uses the error metric 'Mean Squared Error (MSE)' and the associated skill score is given in the table below. In this case, a perfect forecast results in a forecast skill metric of zero, and a skill score value of 1.0. A forecast with skill equal to that of the reference forecast has a skill score of 0.0, and a forecast less skillful than the reference forecast has a negative skill score, unbounded below.[2][3]

Skill metric: mean squared error (MSE): \( \mathit{MSE} = \frac{1}{N}\sum_{t=1}^{N} E_t^2 \)
The associated skill score (SS): \( \mathit{SS} = 1 - \frac{\mathit{MSE}_\text{forecast}}{\mathit{MSE}_\text{ref}} \)
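The two formulas above can be sketched directly in code. In this hypothetical example (the sample values are illustrative, not from the article), the reference forecast is the climatological mean of the observations, a common benchmark choice:

```python
# Illustrative sketch: MSE and the associated skill score, with a
# climatological-mean forecast as the reference. Sample values are hypothetical.
def mse(forecasts, observations):
    """Mean squared error over N forecast-observation pairs."""
    errors = [f - o for f, o in zip(forecasts, observations)]
    return sum(e * e for e in errors) / len(errors)

def skill_score(forecasts, reference, observations):
    """SS = 1 - MSE_forecast / MSE_ref; 1.0 is perfect, 0.0 matches the reference."""
    return 1.0 - mse(forecasts, observations) / mse(reference, observations)

observations = [10.0, 12.0, 14.0, 16.0]
forecasts = [11.0, 12.0, 13.0, 16.0]
# Reference: forecast the climatological mean (13.0) every time.
climatology = [sum(observations) / len(observations)] * len(observations)
ss = skill_score(forecasts, climatology, observations)  # 0.9
```

Here the forecast's MSE is 0.5 against the reference's 5.0, giving a skill score of 0.9; a forecast worse than climatology would produce a negative value.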

A broad range of forecast verification metrics can be found in published and online resources, such as the Australian Bureau of Meteorology's longstanding verification web pages (http://www.cawcr.gov.au/projects/verification/). A popular textbook and reference that discusses forecast skill is Statistical Methods in the Atmospheric Sciences.[4]

References

  1. Glossary of Meteorology, American Meteorological Society
  2. Lua error in package.lua at line 80: module 'strict' not found.
  3. Lua error in package.lua at line 80: module 'strict' not found.
  4. Wilks, D. S., Statistical Methods in the Atmospheric Sciences, Academic Press.