Shrinkage (statistics)


In statistics, shrinkage has two meanings:

  • In relation to the general observation that, in regression analysis, a fitted relationship appears to perform less well on a new data set than on the data set used for fitting.[1] In particular, the value of the coefficient of determination 'shrinks'. This idea is complementary to overfitting and, separately, to the standard adjustment made to the coefficient of determination to compensate for the optimism of in-sample fitting, in particular for the possibility that additional explanatory terms improve the model purely by chance: that is, the adjustment formula itself provides "shrinkage" (its form is given after this list). But the adjustment formula yields an artificial shrinkage, in contrast to the genuine shrinkage seen when the fitted model is applied to new data.
  • To describe general types of estimators, or the effects of some types of estimation, whereby a naive or raw estimate is improved by combining it with other information (see shrinkage estimator). The term reflects the fact that the improved estimate lies closer to the value supplied by the 'other information' than the raw estimate does. In this sense, shrinkage is used to regularize ill-posed inference problems (a minimal worked sketch follows this list).
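
The adjustment referred to in the first sense is the familiar adjusted coefficient of determination. For a sample of size n and a regression model with p explanatory variables it can be written as

  \bar{R}^2 = 1 - (1 - R^2)\,\frac{n - 1}{n - p - 1}

Since the factor (n - 1)/(n - p - 1) exceeds 1 whenever p >= 1, the adjusted value is smaller than R^2 (provided R^2 < 1), which is the sense in which the adjustment formula itself "shrinks" the raw statistic.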
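
As an illustration of the second sense, the following is a minimal sketch (in Python with NumPy) of one well-known shrinkage estimator, the positive-part James-Stein estimator, which shrinks a vector of raw sample means toward the origin. The sketch assumes independent normal observations with known unit variance; the function and variable names are purely illustrative.

  import numpy as np

  def james_stein(x):
      """Shrink the raw estimate x (p >= 3 independent N(theta_i, 1)
      observations) toward the origin, the 'other information' here being
      the prior guess that the true means are near zero."""
      p = len(x)
      # Positive-part shrinkage factor: never shrink past the origin.
      shrink = max(0.0, 1.0 - (p - 2) / np.sum(x ** 2))
      return shrink * x

  rng = np.random.default_rng(0)
  theta = np.zeros(10)                   # true means (unknown in practice)
  x = theta + rng.standard_normal(10)    # raw estimates: one observation per mean

  print("raw estimate:   ", np.round(x, 2))
  print("shrunk estimate:", np.round(james_stein(x), 2))

Under these assumptions every component of the raw estimate is pulled toward zero, more strongly when the observed vector is short, and for p >= 3 the shrunk estimate has uniformly lower expected total squared error than the raw estimate.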

A common idea underlying both of these meanings is the reduction in the effects of sampling variation.


References

  1. Everitt, B. S. (2002). The Cambridge Dictionary of Statistics (2nd ed.). Cambridge University Press. ISBN 0-521-81099-X.

