Cochrane–Orcutt estimation


Cochrane–Orcutt estimation is a procedure in econometrics that adjusts a linear model for serial correlation in the error term. It is named after statisticians Donald Cochrane and Guy Orcutt.[1]

Theory

Consider the model

y_t = \alpha + X_t \beta + \varepsilon_t,

where y_{t} is the value of the dependent variable of interest at time t, \beta is a column vector of coefficients to be estimated, X_{t} is a row vector of explanatory variables at time t, and \varepsilon_t is the error term at time t.

If the Durbin–Watson statistic indicates that the error term is serially correlated over time, then standard statistical inference as normally applied to regressions is invalid because the standard errors are estimated with bias. To avoid this problem, the residuals must be modeled. If the process generating the residuals is found to be a stationary first-order autoregressive structure,[2] \varepsilon_t = \rho \varepsilon_{t-1} + e_t,\ |\rho| < 1, with the errors e_t being white noise, then the Cochrane–Orcutt procedure can be used to transform the model by taking a quasi-difference:

y_t - \rho y_{t-1} = \alpha(1-\rho) + (X_t - \rho X_{t-1})\beta + e_t.

This follows from subtracting \rho times the lagged equation y_{t-1} = \alpha + X_{t-1}\beta + \varepsilon_{t-1} from the original model, since \varepsilon_t - \rho \varepsilon_{t-1} = e_t. In this specification the error term is white noise, so standard statistical inference is valid. The sum of squared residuals (the sum of the squared estimates of e_t) is then minimized with respect to (\alpha, \beta), conditional on \rho.
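
As a concrete illustration, the following sketch (in Python with NumPy; the simulated data, variable names, and the assumption that \rho is known are illustrative, not from the source) applies the quasi-difference transformation and fits the transformed model by ordinary least squares:

  import numpy as np

  rng = np.random.default_rng(0)

  # Simulate y_t = alpha + x_t*beta + eps_t with AR(1) errors (illustrative values).
  T, alpha_true, beta_true, rho = 200, 2.0, 1.5, 0.7
  x = rng.normal(size=T)
  e = rng.normal(scale=0.5, size=T)          # white-noise innovations e_t
  eps = np.zeros(T)
  for t in range(1, T):
      eps[t] = rho * eps[t - 1] + e[t]
  y = alpha_true + beta_true * x + eps

  # Quasi-difference for t = 2, ..., T (the first observation is dropped),
  # treating rho as known for the moment.
  y_star = y[1:] - rho * y[:-1]
  x_star = x[1:] - rho * x[:-1]

  # OLS on the transformed model y*_t = alpha*(1 - rho) + x*_t*beta + e_t;
  # the first design column is (1 - rho), so its coefficient is alpha itself.
  Z = np.column_stack([(1 - rho) * np.ones(T - 1), x_star])
  coef, *_ = np.linalg.lstsq(Z, y_star, rcond=None)
  alpha_hat, beta_hat = coef
  print(alpha_hat, beta_hat)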

Estimating the autoregressive parameter

If \rho is not known, it is estimated by first fitting the untransformed model by ordinary least squares, obtaining the residuals \hat{\varepsilon}_t, and then regressing \hat{\varepsilon}_t on \hat{\varepsilon}_{t-1}; this yields an estimate of \rho and makes the transformed regression sketched above feasible. (Note that one data point, the first, is lost in this regression.) This procedure of autoregressing the estimated residuals can be done once, with the resulting value of \rho used in the transformed regression, or it can be iterated: the coefficient estimates from the transformed regression are used to compute new residuals of the original model, \rho is re-estimated from those residuals, and the transformation and regression are repeated until no substantial change in the estimated value of \rho is observed, as in the sketch below.
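
A minimal sketch of this iteration, again in Python with NumPy on simulated data (the data-generating values, convergence tolerance, and iteration cap are illustrative assumptions, not part of the original description of the procedure):

  import numpy as np

  # Simulated data as in the earlier sketch (illustrative values).
  rng = np.random.default_rng(0)
  T, alpha_true, beta_true, rho_true = 200, 2.0, 1.5, 0.7
  x = rng.normal(size=T)
  e = rng.normal(scale=0.5, size=T)
  eps = np.zeros(T)
  for t in range(1, T):
      eps[t] = rho_true * eps[t - 1] + e[t]
  y = alpha_true + beta_true * x + eps

  # Step 1: OLS on the untransformed model gives initial residuals.
  X_full = np.column_stack([np.ones(T), x])
  coef, *_ = np.linalg.lstsq(X_full, y, rcond=None)   # coef = (alpha_hat, beta_hat)

  # Step 2+: alternate between estimating rho from lagged residuals and
  # re-fitting (alpha, beta) on the quasi-differenced data.
  rho_hat = 0.0
  for _ in range(100):
      resid = y - X_full @ coef
      # Regress residuals on their own lag (no intercept) to estimate rho.
      rho_new = (resid[:-1] @ resid[1:]) / (resid[:-1] @ resid[:-1])

      y_star = y[1:] - rho_new * y[:-1]
      Z = np.column_stack([(1 - rho_new) * np.ones(T - 1), x[1:] - rho_new * x[:-1]])
      coef, *_ = np.linalg.lstsq(Z, y_star, rcond=None)

      if abs(rho_new - rho_hat) < 1e-6:   # stop once the rho estimate stabilises
          break
      rho_hat = rho_new

  print(rho_hat, coef)

Library routines are also available; for example, the statsmodels package's GLSAR model with its iterative_fit method performs a closely related feasible-GLS iteration for autoregressive errors, although its implementation details differ from the plain sketch above.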

Note, however, that the iterative Cochrane–Orcutt procedure may converge to a local rather than the global minimum of the residual sum of squares.[3][4]

References

  1. Cochrane, D.; Orcutt, G. H. (1949). "Application of Least Squares Regression to Relationships Containing Auto-Correlated Error Terms". Journal of the American Statistical Association. 44 (245): 32–61.
  2. Wooldridge, Jeffrey M. (2013). Introductory Econometrics: A Modern Approach (5th ed.). Mason, OH: South-Western Cengage Learning.
  3. Dufour, J. M.; Gaudry, M. J. I.; Liem, T. C. (1980). "The Cochrane–Orcutt procedure: numerical examples of multiple admissible minima". Economics Letters. 6 (1): 43–48.
  4. Oxley, Leslie T.; Roberts, Colin J. (1982). "Pitfalls in the Application of the Cochrane–Orcutt Technique". Oxford Bulletin of Economics and Statistics. 44 (3): 227–240.
