Recall that one of the assumptions when building a linear regression model is that the errors are independent. This section discusses methods for dealing with dependent errors; in practice, the dependence usually arises from a temporal component. Error terms that are correlated over time are said to be autocorrelated or serially correlated. When the error terms are autocorrelated, ordinary least squares runs into the following problems:
- Estimated regression coefficients are still unbiased, but they no longer have the minimum variance property.
- The MSE may seriously underestimate the true variance of the errors.
- The standard error of the regression coefficients may seriously underestimate the true standard deviation of the estimated regression coefficients.
- Statistical intervals and inference procedures are no longer strictly applicable.
We also consider the setting where a data set has a temporal component that affects the analysis.
Upon completion of this lesson, you should be able to:
- Apply autoregressive models to time series data.
- Interpret partial autocorrelation functions.
- Understand regression with autoregressive errors.
- Test first-order autocorrelation of the regression errors.
- Apply transformation methods to deal with autoregressive errors.
- Forecast using regression with autoregressive errors.
- Understand the purpose behind advanced time series methods.
Topic 2 Code Files
Below is a zip file that contains all the data sets used in this lesson: