11: Multiple Linear Regression


Learning objectives for this lesson

Upon completion of this lesson, you should be able to do the following:

  • Understand the differences between simple and multiple linear regression
  • Interpret the coefficients in a multiple linear regression model
  • Conduct t-tests for the individual slope estimates
  • Learn how to include an indicator, or dummy, variable in a regression model

Multiple Linear Regression

In simple linear regression we consider only one predictor variable. When we include more than one predictor variable, we have a multiple linear regression model. This model is an extension of the simple model: we now estimate a parameter (i.e., a slope coefficient) for each predictor variable in the model. As with simple linear regression, we have one Y, or response variable (also called the dependent variable), but now we have more than one X variable, also called explanatory, independent, or predictor variables. The multiple linear regression model is as follows:

\(Y=\beta_0+\beta_1 X_1+\ldots+\beta_k X_k +\epsilon\)

where Y is the response variable and \(X_1, \ldots, X_k\) are the predictor variables. \(\beta_0, \beta_1, \ldots, \beta_k\) are fixed (but unknown) parameters, and \(\epsilon\) is a random error term that is normally distributed with mean 0 and variance \(\sigma^2_\epsilon\).
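As a quick illustration (not part of the original lesson), the model above can be fit by ordinary least squares. The sketch below uses simulated data with two predictors; the true parameter values, sample size, and noise level are all arbitrary choices made for the example:

```python
import numpy as np

# Simulate data from Y = beta0 + beta1*X1 + beta2*X2 + epsilon,
# with (hypothetical) true parameters beta0 = 1.0, beta1 = 2.0, beta2 = -0.5.
rng = np.random.default_rng(0)
n = 200
X1 = rng.normal(size=n)
X2 = rng.normal(size=n)
Y = 1.0 + 2.0 * X1 - 0.5 * X2 + rng.normal(scale=0.3, size=n)

# Design matrix: a column of ones for the intercept beta0,
# then one column per predictor variable.
X = np.column_stack([np.ones(n), X1, X2])

# Least-squares estimates of (beta0, beta1, beta2)
beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(beta_hat)
```

With a reasonably large sample, the fitted coefficients in `beta_hat` should land close to the true values used in the simulation, which is the sense in which the slope estimates recover the unknown parameters.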