9.5 - Multiple Regression Model

Multiple Linear Regression Model

In this section, we (very) briefly discuss Multiple Linear Regression. In practice, it is unusual to have only one predictor variable. In Multiple Linear Regression, there is one quantitative response and more than one predictor or independent variable. The model contains a constant or intercept term, \(\beta_0\), and more than one coefficient, denoted \(\beta_1, …, \beta_k\), where \(k\) is the number of predictors.

Multiple Linear Regression Model
\(Y=\beta_0+\beta_1X_1+...+\beta_kX_k+\epsilon\)

Where \(Y\) is the response variable and \(X_1, …, X_k\) are independent variables. \(\beta_0, \beta_1, …, \beta_k\) are fixed (but unknown) parameters, and \(\epsilon\) is a random variable that is normally distributed with mean 0 and variance \(\sigma^2_\epsilon\).
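
To make the model concrete, here is a minimal sketch in Python that simulates data from a model with two predictors and fits it by ordinary least squares using the statsmodels library. The coefficient values, sample size, and error standard deviation are illustrative assumptions, not part of this lesson.

```python
import numpy as np
import statsmodels.api as sm

# Simulate data from Y = beta_0 + beta_1*X_1 + beta_2*X_2 + epsilon
# with beta_0 = 5, beta_1 = 1.5, beta_2 = -0.8 (illustrative values)
rng = np.random.default_rng(42)
n = 100
X1 = rng.normal(size=n)
X2 = rng.normal(size=n)
epsilon = rng.normal(loc=0, scale=2.0, size=n)   # mean 0, sd 2 (variance sigma^2 = 4)
Y = 5 + 1.5 * X1 - 0.8 * X2 + epsilon

# Fit the multiple linear regression; add_constant supplies the intercept column for beta_0
X = sm.add_constant(np.column_stack([X1, X2]))
fit = sm.OLS(Y, X).fit()

print(fit.params)    # estimated beta_0, beta_1, beta_2 (should be near 5, 1.5, -0.8)
```

The estimated coefficients should land near the values used to generate the data, with the discrepancy driven by the random error term \(\epsilon\).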

F-Test for Overall Significance

There is a statistical test we can use to determine the overall significance of the regression model.

The F-test in Multiple Linear Regression tests the following hypotheses:

\(H_0\colon \beta_1=...=\beta_k=0\)

\(H_a\colon \text{ At least one }\beta_i\text{ is not equal to zero}\)

The test statistic for this test, \(F^*=\frac{MSR}{MSE}\), follows an \(F\) distribution with \(k\) numerator and \(n-k-1\) denominator degrees of freedom when the null hypothesis is true, where \(n\) is the sample size, \(MSR\) is the regression mean square, and \(MSE\) is the error mean square.
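
As a sketch of how \(F^*\) can be computed, the following Python snippet (again using statsmodels on simulated data, with all numbers chosen purely for illustration) forms the regression mean square and the error mean square by hand and compares the resulting statistic and p-value with what the fitted model reports.

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

# Simulated data with k = 2 predictors (coefficient values chosen for illustration)
rng = np.random.default_rng(1)
n, k = 100, 2
X_raw = rng.normal(size=(n, k))
Y = 3 + 2.0 * X_raw[:, 0] + 0.5 * X_raw[:, 1] + rng.normal(size=n)

fit = sm.OLS(Y, sm.add_constant(X_raw)).fit()

# F* = MSR / MSE, with k numerator and n - k - 1 denominator degrees of freedom
msr = fit.mse_model          # regression mean square: SSR / k
mse = fit.mse_resid          # error mean square: SSE / (n - k - 1)
f_star = msr / mse
p_value = stats.f.sf(f_star, k, n - k - 1)   # P(F > F*) under H_0

print(f_star, fit.fvalue)        # hand computation agrees with the reported F statistic
print(p_value, fit.f_pvalue)     # a small p-value leads us to reject H_0: beta_1 = beta_2 = 0
```

A small p-value indicates that at least one of the predictors is useful for explaining the response, which is exactly the alternative hypothesis above.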

We do not expect you to master Multiple Linear Regression in this course. We included it here to show you what you may see in practice. If you are interested in learning more, you can take STAT 501: Regression Methods.