Sums of Chi-Square Random Variables


We'll now turn our attention towards applying the theorem and corollary of the previous page to the case in which we have a function involving a sum of independent chi-square random variables. The following theorem is often referred to as the "additive property of independent chi-squares."

Theorem. Let \(X_1, X_2, \ldots, X_n\) denote \(n\) independent random variables that follow these chi-square distributions:

  • \(X_1 \sim \chi^2(r_1)\)
  • \(X_2 \sim \chi^2(r_2)\)
  • \(\vdots\)
  • \(X_n \sim \chi^2(r_n)\)

Then, the sum of the random variables:

\(Y=X_1+X_2+\cdots+X_n\)

follows a chi-square distribution with \(r_1+r_2+\cdots+r_n\) degrees of freedom. That is:

\(Y\sim \chi^2(r_1+r_2+\cdots+r_n)\)

Proof.

Because the \(X_i\) are independent, the moment-generating function of their sum is the product of their individual moment-generating functions. Recalling that the moment-generating function of a chi-square random variable with \(r_i\) degrees of freedom is \((1-2t)^{-r_i/2}\) for \(t<\frac{1}{2}\), we have:

\(M_Y(t)=E\left[e^{tY}\right]=\prod\limits_{i=1}^n M_{X_i}(t)=\prod\limits_{i=1}^n (1-2t)^{-r_i/2}=(1-2t)^{-(r_1+r_2+\cdots+r_n)/2}\)

for \(t<\frac{1}{2}\). We have shown that \(M_Y(t)\) is the moment-generating function of a chi-square random variable with \(r_1 + r_2 + \cdots + r_n\) degrees of freedom. Because a moment-generating function uniquely determines a distribution, it follows that:

\(Y\sim \chi^2(r_1+r_2+\cdots+r_n)\)

as was to be shown.
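
The additive property is also easy to check by simulation. The following is a minimal sketch, not part of the course text; the degrees of freedom 2, 3, and 5, the number of draws, and the use of NumPy and SciPy are all illustrative assumptions.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
r = [2, 3, 5]          # degrees of freedom of the independent chi-squares (illustrative)
n_draws = 100_000

# Y = X_1 + X_2 + X_3, with each X_i ~ chi-square(r_i), drawn independently
y = sum(rng.chisquare(df, size=n_draws) for df in r)

# Compare the simulated Y with the claimed chi-square(r_1 + r_2 + r_3) distribution
print("simulated mean:", y.mean(), "  theoretical mean:", sum(r))
print("simulated variance:", y.var(), "  theoretical variance:", 2 * sum(r))
print(stats.kstest(y, stats.chi2(sum(r)).cdf))

With these degrees of freedom, \(Y\) should have mean \(2+3+5=10\) and variance \(2(10)=20\), and the Kolmogorov–Smirnov statistic comparing the draws with a \(\chi^2(10)\) distribution should be small.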

Theorem. Let \(Z_1, Z_2, \ldots, Z_n\) have standard normal distributions, \(N(0,1)\). If these random variables are independent, then:

\(W=Z^2_1+Z^2_2+\cdots+Z^2_n\)

follows a \(\chi^2(n)\) distribution.

Proof. Recall that if \(Z_i \sim N(0,1)\), then \(Z_i^2 \sim \chi^2(1)\) for \(i = 1, 2, \ldots, n\). Then, by the additive property of independent chi-squares:

\(W=Z^2_1+Z^2_2+\cdots+Z^2_n \sim \chi^2(1+1+\cdots+1)=\chi^2(n)\)

That is, \(W \sim \chi^2(n)\), as was to be proved.
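
As a quick sanity check, this theorem can be simulated as well. Here is a minimal sketch, again with illustrative choices of \(n\), the number of draws, and the NumPy/SciPy tooling.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, n_draws = 4, 100_000

# n independent N(0, 1) draws per row, squared and summed: W = Z_1^2 + ... + Z_n^2
z = rng.standard_normal(size=(n_draws, n))
w = (z ** 2).sum(axis=1)

# W should behave like a chi-square(n) random variable
print("simulated mean:", w.mean(), "  theoretical mean:", n)
print(stats.kstest(w, stats.chi2(n).cdf))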

Corollary. If \(X_1, X_2, \ldots, X_n\) are independent normal random variables with (possibly different) means and variances, that is:

\(X_i \sim N(\mu_i,\sigma^2_i)\)

for \(i = 1, 2, \ldots, n\), then:

\(W=\sum\limits_{i=1}^n \dfrac{(X_i-\mu_i)^2}{\sigma^2_i} \sim \chi^2(n)\)

Proof. Recall that:

\(Z_i=\dfrac{(X_i-\mu_i)}{\sigma_i} \sim N(0,1)\)

Therefore, by the preceding theorem:

\(W=\sum\limits_{i=1}^n Z^2_i=\sum\limits_{i=1}^n \dfrac{(X_i-\mu_i)^2}{\sigma^2_i} \sim \chi^2(n)\)

as was to be proved.
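
The corollary lends itself to the same kind of simulation check. In the sketch below, the particular means \(\mu_i\), standard deviations \(\sigma_i\), and the NumPy/SciPy tooling are illustrative assumptions only; the point is that standardizing each \(X_i\), squaring, and summing produces draws that match a \(\chi^2(n)\) distribution with \(n = 3\).

import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
mu = np.array([1.0, -2.0, 0.5])       # means mu_i (illustrative values)
sigma = np.array([0.5, 2.0, 1.5])     # standard deviations sigma_i (illustrative values)
n_draws = 100_000

# X_i ~ N(mu_i, sigma_i^2), drawn independently; standardize, square, and sum
x = rng.normal(loc=mu, scale=sigma, size=(n_draws, len(mu)))
w = (((x - mu) / sigma) ** 2).sum(axis=1)

# W should behave like a chi-square(n) random variable with n = 3
print("simulated mean:", w.mean(), "  theoretical mean:", len(mu))
print(stats.kstest(w, stats.chi2(len(mu)).cdf))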