Lesson 25: The Moment-Generating Function Technique

Overview

In the previous lesson, we learned that the expected value of the sample mean \(\bar{X}\) is the population mean \(\mu\). We also learned that the variance of the sample mean \(\bar{X}\) is \(\dfrac{\sigma^2}{n}\), that is, the population variance divided by the sample size \(n\). We have not yet determined the probability distribution of the sample mean when, say, the random sample comes from a normal distribution with mean \(\mu\) and variance \(\sigma^2\). We are going to tackle that in the next lesson! Before we do that, though, we are going to want to put a few more tools into our toolbox. We already have learned a few techniques for finding the probability distribution of a function of random variables, namely the distribution function technique and the change-of-variable technique. In this lesson, we'll learn yet another technique called the moment-generating function technique. We'll use the technique in this lesson to learn, among other things, the distribution of sums of chi-square random variables. Then, in the next lesson, we'll use the technique to find (finally) the probability distribution of the sample mean when the random sample comes from a normal distribution with mean \(\mu\) and variance \(\sigma^2\).

Objectives

Upon completion of this lesson, you should be able to:

  • To refresh our memory of the uniqueness property of moment-generating functions.
  • To learn how to calculate the moment-generating function of a linear combination of \(n\) independent random variables.
  • To learn how to calculate the moment-generating function of a linear combination of \(n\) independent and identically distributed random variables.
  • To learn the additive property of independent chi-square random variables.
  • To use the moment-generating function technique to prove the additive property of independent chi-square random variables.
  • To understand the steps involved in each of the proofs in the lesson.
  • To be able to apply the methods learned in the lesson to new problems.

25.1 - Uniqueness Property of M.G.F.s

Recall that the moment generating function:

\(M_X(t)=E(e^{tX})\)

uniquely defines the distribution of a random variable. That is, if you can show that the moment-generating function of \(\bar{X}\) is the same as some known moment-generating function, then \(\bar{X}\) follows the same distribution. So, one strategy for finding the distribution of a function of random variables is:

  1. To find the moment-generating function of the function of random variables
  2. To compare the calculated moment-generating function to known moment-generating functions
  3. If the calculated moment-generating function is the same as some known moment-generating function of \(X\), then the function of the random variables follows the same probability distribution as \(X\)

Example 25-1

In the previous lesson, we looked at an example that involved tossing a penny three times and letting \(X_1\) denote the number of heads that we get in the three tosses. In the same example, we suggested tossing a second penny two times and letting \(X_2\) denote the number of heads we get in those two tosses. We let:

\(Y=X_1+X_2\)

denote the number of heads in five tosses. What is the probability distribution of \(Y\)?

Solution

We know that:

  • \(X_1\) is a binomial random variable with \(n=3\) and \(p=\frac{1}{2}\)
  • \(X_2\) is a binomial random variable with \(n=2\) and \(p=\frac{1}{2}\)

Therefore, based on what we know of the moment-generating function of a binomial random variable, the moment-generating function of \(X_1\) is:

\(M_{X_1}(t)=\left(\dfrac{1}{2}+\dfrac{1}{2} e^t\right)^3\)

And, similarly, the moment-generating function of \(X_2\) is:

\(M_{X_2}(t)=\left(\dfrac{1}{2}+\dfrac{1}{2} e^t\right)^2\)

Now, because \(X_1\) and \(X_2\) are independent random variables, the random variable \(Y\) is the sum of independent random variables. Therefore, the moment-generating function of \(Y\) is:

\(M_Y(t)=E(e^{tY})=E(e^{t(X_1+X_2)})=E(e^{tX_1} \cdot e^{tX_2} )=E(e^{tX_1}) \cdot E(e^{tX_2} )\)

The first equality comes from the definition of the moment-generating function of the random variable \(Y\). The second equality comes from the definition of \(Y\). The third equality comes from the properties of exponents. And, the fourth equality comes from the expectation of the product of functions of independent random variables. Now, substituting in the known moment-generating functions of \(X_1\) and \(X_2\), we get:

\(M_Y(t)=\left(\dfrac{1}{2}+\dfrac{1}{2} e^t\right)^3 \cdot \left(\dfrac{1}{2}+\dfrac{1}{2} e^t\right)^2=\left(\dfrac{1}{2}+\dfrac{1}{2} e^t\right)^5\)

That is, \(Y\) has the same moment-generating function as a binomial random variable with \(n=5\) and \(p=\frac{1}{2}\). Therefore, by the uniqueness properties of moment-generating functions, \(Y\) must be a binomial random variable with \(n=5\) and \(p=\frac{1}{2}\). (Of course, we already knew that!)
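
By the way, if you'd like to check this algebra symbolically, here is a minimal sketch using Python's sympy library (an illustrative choice, not part of the lesson). It follows the three-step strategy above: write down the moment-generating functions of \(X_1\) and \(X_2\), multiply them, and compare the product to the known moment-generating function of a binomial random variable with \(n=5\) and \(p=\frac{1}{2}\).

```python
from sympy import symbols, exp, simplify, Rational

t = symbols('t')

# Step 1: m.g.f.s of X1 ~ b(3, 1/2) and X2 ~ b(2, 1/2)
M_X1 = (Rational(1, 2) + Rational(1, 2) * exp(t))**3
M_X2 = (Rational(1, 2) + Rational(1, 2) * exp(t))**2

# Step 2: m.g.f. of Y = X1 + X2 is the product, by independence
M_Y = M_X1 * M_X2

# Step 3: compare with the m.g.f. of a b(5, 1/2) random variable
M_b5 = (Rational(1, 2) + Rational(1, 2) * exp(t))**5
print(simplify(M_Y - M_b5))   # prints 0, so the two m.g.f.s agree
```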

It seems that we could generalize the way in which we calculated, in the above example, the moment-generating function of \(Y\), the sum of two independent random variables. Indeed, we can! On the next page!


25.2 - M.G.F.s of Linear Combinations

Theorem

If \(X_1, X_2, \ldots, X_n\) are \(n\) independent random variables with respective moment-generating functions \(M_{X_i}(t)=E(e^{tX_i})\) for \(i=1, 2, \ldots, n\), then the moment-generating function of the linear combination:

\(Y=\sum\limits_{i=1}^n a_i X_i\)

is:

\(M_Y(t)=\prod\limits_{i=1}^n M_{X_i}(a_it)\)

Proof

The proof is very similar to the calculation we made in the example on the previous page. That is:

\begin{align} M_Y(t) &= E[e^{tY}]\\ &= E[e^{t(a_1X_1+a_2X_2+\ldots+a_nX_n)}]\\ &= E[e^{a_1tX_1}]E[e^{a_2tX_2}]\ldots E[e^{a_ntX_n}]\\ &= M_{X_1}(a_1t)M_{X_2}(a_2t)\ldots M_{X_n}(a_nt)\\ &= \prod\limits_{i=1}^n M_{X_i}(a_it)\\ \end{align}

The first equality comes from the definition of the moment-generating function of the random variable \(Y\). The second equality comes from the given definition of \(Y\). The third equality comes from the properties of exponents, as well as from the expectation of the product of functions of independent random variables. The fourth equality comes from the definition of the moment-generating function of the random variables \(X_i\), for \(i=1, 2, \ldots, n\). And, the fifth equality comes from using product notation to write the product of the moment-generating functions.

While the theorem is useful in its own right, the following corollary is perhaps even more useful when dealing not just with independent random variables, but also with random variables that are identically distributed (two characteristics that we get, of course, when we take a random sample).

Corollary

If \(X_1, X_2, \ldots, X_n\) are observations of a random sample from a population (distribution) with moment-generating function \(M(t)\), then:

  1. The moment generating function of the linear combination \(Y=\sum\limits_{i=1}^n X_i\) is \(M_Y(t)=\prod\limits_{i=1}^n M(t)=[M(t)]^n\).
  2. The moment generating function of the sample mean \(\bar{X}=\sum\limits_{i=1}^n \left(\dfrac{1}{n}\right) X_i\) is \(M_{\bar{X}}(t)=\prod\limits_{i=1}^n M\left(\dfrac{t}{n}\right)=\left[M\left(\dfrac{t}{n}\right)\right]^n\).

Proof

  1. use the preceding theorem with \(a_i=1\) for \(i=1, 2, \ldots, n\)
  2. use the preceding theorem with \(a_i=\frac{1}{n}\) for \(i=1, 2, \ldots, n\)
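
As a quick numerical sanity check of part (2) of the corollary, the sketch below uses a Monte Carlo approximation with an arbitrarily chosen exponential population, whose moment-generating function is \(M(t)=(1-\theta t)^{-1}\) for \(t<\frac{1}{\theta}\). It estimates \(E(e^{t\bar{X}})\) by simulation and compares the estimate to \(\left[M\left(\dfrac{t}{n}\right)\right]^n\). The population, sample size \(n\), and value of \(t\) are choices made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

theta, n, t = 2.0, 5, 0.05           # illustrative choices; need t/n < 1/theta
reps = 200_000

# Simulate many sample means from an exponential population with mean theta
samples = rng.exponential(scale=theta, size=(reps, n))
xbar = samples.mean(axis=1)

# Monte Carlo estimate of E[exp(t * Xbar)]
mc_estimate = np.mean(np.exp(t * xbar))

# Corollary: M_Xbar(t) = [M(t/n)]^n with M(s) = 1 / (1 - theta*s)
formula = (1.0 / (1.0 - theta * t / n))**n

print(mc_estimate, formula)          # the two values should be close
```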

Example 25-2

Let \(X_1, X_2\), and \(X_3\) denote a random sample of size 3 from a gamma distribution with \(\alpha=7\) and \(\theta=5\). Let \(Y\) be the sum of the three random variables:

\(Y=X_1+X_2+X_3\)

What is the distribution of \(Y\)?

Solution

The moment-generating function of a gamma random variable \(X\) with \(\alpha=7\) and \(\theta=5\) is:

\(M_X(t)=\dfrac{1}{(1-5t)^7}\)

for \(t<\frac{1}{5}\). Therefore, the corollary tells us that the moment-generating function of \(Y\) is:

\(M_Y(t)=[M_{X_1}(t)]^3=\left(\dfrac{1}{(1-5t)^7}\right)^3=\dfrac{1}{(1-5t)^{21}}\)

for \(t<\frac{1}{5}\), which is the moment-generating function of a gamma random variable with \(\alpha=21\) and \(\theta=5\). Therefore, \(Y\) must follow a gamma distribution with \(\alpha=21\) and \(\theta=5\).
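
If you'd like to see this result borne out numerically, here is a minimal simulation sketch using numpy and scipy (note that the scale parameter in these libraries plays the role of \(\theta\); the number of replications is an arbitrary choice):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Y = X1 + X2 + X3, with each Xi ~ gamma(alpha = 7, theta = 5)
reps = 100_000
y = rng.gamma(shape=7, scale=5, size=(reps, 3)).sum(axis=1)

# Compare the simulated sums with a gamma(alpha = 21, theta = 5) distribution
ks = stats.kstest(y, stats.gamma(a=21, scale=5).cdf)
print(y.mean(), 21 * 5)     # sample mean of Y vs. theoretical mean alpha*theta = 105
print(ks.pvalue)            # a large p-value is consistent with the claimed distribution
```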

What is the distribution of the sample mean \(\bar{X}\)?

Solution

Again, the moment-generating function of a gamma random variable \(X\) with \(\alpha=7\) and \(\theta=5\) is:

\(M_X(t)=\dfrac{1}{(1-5t)^7}\)

for \(t<\frac{1}{5}\). Therefore, the corollary tells us that the moment-generating function of \(\bar{X}\) is:

\(M_{\bar{X}}(t)=\left[M_{X_1}\left(\dfrac{t}{3}\right)\right]^3=\left(\dfrac{1}{(1-5(t/3))^7}\right)^3=\dfrac{1}{(1-(5/3)t)^{21}}\)

for \(t<\frac{3}{5}\), which is the moment-generating function of a gamma random variable with \(\alpha=21\) and \(\theta=\frac{5}{3}\). Therefore, \(\bar{X}\) must follow a gamma distribution with \(\alpha=21\) and \(\theta=\frac{5}{3}\).
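
And a similar quick check that \(\bar{X}\) behaves like a gamma random variable with \(\alpha=21\) and \(\theta=\frac{5}{3}\), whose mean and variance are \(\alpha\theta=35\) and \(\alpha\theta^2=\frac{175}{3}\):

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample means of random samples of size 3 from a gamma(alpha = 7, theta = 5) population
reps = 100_000
xbar = rng.gamma(shape=7, scale=5, size=(reps, 3)).mean(axis=1)

# Compare with the mean and variance of a gamma(alpha = 21, theta = 5/3) random variable
print(xbar.mean(), 21 * (5 / 3))       # both close to 35
print(xbar.var(), 21 * (5 / 3)**2)     # both close to 175/3, approximately 58.33
```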


25.3 - Sums of Chi-Square Random Variables

We'll now turn our attention towards applying the theorem and corollary of the previous page to the case in which we have a function involving a sum of independent chi-square random variables. The following theorem is often referred to as the "additive property of independent chi-squares."

Theorem

Let \(X_i\) denote \(n\) independent random variables that follow these chi-square distributions:

  • \(X_1 \sim \chi^2(r_1)\)
  • \(X_2 \sim \chi^2(r_2)\)
  • \(\vdots\)
  • \(X_n \sim \chi^2(r_n)\)

Then, the sum of the random variables:

\(Y=X_1+X_2+\cdots+X_n\)

follows a chi-square distribution with \(r_1+r_2+\ldots+r_n\) degrees of freedom. That is:

\(Y\sim \chi^2(r_1+r_2+\cdots+r_n)\)

Proof

Recall that the moment-generating function of a chi-square random variable with \(r\) degrees of freedom is \((1-2t)^{-r/2}\) for \(t<\frac{1}{2}\). Because \(X_1, X_2, \ldots, X_n\) are independent, the theorem on the previous page (with \(a_i=1\) for all \(i\)) tells us that the moment-generating function of \(Y\) is:

\(M_Y(t)=\prod\limits_{i=1}^n M_{X_i}(t)=\prod\limits_{i=1}^n (1-2t)^{-r_i/2}=(1-2t)^{-(r_1+r_2+\cdots+r_n)/2}\)

for \(t<\frac{1}{2}\). We have shown that \(M_Y(t)\) is the moment-generating function of a chi-square random variable with \(r_1+r_2+\cdots+r_n\) degrees of freedom. Therefore, by the uniqueness property of moment-generating functions:

\(Y\sim \chi^2(r_1+r_2+\cdots+r_n)\)

as was to be shown.
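
The same kind of symbolic check used earlier confirms that the product of chi-square moment-generating functions collapses as claimed. Here is a minimal sympy sketch, with three degrees-of-freedom values chosen only for illustration:

```python
from sympy import symbols, simplify, Rational

t = symbols('t')

# m.g.f. of a chi-square(r) random variable: (1 - 2t)^(-r/2), for t < 1/2
def chisq_mgf(r):
    return (1 - 2 * t)**Rational(-r, 2)

# Degrees of freedom chosen just for illustration
r1, r2, r3 = 2, 3, 7

product = chisq_mgf(r1) * chisq_mgf(r2) * chisq_mgf(r3)
combined = chisq_mgf(r1 + r2 + r3)

print(simplify(product - combined))   # prints 0, so the m.g.f.s agree
```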

Theorem

Let \(Z_1, Z_2, \ldots, Z_n\) have standard normal distributions, \(N(0,1)\). If these random variables are independent, then:

\(W=Z^2_1+Z^2_2+\cdots+Z^2_n\)

follows a \(\chi^2(n)\) distribution.

Proof

Recall that if \(Z_i\sim N(0,1)\), then \(Z_i^2\sim \chi^2(1)\) for \(i=1, 2, \ldots, n\). Then, by the additive property of independent chi-squares:

\(W=Z^2_1+Z^2_2+\cdots+Z^2_n \sim \chi^2(1+1+\cdots+1)=\chi^2(n)\)

That is, \(W\sim \chi^2(n)\), as was to be proved.
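
A brief Monte Carlo illustration of this theorem (the value of \(n\) and the number of replications are arbitrary choices):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# W = Z1^2 + ... + Zn^2 for independent standard normal Z's
n, reps = 4, 100_000
w = (rng.standard_normal(size=(reps, n))**2).sum(axis=1)

# Compare the simulated W's with a chi-square(n) distribution
print(w.mean(), n)                                   # a chi-square(n) has mean n
print(stats.kstest(w, stats.chi2(df=n).cdf).pvalue)  # large p-value supports chi2(n)
```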

Corollary

Suppose \(X_1, X_2, \ldots, X_n\) are independent normal random variables with (possibly) different means and variances, that is:

\(X_i \sim N(\mu_i,\sigma^2_i)\)

for \(i=1, 2, \ldots, n\). Then:

\(W=\sum\limits_{i=1}^n \dfrac{(X_i-\mu_i)^2}{\sigma^2_i} \sim \chi^2(n)\)

Proof

Recall that:

\(Z_i=\dfrac{(X_i-\mu_i)}{\sigma_i} \sim N(0,1)\)

Therefore:

\(W=\sum\limits_{i=1}^n Z^2_i=\sum\limits_{i=1}^n \dfrac{(X_i-\mu_i)^2}{\sigma^2_i} \sim \chi^2(n)\)

as was to be proved.
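
And a corresponding sketch for the corollary, with deliberately unequal means and variances chosen only for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Illustrative, deliberately unequal means and standard deviations
mu = np.array([1.0, -2.0, 0.5])
sigma = np.array([0.5, 3.0, 1.5])
n = len(mu)

reps = 100_000
x = rng.normal(loc=mu, scale=sigma, size=(reps, n))

# W = sum of squared standardized deviations
w = (((x - mu) / sigma)**2).sum(axis=1)

print(stats.kstest(w, stats.chi2(df=n).cdf).pvalue)  # large p-value supports chi2(3)
```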

