25 The Moment-Generating Function Technique
Overview
In the previous lesson, we learned that the expected value of the sample mean \(\bar{X}\) is the population mean \(\mu\). We also learned that the variance of the sample mean \(\bar{X}\) is \(\dfrac{\sigma^2}{n}\), that is, the population variance divided by the sample size \(n\). We have not yet determined the probability distribution of the sample mean when, say, the random sample comes from a normal distribution with mean \(\mu\) and variance \(\sigma^2\). We are going to tackle that in the next lesson! Before we do, though, we want to put a few more tools into our toolbox.
We have already learned a few techniques for finding the probability distribution of a function of random variables, namely the distribution function technique and the change-of-variable technique. In this lesson, we'll learn yet another technique called the moment-generating function technique. We'll use the technique in this lesson to learn, among other things, the distribution of sums of chi-square random variables. Then, in the next lesson, we'll use the technique to find (finally) the probability distribution of the sample mean when the random sample comes from a normal distribution with mean \(\mu\) and variance \(\sigma^2\).
Objectives
Upon completion of this lesson, you should be able to:
- recall the uniqueness property of moment-generating functions.
- calculate the moment-generating function of a linear combination of \(n\) independent random variables.
- calculate the moment-generating function of a linear combination of \(n\) independent and identically distributed random variables.
- understand the additive property of independent chi-square random variables.
- use the moment-generating function technique to prove the additive property of independent chi-square random variables.
- understand the steps involved in each of the proofs in the lesson.
- apply the methods learned in the lesson to new problems.
25.1 Uniqueness Property of MGFs
Recall that the moment-generating function:
\[ M_X(t)=E(e^{tX}) \]
uniquely defines the distribution of a random variable. That is, if you can show that the moment-generating function of \(\bar{X}\) is the same as some known moment-generating function, then \(\bar{X}\) follows the same distribution. So, one strategy for finding the distribution of a function of random variables is the following (a short numerical sketch of the strategy appears just after the list):
- To find the moment-generating function of the function of random variables
- To compare the calculated moment-generating function to known moment-generating functions
- If the calculated moment-generating function is the same as some known moment-generating function of \(X\), then the function of the random variables follows the same probability distribution as \(X\)
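Here is a minimal numerical sketch of that strategy, not part of the original lesson, assuming Python with NumPy is available. It estimates the moment-generating function of \(Y=X_1+X_2\), where \(X_1\) and \(X_2\) are independent exponential random variables with \(\theta=1\) (that is, gamma with \(\alpha=1\)), by averaging \(e^{tY}\) over simulated draws, and compares the estimates with the known gamma \(\alpha=2\), \(\theta=1\) moment-generating function \(1/(1-t)^2\).
```python
import numpy as np

rng = np.random.default_rng(0)

# Y = X1 + X2 with X1, X2 independent Exponential(theta=1), i.e. Gamma(alpha=1, theta=1).
n_draws = 200_000
y = rng.exponential(scale=1.0, size=n_draws) + rng.exponential(scale=1.0, size=n_draws)

# Step 1: estimate the MGF of Y, M_Y(t) = E(e^{tY}), at a few values of t.
t_grid = np.array([-0.5, -0.2, 0.1, 0.2, 0.3])
estimated_mgf = np.array([np.exp(t * y).mean() for t in t_grid])

# Step 2: compare with the known MGF of a Gamma(alpha=2, theta=1): 1/(1-t)^2 for t < 1.
known_mgf = 1.0 / (1.0 - t_grid) ** 2

# Step 3: the two columns agree (up to simulation error), suggesting Y ~ Gamma(2, 1).
print(np.column_stack([t_grid, estimated_mgf, known_mgf]))
```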
Example 25.1 
In the previous lesson, we looked at an example that involved tossing a penny three times and letting \(X_1\) denote the number of heads that we get in the three tosses. In the same example, we suggested tossing a second penny two times and letting \(X_2\) denote the number of heads we get in those two tosses. We let:
\[ Y=X_1+X_2 \]
denote the number of heads in five tosses. What is the probability distribution of \(Y\)?
Solution
We know that:
- \(X_1\) is a binomial random variable with \(n=3\) and \(p=\frac{1}{2}\)
- \(X_2\) is a binomial random variable with \(n=2\) and \(p=\frac{1}{2}\)
Therefore, based on what we know of the moment-generating function of a binomial random variable, the moment-generating function of \(X_1\) is:
\[ M_{X_1}(t)=\left(\dfrac{1}{2}+\dfrac{1}{2} e^t\right)^3 \]
And, similarly, the moment-generating function of \(X_2\) is:
\[ M_{X_2}(t)=\left(\dfrac{1}{2}+\dfrac{1}{2} e^t\right)^2 \]
Now, because \(X_1\) and \(X_2\) are independent random variables, the random variable \(Y\) is the sum of independent random variables. Therefore, the moment-generating function of \(Y\) is:
\[ M_Y(t)=E(e^{tY})=E(e^{t(X_1+X_2)})=E(e^{tX_1} \cdot e^{tX_2} )=E(e^{tX_1}) \cdot E(e^{tX_2} ) \]
The first equality comes from the definition of the moment-generating function of the random variable \(Y\). The second equality comes from the definition of \(Y\). The third equality comes from the properties of exponents. And, the fourth equality comes from the expectation of the product of functions of independent random variables. Now, substituting in the known moment-generating functions of \(X_1\) and \(X_2\), we get:
\[ M_Y(t)=\left(\dfrac{1}{2}+\dfrac{1}{2} e^t\right)^3 \cdot \left(\dfrac{1}{2}+\dfrac{1}{2} e^t\right)^2=\left(\dfrac{1}{2}+\dfrac{1}{2} e^t\right)^5 \]
That is, \(Y\) has the same moment-generating function as a binomial random variable with \(n=5\) and \(p=\frac{1}{2}\). Therefore, by the uniqueness property of moment-generating functions, \(Y\) must be a binomial random variable with \(n=5\) and \(p=\frac{1}{2}\). (Of course, we already knew that!)
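As a quick check (not part of the original lesson, and assuming SciPy is available), we can convolve the two binomial probability mass functions exactly and confirm that the result matches the binomial distribution with \(n=5\) and \(p=\frac{1}{2}\) predicted by the moment-generating function argument.
```python
import numpy as np
from scipy.stats import binom

# Exact pmf of Y = X1 + X2 by convolving the two binomial pmfs (valid by independence).
pmf_x1 = binom.pmf(np.arange(4), n=3, p=0.5)   # X1 ~ Binomial(3, 1/2), support 0..3
pmf_x2 = binom.pmf(np.arange(3), n=2, p=0.5)   # X2 ~ Binomial(2, 1/2), support 0..2
pmf_y = np.convolve(pmf_x1, pmf_x2)            # support 0, 1, ..., 5

# Compare with the Binomial(5, 1/2) pmf predicted by the MGF argument.
pmf_bin5 = binom.pmf(np.arange(6), n=5, p=0.5)
print(np.allclose(pmf_y, pmf_bin5))            # True
```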
It seems that we could generalize the way in which we calculated, in the above example, the moment-generating function of \(Y\), the sum of two independent random variables. Indeed, we can! In the next section!
25.2 MGFs of Linear Combinations
The theorem we'll rely on in this section says that if \(X_1, X_2, \ldots, X_n\) are independent random variables, then the linear combination \(Y=a_1X_1+a_2X_2+\cdots+a_nX_n\) has moment-generating function:
\[ M_Y(t)=\prod_{i=1}^n M_{X_i}(a_i t) \]
Its corollary covers the case of a random sample, that is, independent and identically distributed random variables with common moment-generating function \(M_X(t)\): the sum \(Y=X_1+X_2+\cdots+X_n\) has moment-generating function \([M_X(t)]^n\), and the sample mean \(\bar{X}\) has moment-generating function \(\left[M_X\!\left(\dfrac{t}{n}\right)\right]^n\).
Example 25.2
Let \(X_1, X_2\), and \(X_3\) denote a random sample of size 3 from a gamma distribution with \(\alpha=7\) and \(\theta=5\). Let \(Y\) be the sum of the three random variables:
\[ Y=X_1+X_2+X_3 \]
What is the distribution of \(Y\)?
Solution
The moment-generating function of a gamma random variable \(X\) with \(\alpha=7\) and \(\theta=5\) is:
\[ M_X(t)=\dfrac{1}{(1-5t)^7} \]
for \(t<\frac{1}{5}\). Therefore, the corollary tells us that the moment-generating function of \(Y\) is:
\[ M_Y(t)=[M_{X_1}(t)]^3=\left(\dfrac{1}{(1-5t)^7}\right)^3=\dfrac{1}{(1-5t)^{21}} \]
for \(t<\frac{1}{5}\), which is the moment-generating function of a gamma random variable with \(\alpha=21\) and \(\theta=5\). Therefore, \(Y\) must follow a gamma distribution with \(\alpha=21\) and \(\theta=5\).
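Not part of the original lesson: a small simulation sketch, assuming NumPy and SciPy are available, that checks this conclusion by drawing sums of three independent gamma \(\alpha=7\), \(\theta=5\) random variables and testing them against the gamma \(\alpha=21\), \(\theta=5\) distribution.
```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Y = X1 + X2 + X3, each Xi ~ Gamma(alpha=7, theta=5).
# In NumPy's parameterization, "shape" is alpha and "scale" is theta.
n_draws = 100_000
y = rng.gamma(shape=7, scale=5, size=(n_draws, 3)).sum(axis=1)

# The MGF argument says Y ~ Gamma(alpha=21, theta=5); check with a KS test.
ks = stats.kstest(y, 'gamma', args=(21, 0, 5))   # (shape, loc, scale)
print(ks.pvalue)   # a non-small p-value is consistent with Y ~ Gamma(21, theta=5)
```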
What is the distribution of the sample mean \(\bar{X}\)?
Solution
Again, the moment-generating function of a gamma random variable \(X\) with \(\alpha=7\) and \(\theta=5\) is:
\[ M_X(t)=\dfrac{1}{(1-5t)^7} \]
for \(t<\frac{1}{5}\). Therefore, the corollary tells us that the moment-generating function of \(\bar{X}\) is:
\[ M_{\bar{X}}(t)=\left[M_{X_1}\left(\dfrac{t}{3}\right)\right]^3=\left(\dfrac{1}{(1-5(t/3))^7}\right)^3=\dfrac{1}{(1-(5/3)t)^{21}} \]
for \(t<\frac{3}{5}\), which is the moment-generating function of a gamma random variable with \(\alpha=21\) and \(\theta=\frac{5}{3}\). Therefore, \(\bar{X}\) must follow a gamma distribution with \(\alpha=21\) and \(\theta=\frac{5}{3}\).
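Again as an optional check, not part of the original lesson and assuming NumPy is available, we can simulate the sample mean of three gamma \(\alpha=7\), \(\theta=5\) random variables and compare its empirical mean and variance with those of a gamma \(\alpha=21\), \(\theta=\frac{5}{3}\) random variable.
```python
import numpy as np

rng = np.random.default_rng(2)

# Sample mean of three independent Gamma(alpha=7, theta=5) random variables.
xbar = rng.gamma(shape=7, scale=5, size=(100_000, 3)).mean(axis=1)

# The MGF argument says X-bar ~ Gamma(alpha=21, theta=5/3), whose mean is
# alpha*theta = 35 and whose variance is alpha*theta^2 = 21*(5/3)^2 = 58.33...
print(xbar.mean(), xbar.var())   # should be close to 35 and 58.33
```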
25.3 Sums of Chi-Square Random Variables
We’ll now turn our attention toward applying the theorem and corollary of the previous section to the case in which we have a function involving a sum of independent chi-square random variables. The following theorem is often referred to as the “additive property of independent chi-squares.”
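Ahead of the formal statement, here is a quick numerical illustration of what the additive property asserts, not part of the original lesson and assuming NumPy and SciPy are available: the sum of independent chi-square random variables with \(r_1\) and \(r_2\) degrees of freedom behaves like a chi-square random variable with \(r_1+r_2\) degrees of freedom. The degrees of freedom used here (3 and 5) are arbitrary choices for illustration.
```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Sum of independent chi-square(3) and chi-square(5) random variables.
w = rng.chisquare(df=3, size=100_000) + rng.chisquare(df=5, size=100_000)

# The additive property predicts W ~ chi-square(3 + 5 = 8); check with a KS test.
print(stats.kstest(w, 'chi2', args=(8,)).pvalue)   # non-small p-value supports chi-square(8)
```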