Lesson 25: The Moment-Generating Function Technique


Introduction

In the previous lesson, we learned that the expected value of the sample mean \(\bar{X}\) is the population mean \(\mu\). We also learned that the variance of the sample mean \(\bar{X}\) is \(\sigma^2/n\), that is, the population variance divided by the sample size \(n\). We have not yet determined the probability distribution of the sample mean when, say, the random sample comes from a normal distribution with mean \(\mu\) and variance \(\sigma^2\). We are going to tackle that in the next lesson! Before we do, though, we want to put a few more tools into our toolbox. We have already learned a few techniques for finding the probability distribution of a function of random variables, namely the distribution function technique and the change-of-variable technique. In this lesson, we'll learn yet another technique called the moment-generating function technique. We'll use the technique in this lesson to learn, among other things, the distribution of sums of chi-square random variables. Then, in the next lesson, we'll use the technique to find (finally) the probability distribution of the sample mean when the random sample comes from a normal distribution with mean \(\mu\) and variance \(\sigma^2\).
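As a quick numerical preview of the chi-square result mentioned above, the following sketch (illustrative only; the sample size, seed, and degrees of freedom 3 and 5 are our own choices, not part of the lesson) simulates the sum of independent chi-square(3) and chi-square(5) random variables and checks that it behaves like a chi-square(8) random variable, whose moment-generating function is \((1-2t)^{-8/2}\) for \(t < 1/2\):

```python
import math
import random

random.seed(42)

def chi_square_sample(df):
    # A chi-square random variable with df degrees of freedom is the
    # sum of df squared independent standard normal random variables.
    return sum(random.gauss(0.0, 1.0) ** 2 for _ in range(df))

# Simulate Y = X1 + X2 with X1 ~ chi-square(3) and X2 ~ chi-square(5),
# independently; the additive property says Y ~ chi-square(8).
n = 100_000
y = [chi_square_sample(3) + chi_square_sample(5) for _ in range(n)]

# A chi-square(8) random variable has mean 8 and variance 2*8 = 16.
mean_y = sum(y) / n
var_y = sum((v - mean_y) ** 2 for v in y) / (n - 1)

# Empirical MGF E[e^{tY}] at t = 0.1, versus the theoretical
# chi-square(8) MGF value (1 - 2t)^{-8/2} = 0.8^{-4}.
t = 0.1
mgf_emp = sum(math.exp(t * v) for v in y) / n
mgf_theory = (1 - 2 * t) ** (-8 / 2)

print(round(mean_y, 2), round(var_y, 2))
print(round(mgf_emp, 3), round(mgf_theory, 3))
```

With a large sample, the empirical mean, variance, and MGF value all land close to the chi-square(8) theoretical values, which is the behavior the moment-generating function technique lets us prove exactly.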

Objectives

  • To refresh our memory of the uniqueness property of moment-generating functions.
  • To learn how to calculate the moment-generating function of a linear combination of n independent random variables.
  • To learn how to calculate the moment-generating function of a linear combination of n independent and identically distributed random variables.
  • To learn the additive property of independent chi-square random variables.
  • To use the moment-generating function technique to prove the additive property of independent chi-square random variables.
  • To understand the steps involved in each of the proofs in the lesson.
  • To be able to apply the methods learned in the lesson to new problems.
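The additive property named in the objectives can be previewed with a short moment-generating-function calculation (a sketch of the argument the lesson develops, using two standard facts: independence lets the MGF of a sum factor into a product, and a chi-square(\(r\)) random variable has MGF \((1-2t)^{-r/2}\)):

```latex
% Suppose X_1, \ldots, X_n are independent with X_i \sim \chi^2(r_i),
% and let Y = X_1 + \cdots + X_n. By independence:
M_Y(t) = \prod_{i=1}^{n} M_{X_i}(t)
       = \prod_{i=1}^{n} (1-2t)^{-r_i/2}
       = (1-2t)^{-(r_1 + \cdots + r_n)/2}, \qquad t < \tfrac{1}{2}
% This is the MGF of a chi-square random variable with
% r_1 + \cdots + r_n degrees of freedom, so by the uniqueness
% property of MGFs, Y \sim \chi^2(r_1 + \cdots + r_n).
```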