9.4 - Moment Generating Functions

Moment generating functions (mgfs) are functions of \(t\). You can find the mgf of a random variable by using the definition of the expectation of a function of a random variable. The moment generating function of \(X\) is

\(M_X(t)=E\left[e^{tX}\right]=E\left[\exp(tX)\right] \)

Note that \(\exp(X)\) is another way of writing \(e^X\).
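
To make the definition concrete, here is a small Python sketch (an addition to this lesson, using the sympy library) that builds the mgf of a Bernoulli(\(p\)) random variable directly from \(E\left[e^{tX}\right]=\sum_x e^{tx}f_X(x)\). The symbols `t` and `p` are introduced just for this illustration; the last line uses the fact, applied later in this section, that \(M^\prime(0)=E(X)\).

```python
import sympy as sp

t, p = sp.symbols('t p', positive=True)

# Bernoulli(p): X = 1 with probability p, X = 0 with probability 1 - p.
# By the definition, M_X(t) = E[e^{tX}] = sum over the support of e^{tx} * f_X(x).
M = sp.exp(0 * t) * (1 - p) + sp.exp(1 * t) * p

print(sp.simplify(M))               # 1 - p + p*exp(t)
print(sp.diff(M, t).subs(t, 0))     # p, i.e., M'(0) = E(X) for a Bernoulli(p)
```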

Besides helping to find moments, the moment generating function has an important property often called the uniqueness property: if the mgf exists for a random variable, then there is one and only one distribution associated with that mgf. In other words, the mgf uniquely determines the distribution of a random variable.

Suppose we have the following mgf for a random variable \(Y\)

\(M_Y(t)=\dfrac{e^t}{4-3e^t}, \;\; t<-\ln(0.75)\)

Using the information in this section, we can find \(E(Y^k)\) for any \(k\), provided the expectation exists. Let's find \(E(Y)\) and \(E(Y^2)\).

We can solve these in a couple of ways.

  1. We can use the fact that \(M^\prime(0)=E(Y)\) and \(M^{\prime\prime}(0)=E(Y^2)\), and then find the variance using \(Var(Y)=E(Y^2)-E(Y)^2\). This is left as an exercise below; a symbolic check also appears in the sketch after this list.
  2. We can recognize this as the moment generating function of a geometric random variable with \(p=\frac{1}{4}\). It is also the mgf of a negative binomial random variable with \(r=1\) and \(p=\frac{1}{4}\). Since \(Y\) is negative binomial, we know \(E(Y)=\mu=\frac{r}{p}=\frac{1}{1/4}=4\) and \(Var(Y)=\frac{r(1-p)}{p^2}=12\). We can then use the formula \(Var(Y)=E(Y^2)-E(Y)^2\) to find \(E(Y^2)\) by

\(E(Y^2)=Var(Y)+E(Y)^2=12+(4)^2=12+16=28\)
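
The following Python sketch (an addition to the text, using sympy for symbolic differentiation) carries out the first approach and confirms the second: differentiating \(M_Y(t)\) and evaluating at \(t=0\) reproduces \(E(Y)=4\), \(E(Y^2)=28\), and \(Var(Y)=12\).

```python
import sympy as sp

t = sp.symbols('t')

# The given mgf: M_Y(t) = e^t / (4 - 3e^t), valid for t < -ln(0.75).
M = sp.exp(t) / (4 - 3 * sp.exp(t))

EY  = sp.diff(M, t, 1).subs(t, 0)    # M'(0)  = E(Y)
EY2 = sp.diff(M, t, 2).subs(t, 0)    # M''(0) = E(Y^2)

print(EY)             # 4
print(EY2)            # 28
print(EY2 - EY**2)    # 12 = Var(Y)
```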

Additional Practice Problems

  1. Let \(X\) be a binomial random variable with parameters \(n\) and \(p\). What value of \(p\) maximizes \(P(X=k)\) for \(k=0, 1, \ldots, n\)? This is an example of a statistical method used to estimate \(p\) when a binomial random variable equals \(k\). If we assume that \(n\) is known, then we estimate \(p\) by choosing the value of \(p\) that maximizes \(f_X(k)=P(X=k)\). This is known as the method of maximum likelihood. Maximum likelihood estimates are discussed in more detail in STAT 415. When we are trying to find the maximum with respect to \(p\), it often helps to maximize the natural log of \(f_X(k)\) instead; because the logarithm is an increasing function, the same value of \(p\) maximizes both.
    NOTE! Statisticians use the notation \(\log\) when referring to \(\ln\) or \(\log_e\).

    \begin{align} P(X=x)&=f_X(x)={n\choose x}p^x(1-p)^{n-x}\\ \ln f_X(x)&=\ln {n\choose x}+x\ln p +(n-x)\ln (1-p)\\ \frac{\partial \ln f_X(x)}{\partial p}&=\frac{x}{p}-\frac{n-x}{1-p}=\frac{x(1-p)-p(n-x)}{p(1-p)}\\ \Rightarrow 0&=\frac{x(1-p)-p(n-x)}{p(1-p)} \Rightarrow 0=x(1-p)-p(n-x)=x-xp-np+xp=x-np\\ \Rightarrow x&=np \Rightarrow \hat{p}=\frac{x}{n}\end{align}

    We use \(\hat{p}\) to denote the estimate of \(p\). This estimate makes sense: if \(X\) is the number of successes out of \(n\) trials, then a natural estimate of \(p=P(\text{success})\) is the observed proportion of successes, that is, the number of successes divided by the total number of trials.
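
    As a numerical companion to the derivation above, this Python sketch (an addition; the values \(n=10\) and \(x=3\) are made up for illustration) maximizes the binomial log-likelihood with scipy and compares the answer to the closed form \(\hat{p}=x/n\).

    ```python
    from scipy.optimize import minimize_scalar
    from scipy.stats import binom

    n, x = 10, 3   # hypothetical data: 3 successes observed in 10 trials

    # Negative log-likelihood of p given the observed count x; minimizing this
    # is equivalent to maximizing f_X(x) = P(X = x) as a function of p.
    def neg_loglik(p):
        return -binom.logpmf(x, n, p)

    result = minimize_scalar(neg_loglik, bounds=(1e-9, 1 - 1e-9), method='bounded')

    print(result.x)   # approximately 0.3
    print(x / n)      # the closed-form maximum likelihood estimate, 0.3
    ```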

  2. Suppose that \(Y\) has the following mgf.

    \(M_Y(t)=\dfrac{e^t}{4-3e^t}, \;\; t<-\ln(0.75)\)

    1. Find \(E(Y)\).

      \(\begin{array}{l}M^{\prime}(t)=e^t\left(4-3e^t\right)^{-1}+3e^{2t}\left(4-3e^t\right)^{-2}\\ E(Y)=M^{\prime}(0)=1+3=4\end{array}\)

    2. Find \(E(Y^2)\).

      \(\begin{array}{l}M^{\prime\prime}(t)=e^t(4-3e^t)^{-1}+3e^{2t}(4-3e^t)^{-2}+6e^{2t}(4-3e^t)^{-2}+18e^{3t}(4-3e^t)^{-3}\\ E(Y^2)=M^{\prime\prime}(0)=1+3+6+18=28\end{array}\)
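
    A quick numerical cross-check of both parts (an addition to the text): summing against the geometric pmf \(f_Y(y)=\frac{1}{4}\left(\frac{3}{4}\right)^{y-1}\), in the trials-until-first-success parameterization identified earlier, over a long truncation of the support reproduces the derivative calculations.

    ```python
    import numpy as np

    p = 0.25
    y = np.arange(1, 2001)            # truncate the support {1, 2, ...}; the tail is negligible
    pmf = p * (1 - p) ** (y - 1)      # geometric pmf corresponding to the given mgf

    print(np.sum(y * pmf))       # ~4,  matching E(Y)   = M'(0)
    print(np.sum(y**2 * pmf))    # ~28, matching E(Y^2) = M''(0)
    ```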