25.2 - M.G.F.s of Linear Combinations

Theorem Section

If \(X_1, X_2, \ldots, X_n\) are \(n\) independent random variables with respective moment-generating functions \(M_{X_i}(t)=E(e^{tX_i})\) for \(i=1, 2, \ldots, n\), then the moment-generating function of the linear combination:

\(Y=\sum\limits_{i=1}^n a_i X_i\)

is:

\(M_Y(t)=\prod\limits_{i=1}^n M_{X_i}(a_it)\)

Proof

The proof is very similar to the calculation we made in the example on the previous page. That is:

\begin{align} M_Y(t) &= E[e^{tY}]\\ &= E[e^{t(a_1X_1+a_2X_2+\ldots+a_nX_n)}]\\ &= E[e^{a_1tX_1}e^{a_2tX_2}\ldots e^{a_ntX_n}]\\ &= E[e^{a_1tX_1}]E[e^{a_2tX_2}]\ldots E[e^{a_ntX_n}]\\ &= M_{X_1}(a_1t)M_{X_2}(a_2t)\ldots M_{X_n}(a_nt)\\ &= \prod\limits_{i=1}^n M_{X_i}(a_it)\\ \end{align}

The first equality comes from the definition of the moment-generating function of the random variable \(Y\). The second equality comes from the given definition of \(Y\). The third equality comes from the properties of exponents. The fourth equality comes from the fact that the expectation of a product of functions of independent random variables is the product of the expectations. The fifth equality comes from the definition of the moment-generating function of the random variables \(X_i\), for \(i=1, 2, \ldots, n\). And, the sixth equality comes from using product notation to write the product of the moment-generating functions.
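The identity \(M_Y(t)=\prod_{i=1}^n M_{X_i}(a_it)\) can be sanity-checked numerically. The following sketch (not part of the original text) uses NumPy to compare a Monte Carlo estimate of \(E[e^{tY}]\) against the product of the component MGFs for two independent normal random variables; the specific distributions, coefficients, and the value of \(t\) are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
t = 0.1

# Two independent normals: X1 ~ N(0, 1), X2 ~ N(1, 4)  (illustrative choices)
x1 = rng.normal(0.0, 1.0, n)
x2 = rng.normal(1.0, 2.0, n)

# Linear combination Y = 2*X1 + 3*X2
a1, a2 = 2.0, 3.0
y = a1 * x1 + a2 * x2

# MGF of N(mu, sigma^2): M(t) = exp(mu*t + sigma^2 * t^2 / 2)
def normal_mgf(t, mu, sigma):
    return np.exp(mu * t + sigma**2 * t**2 / 2)

empirical = np.mean(np.exp(t * y))                  # Monte Carlo estimate of E[e^{tY}]
theoretical = normal_mgf(a1 * t, 0.0, 1.0) * normal_mgf(a2 * t, 1.0, 2.0)

print(empirical, theoretical)  # the two values should agree closely
```

With a million draws the empirical MGF typically matches the product \(M_{X_1}(a_1t)M_{X_2}(a_2t)\) to within a fraction of a percent.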

While the theorem is useful in its own right, the following corollary is perhaps even more useful when dealing not just with independent random variables, but also with random variables that are identically distributed — two characteristics that we get, of course, when we take a random sample.

Corollary

If \(X_1, X_2, \ldots, X_n\) are observations of a random sample from a population (distribution) with moment-generating function \(M(t)\), then:

  1. The moment-generating function of the linear combination \(Y=\sum\limits_{i=1}^n X_i\) is \(M_Y(t)=\prod\limits_{i=1}^n M(t)=[M(t)]^n\).
  2. The moment-generating function of the sample mean \(\bar{X}=\sum\limits_{i=1}^n \left(\dfrac{1}{n}\right) X_i\) is \(M_{\bar{X}}(t)=\prod\limits_{i=1}^n M\left(\dfrac{t}{n}\right)=\left[M\left(\dfrac{t}{n}\right)\right]^n\).

Proof

  1. Use the preceding theorem with \(a_i=1\) for \(i=1, 2, \ldots, n\).
  2. Use the preceding theorem with \(a_i=\frac{1}{n}\) for \(i=1, 2, \ldots, n\).
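Both parts of the corollary can be illustrated with a quick simulation (a sketch, not part of the original text). Here a random sample of size \(n\) is drawn from an exponential population, whose MGF \(M(t)=\frac{1}{1-\theta t}\) for \(t<\frac{1}{\theta}\) is known in closed form; the population, \(\theta\), \(n\), and \(t\) are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
reps, n = 1_000_000, 5
t = 0.1

# Random samples of size n from an Exponential(theta = 2) population
theta = 2.0
x = rng.exponential(theta, size=(reps, n))

total = x.sum(axis=1)    # Y = sum of the X_i
xbar = x.mean(axis=1)    # sample mean

def mgf(t):
    return 1.0 / (1.0 - theta * t)   # exponential MGF, valid for t < 1/theta

emp_sum = np.mean(np.exp(t * total))   # Monte Carlo estimate of M_Y(t)
emp_mean = np.mean(np.exp(t * xbar))   # Monte Carlo estimate of M_Xbar(t)

print(emp_sum, mgf(t)**n)              # part 1: should match [M(t)]^n
print(emp_mean, mgf(t / n)**n)         # part 2: should match [M(t/n)]^n
```

The empirical MGF of the sum tracks \([M(t)]^n\) and that of the sample mean tracks \([M(t/n)]^n\), as the corollary predicts.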

Example 25-2 Section

Let \(X_1, X_2\), and \(X_3\) denote a random sample of size 3 from a gamma distribution with \(\alpha=7\) and \(\theta=5\). Let \(Y\) be the sum of the three random variables:

\(Y=X_1+X_2+X_3\)

What is the distribution of \(Y\)?

Solution

The moment-generating function of a gamma random variable \(X\) with \(\alpha=7\) and \(\theta=5\) is:

\(M_X(t)=\dfrac{1}{(1-5t)^7}\)

for \(t<\frac{1}{5}\). Therefore, the corollary tells us that the moment-generating function of \(Y\) is:

\(M_Y(t)=[M_{X_1}(t)]^3=\left(\dfrac{1}{(1-5t)^7}\right)^3=\dfrac{1}{(1-5t)^{21}}\)

for \(t<\frac{1}{5}\), which is the moment-generating function of a gamma random variable with \(\alpha=21\) and \(\theta=5\). Therefore, \(Y\) must follow a gamma distribution with \(\alpha=21\) and \(\theta=5\).
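The conclusion that \(Y\) is gamma with \(\alpha=21\) and \(\theta=5\) can be checked by simulation (a sketch, not part of the original text), comparing a Monte Carlo estimate of \(E[e^{tY}]\) against \(\frac{1}{(1-5t)^{21}}\); the value \(t=0.02\) is an arbitrary choice inside the region \(t<\frac{1}{5}\).

```python
import numpy as np

rng = np.random.default_rng(2)
reps = 1_000_000

# Three independent Gamma(alpha = 7, theta = 5) draws per replication
x = rng.gamma(shape=7, scale=5, size=(reps, 3))
y = x.sum(axis=1)                      # Y = X1 + X2 + X3

t = 0.02                               # any t < 1/5 works
empirical = np.mean(np.exp(t * y))     # Monte Carlo estimate of M_Y(t)
theoretical = 1.0 / (1.0 - 5 * t)**21  # MGF of Gamma(alpha = 21, theta = 5)

print(empirical, theoretical)
print(y.mean())                        # should be near alpha*theta = 21*5 = 105
```

The simulated mean of \(Y\) also lands near \(\alpha\theta = 21 \times 5 = 105\), consistent with the gamma distribution identified above.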

What is the distribution of the sample mean \(\bar{X}\)?

Solution

Again, the moment-generating function of a gamma random variable \(X\) with \(\alpha=7\) and \(\theta=5\) is:

\(M_X(t)=\dfrac{1}{(1-5t)^7}\)

for \(t<\frac{1}{5}\). Therefore, the corollary tells us that the moment-generating function of \(\bar{X}\) is:

\(M_{\bar{X}}(t)=\left[M_{X_1}\left(\dfrac{t}{3}\right)\right]^3=\left(\dfrac{1}{(1-5(t/3))^7}\right)^3=\dfrac{1}{(1-(5/3)t)^{21}}\)

for \(t<\frac{3}{5}\), which is the moment-generating function of a gamma random variable with \(\alpha=21\) and \(\theta=\frac{5}{3}\). Therefore, \(\bar{X}\) must follow a gamma distribution with \(\alpha=21\) and \(\theta=\frac{5}{3}\).
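As a final check (a sketch, not part of the original text), the first two moments of the simulated sample mean can be compared against those of a gamma distribution with \(\alpha=21\) and \(\theta=\frac{5}{3}\), namely mean \(\alpha\theta=35\) and variance \(\alpha\theta^2=\frac{175}{3}\approx 58.33\).

```python
import numpy as np

rng = np.random.default_rng(3)
reps = 1_000_000

# Samples of size 3 from Gamma(alpha = 7, theta = 5)
x = rng.gamma(shape=7, scale=5, size=(reps, 3))
xbar = x.mean(axis=1)                   # the sample mean X-bar

# Gamma(alpha = 21, theta = 5/3): mean = alpha*theta, variance = alpha*theta^2
alpha, theta = 21, 5 / 3
print(xbar.mean(), alpha * theta)       # should be near 35
print(xbar.var(), alpha * theta**2)     # should be near 58.33
```

Both moments agree with the Gamma(\(\alpha=21\), \(\theta=\frac{5}{3}\)) values, in line with the MGF argument.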