M.G.F.s of Linear Combinations

Theorem. If \(X_1, X_2, \ldots, X_n\) are \(n\) independent random variables with respective moment-generating functions \(M_{X_i}(t)=E(e^{tX_i})\) for \(i = 1, 2, \ldots, n\), then the moment-generating function of the linear combination:

\(Y=\sum\limits_{i=1}^n a_i X_i\)

is:

 \(M_Y(t)=\prod\limits_{i=1}^n M_{X_i}(a_it)\)

 Proof. The proof is very similar to the calculation we made in the example on the previous page. That is:

\begin{align}
M_Y(t) &= E[e^{tY}]\\
&= E[e^{t(a_1X_1+a_2X_2+\ldots+a_nX_n)}]\\
&= E[e^{a_1tX_1}e^{a_2tX_2}\ldots e^{a_ntX_n}]\\
&= E[e^{a_1tX_1}]E[e^{a_2tX_2}]\ldots E[e^{a_ntX_n}]\\
&= M_{X_1}(a_1t)M_{X_2}(a_2t)\ldots M_{X_n}(a_nt)\\
&= \prod\limits_{i=1}^n M_{X_i}(a_it)
\end{align}

The first equality comes from the definition of the moment-generating function of the random variable \(Y\). The second equality comes from the given definition of \(Y\). The third equality comes from the properties of exponents. The fourth equality comes from the fact that the expectation of a product of functions of independent random variables is the product of the expectations. The fifth equality comes from the definition of the moment-generating function of the random variables \(X_i\), for \(i = 1, 2, \ldots, n\). And, the sixth equality comes from using product notation to write the product of the moment-generating functions.
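Before moving on, here is a minimal numerical sketch of the theorem in Python (an added illustration; the two distributions, the coefficients, and the value of \(t\) are arbitrary choices of ours, not from the original text). It compares a Monte Carlo estimate of \(E[e^{tY}]\) with the product \(M_{X_1}(a_1t)M_{X_2}(a_2t)\) for two independent variables whose MGFs are known in closed form:

# Numerical sketch of the theorem with two independent variables:
# X1 ~ Exponential(mean 1), whose MGF is 1/(1 - t) for t < 1, and
# X2 ~ Normal(0, 1), whose MGF is exp(t^2 / 2).
import numpy as np

rng = np.random.default_rng(seed=0)
n_draws = 1_000_000
a1, a2, t = 2.0, 3.0, 0.1          # chosen so that a1*t < 1 and both MGFs exist

x1 = rng.exponential(scale=1.0, size=n_draws)
x2 = rng.standard_normal(n_draws)
y = a1 * x1 + a2 * x2              # the linear combination Y

mc_estimate = np.exp(t * y).mean()                            # Monte Carlo E[exp(tY)]
closed_form = (1 / (1 - a1 * t)) * np.exp((a2 * t) ** 2 / 2)  # M_X1(a1*t) * M_X2(a2*t)
print(mc_estimate, closed_form)    # the two numbers should agree closely

With a million draws, the two printed values should agree closely, just as the theorem predicts.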

While the theorem is useful in its own right, the following corollary is perhaps even more useful when the random variables are not only independent but also identically distributed, two characteristics that we get, of course, when we take a random sample.

Corollary. If \(X_1, X_2, \ldots, X_n\) are observations of a random sample from a population (distribution) with moment-generating function \(M(t)\), then:

(1) the moment-generating function of the linear combination \(Y=\sum\limits_{i=1}^n X_i\) is \(M_Y(t)=\prod\limits_{i=1}^n M(t)=[M(t)]^n\).

(2) the moment-generating function of the sample mean \(\bar{X}=\sum\limits_{i=1}^n \left(\dfrac{1}{n}\right) X_i\) is \(M_{\bar{X}}(t)=\prod\limits_{i=1}^n M\left(\dfrac{t}{n}\right)=\left[M\left(\dfrac{t}{n}\right)\right]^n\).

Proof. For (1), use the preceding theorem with \(a_i = 1\) for \(i = 1, 2, \ldots, n\). And for (2), use the preceding theorem with \(a_i = 1/n\) for \(i = 1, 2, \ldots, n\).
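For a concrete instance of part (1) (a quick added illustration, not part of the original argument): if each \(X_i\) is a Bernoulli trial with success probability \(p\), so that \(M(t)=1-p+pe^t\), then the corollary gives:

\(M_Y(t)=\left(1-p+pe^t\right)^n\)

which is the moment-generating function of a binomial random variable with parameters \(n\) and \(p\). That is, the corollary recovers the familiar fact that the sum of \(n\) independent Bernoulli trials is binomial.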

Example

Let \(X_1\), \(X_2\), and \(X_3\) denote a random sample of size 3 from a gamma distribution with α = 7 and θ = 5. Let \(Y\) be the sum of the three random variables:

\(Y=X_1+X_2+X_3\)

What is the distribution of \(Y\)?

Solution. The moment-generating function of a gamma random variable \(X\) with α = 7 and θ = 5 is:

\(M_X(t)=\dfrac{1}{(1-5t)^7}\)

for t < 1/5. Therefore, the corollary tells us that the moment-generating function of Y is:

\(M_Y(t)=[M_{X_1}(t)]^3=\left(\dfrac{1}{(1-5t)^7}\right)^3=\dfrac{1}{(1-5t)^{21}}\)

for t < 1/5, which is the moment-generating function of a gamma random variable with α = 21 and θ = 5. Therefore, Y must follow a gamma distribution with α = 21 and θ = 5. 
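As an optional numerical check (an added sketch using NumPy and SciPy; the seed and sample size are arbitrary choices), we can simulate \(Y\) and compare it with the claimed gamma distribution. In SciPy's parameterization, the shape argument a plays the role of α and scale plays the role of θ:

# Simulate Y = X1 + X2 + X3 with each Xi ~ Gamma(alpha = 7, theta = 5) and
# compare against the claimed Gamma(alpha = 21, theta = 5) distribution.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=0)
y = rng.gamma(shape=7, scale=5, size=(100_000, 3)).sum(axis=1)

print(y.mean())  # should be near the theoretical mean alpha * theta = 21 * 5 = 105

# Kolmogorov-Smirnov test against Gamma(21, 5); a large p-value is
# consistent with Y following that distribution.
print(stats.kstest(y, stats.gamma(a=21, scale=5).cdf).pvalue)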

What is the distribution of the sample mean \(\bar{X}\)?

Solution. Again, the moment-generating function of a gamma random variable \(X\) with α = 7 and θ = 5 is:

\(M_X(t)=\dfrac{1}{(1-5t)^7}\)

for t < 1/5. Therefore, the corollary tells us that the moment-generating function of \(\bar{X}\) is:

\(M_{\bar{X}}(t)=\left[M_{X_1}\left(\dfrac{t}{3}\right)\right]^3=\left(\dfrac{1}{(1-5(t/3))^7}\right)^3=\dfrac{1}{(1-(5/3)t)^{21}}\)

for t < 3/5, which is the moment-generating function of a gamma random variable with α = 21 and θ = 5/3. Therefore, \(\bar{X}\) must follow a gamma distribution with α = 21 and θ = 5/3.
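The same kind of simulation check works for the sample mean (again an added sketch, with arbitrary seed and sample size): the average of the three draws should behave like a Gamma(α = 21, θ = 5/3) random variable.

# Simulate the sample mean of three Gamma(alpha = 7, theta = 5) draws and
# compare against the claimed Gamma(alpha = 21, theta = 5/3) distribution.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=1)
xbar = rng.gamma(shape=7, scale=5, size=(100_000, 3)).mean(axis=1)

print(xbar.mean())  # should be near the theoretical mean alpha * theta = 21 * (5/3) = 35
print(stats.kstest(xbar, stats.gamma(a=21, scale=5 / 3).cdf).pvalue)  # large p-value expected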