Proposition Section
If a moment-generating function exists for a random variable \(X\), then:
- The mean of \(X\) can be found by evaluating the first derivative of the moment-generating function at \(t=0\). That is, \(\mu=E(X)=M'(0)\).
- The variance of \(X\) can be found by evaluating the first and second derivatives of the moment-generating function at \(t=0\). That is, \(\sigma^2=E(X^2)-[E(X)]^2=M''(0)-[M'(0)]^2\).
Before we prove the above proposition, recall that \(E(X), E(X^2), \ldots, E(X^r)\) are called moments about the origin. It is for this reason, and the above proposition, that the function \(M(t)\) is called a moment-generating function. That is, \(M(t)\) generates moments! The proposition actually doesn't tell the whole story. In fact, in general the \(r^{th}\) moment about the origin can be found by evaluating the \(r^{th}\) derivative of the moment-generating function at \(t=0\). That is:
\(M^{(r)}(0)=E(X^r)\)
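As a concrete check of this fact (not part of the original notes, and using a fair six-sided die purely as an illustrative example), the sketch below uses SymPy to confirm that derivatives of \(M(t)\) at \(t=0\) reproduce the moments:

```python
from sympy import symbols, exp, Rational, diff

t = symbols('t')

# MGF of a fair six-sided die: M(t) = sum over x=1..6 of e^{tx} * (1/6)
M = sum(exp(t * x) * Rational(1, 6) for x in range(1, 7))

# First derivative at t=0 gives the mean: E(X) = 7/2
mean = diff(M, t).subs(t, 0)

# Second derivative at t=0 gives the second moment: E(X^2) = 91/6
second = diff(M, t, 2).subs(t, 0)

# Variance via the proposition: M''(0) - [M'(0)]^2 = 35/12
variance = second - mean**2

print(mean, second, variance)
```

The same pattern works for any discrete distribution whose MGF exists: the \(r^{th}\) derivative at \(t=0\) returns \(E(X^r)\).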
Now, let's prove the proposition.
Proof
We begin the proof by recalling that the moment-generating function is defined as follows:
\(M(t)=E(e^{tX})=\sum\limits_{x\in S} e^{tx} f(x)\)
And, by definition, \(M(t)\) is finite on some interval of \(t\) around 0. That tells us two things:
- Derivatives of all orders exist at \(t=0\).
- It is okay to interchange differentiation and summation.
That said, we can now work on the gory details of the proof. Differentiating \(M(t)\) term by term with respect to \(t\), we get:
\(M'(t)=\sum\limits_{x\in S} x e^{tx} f(x)\)
And, setting \(t=0\), we get:
\(M'(0)=\sum\limits_{x\in S} x f(x)=E(X)\)
Differentiating once more:
\(M''(t)=\sum\limits_{x\in S} x^2 e^{tx} f(x)\)
so that:
\(M''(0)=\sum\limits_{x\in S} x^2 f(x)=E(X^2)\)
Therefore, \(\sigma^2=E(X^2)-[E(X)]^2=M''(0)-[M'(0)]^2\), as claimed.
Example 9-2 Section
Use the moment-generating function for a binomial random variable \(X\):
\(M(t)=[(1-p)+p e^t]^n\)
to find the mean \(\mu\) and variance \(\sigma^2\) of a binomial random variable.
Solution
Keeping in mind that we need to take the first derivative of \(M(t)\) with respect to \(t\), we get:
\(M'(t)=n[1-p+pe^t]^{n-1} (pe^t)\)
And, setting \(t=0\), we get the binomial mean \(\mu=np\):
\(M'(0)=n[(1-p)+p]^{n-1}(pe^0)=n(1)^{n-1}p=np\)
To find the variance, we first need to take the second derivative of \(M(t)\) with respect to \(t\). Doing so, we get:
\(M''(t)=n[1-p+pe^t]^{n-1} (pe^t)+(pe^t) n(n-1)[1-p+pe^t]^{n-2} (pe^t)\)
And, setting \(t=0\), we get:
\(M''(0)=np+n(n-1)p^2\)
Then, using the formula for the variance, we get the binomial variance \(\sigma^2=np(1-p)\):
\(\sigma^2=M''(0)-[M'(0)]^2=np+n(n-1)p^2-(np)^2=np-np^2=np(1-p)\)
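The differentiation in this example can be checked symbolically. The sketch below (assuming SymPy is available) recovers \(\mu=np\) and \(\sigma^2=np(1-p)\) directly from the binomial moment-generating function:

```python
from sympy import symbols, exp, diff, simplify, factor

t, n, p = symbols('t n p')

# Binomial MGF: M(t) = [(1-p) + p e^t]^n
M = ((1 - p) + p * exp(t))**n

# Mean: mu = M'(0), which simplifies to n*p
mu = simplify(diff(M, t).subs(t, 0))

# Variance: sigma^2 = M''(0) - [M'(0)]^2,
# which is algebraically equal to n*p*(1 - p)
sigma2 = factor(simplify(diff(M, t, 2).subs(t, 0) - mu**2))

print(mu, sigma2)
```

Letting the computer algebra system carry the product-rule bookkeeping makes it easy to verify hand derivations like the one above.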
Not only can a moment-generating function be used to find moments of a random variable, it can also be used to identify which probability mass function a random variable follows.
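To illustrate that idea (a sketch, not part of the original notes, with \(n=4\) chosen arbitrarily): if we compute \(E(e^{tX})\) directly from the binomial p.m.f. and it matches the closed form \([(1-p)+pe^t]^n\), the random variable must follow that binomial distribution:

```python
from sympy import symbols, exp, binomial, expand

t, p = symbols('t p')
n = 4  # a small fixed n for the symbolic check

# Build E(e^{tX}) directly from the binomial pmf f(x) = C(n,x) p^x (1-p)^{n-x}
M_from_pmf = sum(exp(t * x) * binomial(n, x) * p**x * (1 - p)**(n - x)
                 for x in range(n + 1))

# The known binomial MGF in closed form
M_closed = ((1 - p) + p * exp(t))**n

# The difference expands to 0, so the two MGFs agree identically
print(expand(M_from_pmf - M_closed))
```

Because a moment-generating function (when it exists) uniquely determines the distribution, matching an MGF to a known form identifies the p.m.f.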