11.5 - Key Properties of a Negative Binomial Random Variable

Theorem

Just as we did for a geometric random variable, on this page we present and verify four properties of a negative binomial random variable.

The probability mass function:

\(f(x)=P(X=x)=\dbinom{x-1}{r-1} (1-p)^{x-r} p^r, \quad x=r, r+1, r+2, \ldots\)

for a negative binomial random variable \(X\) is a valid p.m.f.

Proof

Before we start the "official" proof, it is helpful to take note of the sum of a negative binomial series:

\((1-w)^{-r}=\sum\limits_{k=0}^\infty \dbinom{k+r-1}{r-1} w^k\)

for \(|w|<1\).

Now, for the proof. Each \(f(x)\) is clearly nonnegative, so we need only show that the probabilities sum to 1. Since \(p^r\) does not depend on \(x\), it can be pulled through the summation:

\(\sum\limits_{x=r}^\infty \dbinom{x-1}{r-1}(1-p)^{x-r}p^r=p^r \sum\limits_{x=r}^\infty \dbinom{x-1}{r-1}(1-p)^{x-r}\)

Now, let \(k=x-r\), so that \(x=k+r\). Changing the index on the summation, we get:

\(p^r \sum\limits_{k=0}^\infty \dbinom{k+r-1}{r-1}(1-p)^k\)

We should be able to recognize the summation as a negative binomial series with \(w=1-p\), and \(0<1-p<1\). Using what we know about the sum of a negative binomial series, we get:

\(p^r \sum\limits_{k=0}^\infty \dbinom{k+r-1}{r-1}(1-p)^k=p^r[1-(1-p)]^{-r}=p^r \cdot p^{-r}=1\)

as was to be shown.
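
Because the tail of the series decays geometrically, the result is also easy to check numerically. Below is a minimal sketch in Python; the values \(r=3\) and \(p=0.4\) and the truncation point are arbitrary illustrative choices, not part of the theorem.

```python
# Numeric sanity check that the negative binomial p.m.f. sums to 1.
from math import comb

r, p = 3, 0.4  # arbitrary illustrative values

# Sum f(x) = C(x-1, r-1) (1-p)^(x-r) p^r over a long truncated range;
# the tail decays geometrically, so the truncation error is negligible.
total = sum(comb(x - 1, r - 1) * (1 - p) ** (x - r) * p ** r
            for x in range(r, 2000))
print(total)  # ~1.0
```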

Theorem

The moment generating function of a negative binomial random variable \(X\) is:

\(M(t)=E(e^{tX})=\dfrac{(pe^t)^r}{[1-(1-p)e^t]^r}\)

for \((1-p)e^t<1\).

Proof

As always, the moment generating function is defined as the expected value of \(e^{tX}\). In the case of a negative binomial random variable, the m.g.f. is then:

\(M(t)=E(e^{tX})=\sum\limits_{x=r}^\infty e^{tx} \dbinom{x-1}{r-1} (1-p)^{x-r} p^r \)

Now, it's just a matter of massaging the summation in order to get a working formula. We start by effectively multiplying the summands by 1, and thereby not changing the overall sum:

\(M(t)=E(e^{tX})=\sum\limits_{x=r}^\infty e^{tx} \dbinom{x-1}{r-1} (1-p)^{x-r} p^r \times \dfrac{(e^t)^r}{(e^t)^r}\)

Now, since \(p^r\) and \((e^t)^r\) do not depend on \(x\), they can be pulled through the summation. And, since the \((e^t)^r\) that remains sits in the denominator, it can get moved into the numerator by writing it as \((e^t)^{-r}\):

\(M(t)=E(e^{tX})=p^r(e^t)^r \sum\limits_{x=r}^\infty e^{tx} \dbinom{x-1}{r-1} (1-p)^{x-r} (e^t)^{-r} \)

Now, the \(p^r\) and \((e^t)^r\) can be pulled together as \((pe^t)^r\). And, \(e^{tx}\) and \((e^t)^{-r}\) can be pulled together to get \((e^t)^{x-r}\):

\(M(t)=E(e^{tX})=(pe^t)^r \sum\limits_{x=r}^\infty \dbinom{x-1}{r-1} (1-p)^{x-r} (e^t)^{x-r} \)

And, \((1-p)^{x-r}\) and \((e^t)^{x-r}\) can be pulled together to get \([(1-p)e^t]^{x-r}\):

\(M(t)=E(e^{tX})=(pe^t)^r \sum\limits_{x=r}^\infty \dbinom{x-1}{r-1} [(1-p)e^t]^{x-r}\)

Now, let \(k=x-r\), so that \(x=k+r\). Changing the index on the summation, we get:

\(M(t)=E(e^{tX})=(pe^t)^r \sum\limits_{k=0}^\infty \dbinom{k+r-1}{r-1}[(1-p)e^t]^k\)

Now, we should be able to recognize the summation as a negative binomial series with \(w=(1-p)e^t\). Using what we know about the sum of a negative binomial series, the m.g.f. is then:

\(M(t)=E(e^{tX})=(pe^t)^r [1-(1-p)e^t]^{-r}\)

which can be rewritten as:

\(M(t)=E(e^{tX})=\dfrac{(pe^t)^r}{[1-(1-p)e^t]^r}\)

Now, recall that the m.g.f. exists only if it is finite. So, all we need to do is note when \(M(t)\) is finite. Well, that happens when \((1-p)e^t<1\), or equivalently when \(t<-\ln (1-p)\). And the proof is complete...whewwww!
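
As a sanity check on both the closed form and the convergence condition, we can compare the closed-form m.g.f. against the defining sum, truncated at a large \(x\). This is a sketch only: \(r=3\), \(p=0.4\), and \(t=0.2\) are arbitrary values satisfying \(t<-\ln(1-p)\), and the series is summed in the \((pe^t)^r\dbinom{x-1}{r-1}w^{x-r}\) form derived above so that no intermediate term overflows.

```python
# Numerical check of the closed-form m.g.f. against the defining sum E(e^{tX}).
from math import comb, exp, log

r, p, t = 3, 0.4, 0.2               # arbitrary illustrative values
assert t < -log(1 - p)              # region where M(t) is finite

w = (1 - p) * exp(t)                # w < 1, so the tail of the sum vanishes
# Sum the series in the grouped form: (pe^t)^r * C(x-1, r-1) * w^(x-r).
direct = sum((p * exp(t)) ** r * comb(x - 1, r - 1) * w ** (x - r)
             for x in range(r, 5000))
closed = (p * exp(t)) ** r / (1 - w) ** r
print(direct, closed)               # the two values should agree closely
```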

Theorem

The mean of a negative binomial random variable \(X\) is:

\(\mu=E(X)=\dfrac{r}{p}\)

Proof

Since we know the m.g.f. of a negative binomial random variable, the easiest route to the mean is to differentiate the m.g.f. and evaluate the derivative at \(t=0\). Writing the m.g.f. as \(M(t)=(pe^t)^r[1-(1-p)e^t]^{-r}\) and using the product rule, the first derivative is:

\(M'(t)=r(pe^t)^{r-1}(pe^t)[1-(1-p)e^t]^{-r}+(pe^t)^r(-r)[1-(1-p)e^t]^{-r-1}[-(1-p)e^t]\)

Now, evaluating the derivative at \(t=0\), and recalling that \(e^0=1\) and \(1-(1-p)=p\), we get:

\(M'(0)=rp^r\cdot p^{-r}+rp^r(1-p)p^{-r-1}=r+\dfrac{r(1-p)}{p}=\dfrac{rp+r(1-p)}{p}=\dfrac{r}{p}\)

Therefore, \(\mu=E(X)=M'(0)=\dfrac{r}{p}\), as was to be shown.
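
If you'd rather not push the derivative through by hand, a computer algebra system can confirm the result. Here is a minimal sketch using sympy (an assumed dependency, not something the text itself uses):

```python
# Symbolic spot-check that M'(0) = r/p, using sympy (assumed available).
import sympy as sp

r, p, t = sp.symbols('r p t', positive=True)
M = (p * sp.exp(t)) ** r / (1 - (1 - p) * sp.exp(t)) ** r

mean = sp.simplify(sp.diff(M, t).subs(t, 0))
print(mean)  # r/p
```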

Theorem

The variance of a negative binomial random variable \(X\) is:

\(\sigma^2=Var(X)=\dfrac{r(1-p)}{p^2}\)

Proof

Since we used the m.g.f. to find the mean, let's use it to find the variance as well. That is, let's use:

\(\sigma^2=M''(0)-[M'(0)]^2\)

The only problem is that finding the second derivative of \(M(t)\) is even messier than the first derivative of \(M(t)\). Let me cheat a bit then. Let me leave it to you to verify that the second derivative of the m.g.f. of the negative binomial is:

\(M''(t)=r(pe^t)^r(-r-1)[1-(1-p)e^t]^{-r-2}[-(1-p)e^t]+r^2(pe^t)^{r-1}(pe^t)[1-(1-p)e^t]^{-r-1}\)
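
(One way to do that verification without grinding through the product rule is to hand it to a computer algebra system. A sketch using sympy, assumed available, that spot-checks the claimed expression against the symbolic second derivative at a few points inside the region of convergence:)

```python
# Spot-check the claimed second derivative of the m.g.f. against sympy's.
import random
import sympy as sp

r, p, t = sp.symbols('r p t', positive=True)
M = (p * sp.exp(t)) ** r / (1 - (1 - p) * sp.exp(t)) ** r

claimed = (r * (p * sp.exp(t)) ** r * (-r - 1)
           * (1 - (1 - p) * sp.exp(t)) ** (-r - 2) * (-(1 - p) * sp.exp(t))
           + r ** 2 * (p * sp.exp(t)) ** (r - 1) * (p * sp.exp(t))
           * (1 - (1 - p) * sp.exp(t)) ** (-r - 1))

diff2 = sp.diff(M, t, 2)
# Evaluate the difference at random points with t = 0.05 < -ln(1-p).
random.seed(1)
for _ in range(5):
    vals = {r: random.randint(1, 6), p: random.uniform(0.2, 0.9), t: 0.05}
    print(float((diff2 - claimed).subs(vals)))  # each ~0
```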

Now, with my shortcut taken, let's use it to evaluate the second derivative of the m.g.f. at \(t=0\). Recalling again that \(e^0=1\) and \(1-(1-p)=p\), we get:

\(M''(0)=rp^r(-r-1)p^{-r-2}[-(1-p)]+r^2p^{r-1}\cdot p\cdot p^{-r-1}=\dfrac{r(r+1)(1-p)}{p^2}+\dfrac{r^2}{p}\)

Now, for the final calculation:

\(\sigma^2=M''(0)-[M'(0)]^2=\dfrac{r(r+1)(1-p)}{p^2}+\dfrac{r^2}{p}-\dfrac{r^2}{p^2}\)

Putting everything over the common denominator \(p^2\) and expanding the numerator:

\(\sigma^2=\dfrac{r(r+1)(1-p)+r^2p-r^2}{p^2}=\dfrac{r(1-p)}{p^2}\)

as was to be shown.
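
Finally, both the mean and the variance can be spot-checked by simulation. The sketch below draws negative binomial observations by counting Bernoulli trials until the \(r\)-th success; \(r=3\), \(p=0.4\), and the sample size are arbitrary illustrative choices.

```python
# Monte Carlo sanity check of E(X) = r/p and Var(X) = r(1-p)/p^2.
import random

r, p, n = 3, 0.4, 200_000  # arbitrary illustrative values
random.seed(0)

def draw_neg_binomial(r, p):
    # Count Bernoulli(p) trials until the r-th success occurs.
    trials = successes = 0
    while successes < r:
        trials += 1
        if random.random() < p:
            successes += 1
    return trials

xs = [draw_neg_binomial(r, p) for _ in range(n)]
mean = sum(xs) / n
var = sum((x - mean) ** 2 for x in xs) / (n - 1)
print(mean, r / p)                # sample mean vs. 7.5
print(var, r * (1 - p) / p ** 2)  # sample variance vs. 11.25
```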