8.1 - A Definition

Example 8-1


Toss a fair, six-sided die many times. ***In the long run*** (do you notice that it is bolded and italicized?!), what would the average (or "mean") of the tosses be? That is, if we have the following, for example:


1 3 2 4 5 6 5 4 4 1 2 3
2 1 3 6 4 5 5 4 3 1 6 6
2 5 3 1 2 4 6 2 5 6 3 1

what is the average of the tosses?
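One way to answer is simply to add up all 36 tosses and divide by 36. A quick Python sketch of that arithmetic:

```python
# The 36 tosses listed above, row by row
tosses = [1, 3, 2, 4, 5, 6, 5, 4, 4, 1, 2, 3,
          2, 1, 3, 6, 4, 5, 5, 4, 3, 1, 6, 6,
          2, 5, 3, 1, 2, 4, 6, 2, 5, 6, 3, 1]

# The sample average is the sum of the tosses divided by their count
average = sum(tosses) / len(tosses)
print(average)  # → 3.5  (the tosses sum to 126, and 126/36 = 3.5)
```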



This example lends itself to a couple of notes.

  1. In reality, one-sixth of the tosses will equal \(x\) only in the **long run** (there's that bolding again).
  2. The mean is a weighted average, that is, an average of the values weighted by their respective individual probabilities.
  3. The mean is called the expected value of \(X\), denoted \(E(X)\) or by \(\mu\), the Greek letter mu (read "mew").

Let's give a formal definition.

Mathematical Expectation

If \(f(x)\) is the p.m.f. of the discrete random variable \(X\) with support \(S\), and if the summation:

\(\sum\limits_{x\in S}u(x)f(x)\)

exists and is finite (that is, the summation converges absolutely), then the resulting sum is called the mathematical expectation, or the expected value, of the function \(u(X)\). The expectation is denoted \(E[u(X)]\). That is:

\(E[u(X)]=\sum\limits_{x\in S}u(x)f(x)\)
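The definition translates directly into code: multiply each value of \(u(x)\) by its probability and add. A minimal Python sketch (the helper name `expectation` is ours, not a standard library function; exact fractions avoid floating-point rounding):

```python
from fractions import Fraction

def expectation(support, f, u=lambda x: x):
    """Compute E[u(X)] = sum of u(x) * f(x) over the support S."""
    return sum(u(x) * f(x) for x in support)

# Fair six-sided die: f(x) = 1/6 on support {1, 2, ..., 6}
mean_toss = expectation(range(1, 7), lambda x: Fraction(1, 6))
print(mean_toss)  # → 7/2, i.e. 3.5
```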

Example 8-2

What is the average toss of a fair six-sided die?


If the random variable \(X\) is the top face of a tossed, fair, six-sided die, then the p.m.f. of \(X\) is:
\(f(x)=\dfrac{1}{6}\)


for \(x=1, 2, 3, 4, 5, \text{and } 6\). Therefore, the average toss, that is, the expected value of \(X\), is:
\(E(X)=1\left(\dfrac{1}{6}\right)+2\left(\dfrac{1}{6}\right)+3\left(\dfrac{1}{6}\right)+4\left(\dfrac{1}{6}\right)+5\left(\dfrac{1}{6}\right)+6\left(\dfrac{1}{6}\right)=\dfrac{21}{6}=3.5\)


Hmm... if we toss a fair, six-sided die once, should we expect the toss to be 3.5? No, of course not! All the expected value tells us is what we would expect the average of a large number of tosses to be in the long run. If we toss a fair, six-sided die a thousand times, say, and calculate the average of the tosses, will the average of the 1000 tosses be exactly 3.5? No, probably not! But, we can certainly expect it to be close to 3.5. It is important to keep in mind that the expected value of \(X\) is a theoretical average, not an actual, realized one!
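The long-run behavior is easy to see by simulation. A short sketch (the seed and the number of rolls are arbitrary choices of ours):

```python
import random

random.seed(42)  # fix the seed so the simulation is reproducible
rolls = [random.randint(1, 6) for _ in range(100_000)]

# The average of many rolls should land close to — but almost never
# exactly on — the theoretical expected value of 3.5
print(sum(rolls) / len(rolls))
```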

Example 8-3


Hannah's House of Gambling has a roulette wheel containing 38 numbers: zero (0), double zero (00), and the numbers 1, 2, 3, ..., 36. Let \(X\) denote the number on which the ball lands and \(u(X)\) denote the amount of money paid to the gambler, such that:

\(\begin{array}{lcl} u(X) &=& \$5 \text{ if } X=0\\ u(X) &=& \$10 \text{ if } X=00\\ u(X) &=& \$1 \text{ if } X \text{ is odd}\\ u(X) &=& \$2 \text{ if } X \text{ is even} \end{array}\)

How much would I have to charge each gambler to play in order to ensure that I made some money?


Assuming that the ball has an equally likely chance of landing on each number, the p.m.f. of \(X\) is:
\(f(x)=\dfrac{1}{38}\)


for \(x=0, 00, 1, 2, 3, \ldots, 36\). Therefore, the expected value of \(u(X)\) is:

\(E(u(X))=\$5\left(\dfrac{1}{38}\right)+\$10\left(\dfrac{1}{38}\right)+\left[\$1\left(\dfrac{1}{38}\right)\times 18 \right]+\left[\$2\left(\dfrac{1}{38}\right)\times 18 \right]\approx\$1.82\)

Note that the 18 that is multiplied by the \$1 and \$2 is because there are 18 odd and 18 even numbers on the wheel. Our calculation tells us that, in the long run, Hannah's House of Gambling would expect to have to pay out \$1.82 for each spin of the roulette wheel. Therefore, in order to ensure that the House made money, the House would have to charge at least \$1.82 per play.
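As a cross-check of the arithmetic, here is the same computation in Python, using exact fractions so no rounding creeps in until the very end:

```python
from fractions import Fraction

# Payouts from the example: $5 for 0, $10 for 00,
# $1 for each odd number, $2 for each even number in 1..36
payouts = [5, 10]  # outcomes 0 and 00
payouts += [1 if x % 2 == 1 else 2 for x in range(1, 37)]

p = Fraction(1, 38)  # each of the 38 outcomes is equally likely
expected_payout = sum(amount * p for amount in payouts)
print(float(expected_payout))  # ≈ 1.8158, i.e. roughly $1.82 per spin
```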

Example 8-4

Imagine a game in which, on any play, a player has a 20% chance of winning \$3 and an 80% chance of losing \$1. The probability mass function of the random variable \(X\), the amount won or lost on a single play is:

\(\begin{array}{c|cc} x & \$3 & -\$1 \\ \hline f(x) & 0.2 & 0.8 \end{array}\)

and so the average amount won (actually lost, since it is negative) — in the long run — is:
\(E(X)=\$3(0.2)+(-\$1)(0.8)=\$0.60-\$0.80=-\$0.20\)


What does "in the long run" mean? If you play, are you guaranteed to lose no more than 20 cents?


If you play and lose, you are guaranteed to lose \$1! An expected loss of 20 cents means that if you played the game over and over and over again, the average of your \$3 winnings and \$1 losses would work out to a loss of 20 cents per play. "In the long run" means that you can't draw conclusions from one or two plays, but rather from thousands and thousands of plays.
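A simulation makes "in the long run" concrete. A short sketch (the seed and the number of plays are arbitrary choices of ours):

```python
import random

random.seed(0)  # reproducible simulation
# On each play: win $3 with probability 0.2, lose $1 with probability 0.8
plays = [3 if random.random() < 0.2 else -1 for _ in range(100_000)]

# The per-play average over many plays should be close to -$0.20,
# even though no single play ever loses exactly 20 cents
print(sum(plays) / len(plays))
```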

Example 8-5

What is the expected value of a discrete random variable \(X\) with the following probability mass function:


where \(c\) is a constant and the support is \(x=1, 2, 3, \ldots\)? (For \(f(x)\) to be a valid p.m.f., the probabilities must sum to 1; since \(\sum_{x=1}^\infty \frac{1}{x^2}=\frac{\pi^2}{6}\), such a constant does exist, namely \(c=\frac{6}{\pi^2}\).)


The expected value is calculated as follows:

\(E(X)=\sum\limits_{x=1}^\infty xf(x)=\sum\limits_{x=1}^\infty x\left(\dfrac{c}{x^2}\right)=c\sum\limits_{x=1}^\infty \dfrac{1}{x}\)

The first equal sign arises from the definition of the expected value. The second equal sign just involves replacing the generic p.m.f. notation \(f(x)\) with the given p.m.f. And, the third equal sign is because the constant \(c\) can be pulled through the summation sign, because it does not depend on the value of \(x\).

Now, to finalize our calculation, all we need to do is determine what the summation:

\(\sum\limits_{x=1}^\infty \dfrac{1}{x}\)

equals. Oops! You might recognize this quantity from your calculus studies as the divergent harmonic series, whose sum is infinity. Therefore, as the above definition of expectation suggests, we say in this case that the expected value of \(X\) doesn't exist.
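We can also watch the divergence numerically: the partial sums of the harmonic series never settle down; they keep growing (roughly like \(\ln n\)). A short sketch:

```python
# Partial sums of the harmonic series 1/1 + 1/2 + ... + 1/n grow without
# bound, so the expectation sum has no finite limit
def harmonic_partial_sum(n):
    return sum(1 / x for x in range(1, n + 1))

for n in [10, 1_000, 100_000]:
    print(n, harmonic_partial_sum(n))  # the sums keep increasing with n
```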

This is the first example where the summation is not absolutely convergent. That is, we cannot get a finite answer here. The expectation for a random variable may not always exist. In this course, we will not encounter nonexistent expectations very often. However, when you encounter more sophisticated distributions in your future studies, you may find that the expectation does not exist.