Overview
In the last two lessons, we've concerned ourselves with how two random variables \(X\) and \(Y\) behave jointly. We'll now turn to investigating how one of the random variables, say \(Y\), behaves given that another random variable, say \(X\), has already behaved in a certain way. In the discrete case, for example, we might want to know the probability that \(Y\), the number of car accidents in July on a particular curve in the road, equals 2, given that \(X\), the number of cars caught speeding on that curve in June, is more than 50. Of course, our previous work on conditional probability will help us here. We will now extend the idea of conditional probability, which we learned previously, to that of finding the conditional probability distribution of a random variable \(Y\) given another random variable \(X\).
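In the notation of conditional probability from those earlier lessons, that question is just the probability of one event defined by \(Y\) conditioned on an event defined by \(X\). A minimal sketch of how it would be written, assuming the joint behavior of \(X\) and \(Y\) is available, is

\[
P(Y = 2 \mid X > 50) \;=\; \frac{P(Y = 2,\, X > 50)}{P(X > 50)},
\]

provided \(P(X > 50) > 0\). The lesson that follows replaces the particular events \(\{Y = 2\}\) and \(\{X > 50\}\) with an entire distribution for \(Y\) given a value of \(X\).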
Objectives
- To learn the distinction between a joint probability distribution and a conditional probability distribution.
- To recognize that a conditional probability distribution is simply a probability distribution for a sub-population.
- To learn the formal definition of a conditional probability mass function of a discrete random variable \(Y\) given a discrete random variable \(X\).
- To learn how to calculate the conditional mean and conditional variance of a discrete random variable \(Y\) given a discrete random variable \(X\) (see the sketch after this list for a preview).
- To be able to apply the methods learned in the lesson to new problems.
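As a brief preview of the definitions these objectives refer to, the standard discrete formulas are shown below. The notation here, \(f(x,y)\) for the joint p.m.f., \(f_X(x)\) for the marginal p.m.f. of \(X\), and \(h(y \mid x)\) for the conditional p.m.f., is assumed for this sketch; the lesson develops each definition formally.

\[
h(y \mid x) = P(Y = y \mid X = x) = \frac{f(x, y)}{f_X(x)}, \qquad \text{provided } f_X(x) > 0,
\]

\[
\mu_{Y \mid x} = E(Y \mid x) = \sum_y y\, h(y \mid x), \qquad
\sigma^2_{Y \mid x} = \text{Var}(Y \mid x) = \sum_y \bigl(y - \mu_{Y \mid x}\bigr)^2 h(y \mid x) = E(Y^2 \mid x) - \bigl[E(Y \mid x)\bigr]^2.
\]

The first line is simply the definition of conditional probability applied, for a fixed \(x\), to every possible value \(y\); the second line computes a mean and a variance from that sub-population distribution, exactly as for any other p.m.f.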