On this page, we'll work through a few examples that use the material and methods we learned about in this lesson.
Example 24-4
If \(X_1,X_2,\ldots, X_n\) are a random sample from a population with mean \(\mu\) and variance \(\sigma^2\), then what is:
\(E[(X_i-\mu)(X_j-\mu)]\)
for \(i\ne j\), where \(i, j=1, 2, \ldots, n\)?
Solution
The fact that \(X_1,X_2,\ldots, X_n\) constitute a random sample tells us that (1) \(X_i\) is independent of \(X_j\), for all \(i\ne j\), and (2) the \(X_i\) are identically distributed. Now, we know from our previous work that if \(X_i\) is independent of \(X_j\), for \(i\ne j\), then the covariance between \(X_i\) and \(X_j\) is 0. That is:
\(E[(X_i-\mu)(X_j-\mu)]=Cov(X_i,X_j)=0\)
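If you'd like to convince yourself numerically, here is a minimal simulation sketch (not part of the lesson itself) that estimates \(E[(X_i-\mu)(X_j-\mu)]\) for two distinct observations from a random sample; the normal population with \(\mu=5\) and \(\sigma=2\) is just an assumed choice for illustration, since any population with finite variance would do.

```python
import numpy as np

# Assumed population for illustration only: normal with mu = 5, sigma = 2.
mu, sigma = 5.0, 2.0
rng = np.random.default_rng(0)

reps, n = 100_000, 10          # number of simulated random samples and the sample size
samples = rng.normal(mu, sigma, size=(reps, n))

# Use the first two observations (i = 1, j = 2) from each simulated sample.
products = (samples[:, 0] - mu) * (samples[:, 1] - mu)

# The average of the products estimates E[(X_i - mu)(X_j - mu)]; it should be near 0.
print(products.mean())
```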
Example 24-5
Let \(X_1, X_2, X_3\) be a random sample of size \(n=3\) from a distribution with the geometric probability mass function:
\(f(x)=\left(\dfrac{3}{4}\right) \left(\dfrac{1}{4}\right)^{x-1}\)
for \(x=1, 2, 3, \ldots\). What is \(P(\max X_i\le 2)\)?
Solution
The only way that the maximum of the \(X_i\) will be less than or equal to 2 is if all of the \(X_i\) are less than or equal to 2. That is:
\(P(\max X_i\leq 2)=P(X_1\leq 2,X_2\leq 2,X_3\leq 2)\)
Now, because \(X_1,X_2,X_3\) are a random sample, we know that (1) \(X_i\) is independent of \(X_j\), for all \(i\ne j\), and (2) the \(X_i\) are identically distributed. Therefore:
\(P(\max X_i\leq 2)=P(X_1\leq 2)P(X_2\leq 2)P(X_3\leq 2)=[P(X_1\leq 2)]^3\)
The first equality comes from the independence of the \(X_i\), and the second equality comes from the fact that the \(X_i\) are identically distributed. Now, the probability that \(X_1\) is less than or equal to 2 is:
\(P(X_1\leq 2)=P(X_1=1)+P(X_1=2)=\left(\dfrac{3}{4}\right) \left(\dfrac{1}{4}\right)^{1-1}+\left(\dfrac{3}{4}\right) \left(\dfrac{1}{4}\right)^{2-1}=\dfrac{3}{4}+\dfrac{3}{16}=\dfrac{15}{16}\)
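Equivalently, because the event \(X_1>2\) occurs only when the first two trials are failures, \(P(X_1>2)=\left(\dfrac{1}{4}\right)^2=\dfrac{1}{16}\), and so \(P(X_1\leq 2)=1-\dfrac{1}{16}=\dfrac{15}{16}\), in agreement with the direct calculation above.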
Therefore, the probability that the maximum of the \(X_i\) is less than or equal to 2 is:
\(P(\max X_i\leq 2)=[P(X_1\leq 2)]^3=\left(\dfrac{15}{16}\right)^3=\dfrac{3375}{4096}\approx 0.824\)
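As an optional numerical check (a simulation sketch, not part of the original solution, under the assumption that we sample with NumPy's geometric generator, which shares the support \(x=1, 2, 3, \ldots\) of the pmf above), we can draw many samples of size \(n=3\) with success probability \(p=3/4\) and estimate \(P(\max X_i\leq 2)\) directly:

```python
import numpy as np

# Geometric pmf f(x) = (3/4)(1/4)^(x-1) on x = 1, 2, 3, ...  (success probability p = 3/4).
p = 3 / 4
rng = np.random.default_rng(1)

reps, n = 200_000, 3                         # many simulated samples of size n = 3
samples = rng.geometric(p, size=(reps, n))   # NumPy's geometric is supported on {1, 2, 3, ...}

# Fraction of simulated samples whose maximum is at most 2.
estimate = (samples.max(axis=1) <= 2).mean()
print(estimate, (15 / 16) ** 3)
```

With a large number of replications, the estimated probability should land close to the exact value \(3375/4096\approx 0.824\).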