# 2.1 - Normal and Chi-Square Approximations

## Central Limit Theorem

Recall the bell-shaped (standard) normal distribution with mean 0 and variance 1.

Although the normal distribution is continuous, it can still be a useful approximation for many discrete random variables formed from sums and means.

##### Central Limit Theorem

The Central Limit Theorem (CLT) states that if $$X_1,\ldots,X_n$$ are a random sample from a distribution with mean $$E(X_i)=\mu$$ and variance $$V(X_i)=\sigma^2$$, then the distribution of

$$\dfrac{\overline{X}-\mu}{\sigma/\sqrt{n}}$$

converges to the standard normal distribution as the sample size $$n$$ increases. In other words, if $$n$$ is reasonably large, then $$\overline{X}$$ is approximately normally distributed, regardless of the distribution from which the individual $$X_i$$ values came. Let's see how this works for binomial data.
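The convergence can be checked by simulation. Below is a minimal sketch (in Python rather than the R used later in these notes, purely for illustration; the exponential distribution and all constants are arbitrary choices): standardized sample means from a skewed distribution should behave like standard normal draws.

```python
import random
import statistics

random.seed(0)
n, reps = 200, 5000
mu, sigma = 1.0, 1.0  # the exponential(rate = 1) distribution has mean 1 and sd 1

# Standardize each sample mean exactly as in the CLT statement
z = []
for _ in range(reps):
    xbar = sum(random.expovariate(1.0) for _ in range(n)) / n
    z.append((xbar - mu) / (sigma / n**0.5))

# The standardized means should look approximately standard normal:
# mean near 0, standard deviation near 1, roughly 95% within +/- 1.96
print(round(statistics.mean(z), 2), round(statistics.stdev(z), 2))
print(sum(abs(v) < 1.96 for v in z) / reps)
```

Even though the underlying exponential distribution is strongly skewed, the standardized means of samples of size 200 are already very close to $$N(0,1)$$.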

Suppose $$X_1,\ldots,X_n$$ are a random sample from a Bernoulli distribution with $$Pr(X_i=1)=1-Pr(X_i=0)=\pi$$ so that $$E(X_i)=\pi$$ and $$V(X_i)=\pi(1-\pi)$$. By the CLT,

$$\dfrac{\overline{X}-\pi}{\sqrt{\pi(1-\pi)/n}}$$

has an approximate standard normal distribution if $$n$$ is large. "Large" in this context usually means that the expected counts of 1s and 0s (successes and failures), $$n\pi$$ and $$n(1-\pi)$$, should each be at least 5, although some authors suggest at least 10 to be more conservative. Note that since the $$X_i$$ take on the values 1 and 0, the sample mean $$\overline{X}$$ is just the sample proportion of 1s (successes).
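As a concrete numerical check (all values here, $$n=50$$, $$\pi=0.3$$, and 19 observed successes, are hypothetical, chosen only for illustration; Python is used instead of R), the standardized sample proportion can be computed directly:

```python
import math

n, pi = 50, 0.3          # hypothetical sample size and success probability
y = 19                   # hypothetical number of observed successes (1s)

# Rule of thumb: expected successes and failures should both be at least 5
assert n * pi >= 5 and n * (1 - pi) >= 5

xbar = y / n             # sample mean = sample proportion of successes
z = (xbar - pi) / math.sqrt(pi * (1 - pi) / n)
print(round(z, 3))       # -> 1.234
```

Since $$|z|$$ is below the usual 5% critical value of 1.96, this hypothetical sample would be consistent with $$\pi=0.3$$.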

## Normal to Chi-Square

The chi-square distribution with $$\nu$$ degrees of freedom can be defined as the sum of the squares of $$\nu$$ independent standard normal random variables. In particular, if $$Z$$ is standard normal, then $$Z^2$$ is chi-square with one degree of freedom. For the approximation above, letting $$Y=\sum_i X_i$$ denote the number of successes, so that $$\overline{X}=Y/n$$, we have that

$$\left(\dfrac{\overline{X}-\pi}{\sqrt{\pi(1-\pi)/n}}\right)^2 = \left(\dfrac{Y-n\pi}{\sqrt{n\pi(1-\pi)}}\right)^2$$

is approximately chi-square with one degree of freedom, provided $$n$$ is large. The advantage of working with a chi-square distribution is that it allows us to generalize readily to multinomial data when more than two outcomes are possible.
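The equality of the two squared forms follows from $$\overline{X}=Y/n$$ (multiply the numerator and denominator inside the first square by $$n$$), and it is easy to verify numerically. A short check, reusing the same hypothetical values $$n=50$$, $$\pi=0.3$$, $$y=19$$ as above:

```python
import math

n, pi, y = 50, 0.3, 19   # hypothetical values, for illustration only
xbar = y / n

# Squared z-statistic in the proportion (mean) form ...
lhs = (xbar - pi) ** 2 / (pi * (1 - pi) / n)
# ... and in the count form; both are approximately chi-square(1)
rhs = (y - n * pi) ** 2 / (n * pi * (1 - pi))

assert math.isclose(lhs, rhs)
print(round(lhs, 3))     # -> 1.524
```

For reference, the 5% critical value of the chi-square distribution with one degree of freedom is $$1.96^2 \approx 3.84$$.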

## Example: Smartphone Data

Recall our earlier binomial application to the data on 20 smartphone users. If we assume that the population proportion of Android users is $$\pi=.4$$, then we can plot the exact binomial distribution corresponding to this situation---it is very close to the normal bell curve!

```r
# Exact Binomial(20, 0.4) probabilities for counts 0, ..., 20
barplot(dbinom(0:20, 20, .4), names.arg = 0:20, main = "Binomial(20, .4)")
```
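We can also quantify how good the approximation is by comparing an exact binomial probability with its normal counterpart. A sketch in Python (Python rather than R purely for illustration; the event $$P(Y \le 8)$$ is an arbitrary choice), using a continuity correction of 0.5:

```python
import math

n, pi = 20, 0.4
mu, sd = n * pi, math.sqrt(n * pi * (1 - pi))   # mean 8, sd about 2.19

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

# Exact binomial probability P(Y <= 8)
exact = sum(math.comb(n, k) * pi**k * (1 - pi)**(n - k) for k in range(9))

# Normal approximation with continuity correction: P(Y <= 8.5) under N(mu, sd^2)
approx = phi((8.5 - mu) / sd)

print(round(exact, 4), round(approx, 4))
```

Even with $$n$$ as small as 20, the two probabilities agree to about two decimal places, matching the visual closeness of the bar plot to a bell curve.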