# 1.7.7 - Relationship between the Multinomial and Poisson

Suppose that $X_1, X_2, \ldots, X_k$ are independent Poisson random variables,

$X_1\sim P(\lambda_1),\quad X_2\sim P(\lambda_2),\quad \ldots,\quad X_k\sim P(\lambda_k),$

where the λj's are not necessarily equal. Then the conditional distribution of the vector

$X=(X_1,X_2,\ldots,X_k)$

given the total

$n=X_1+X_2+\cdots+X_k$

is Mult(n, π), where

$\pi=(\pi_1,\pi_2,\ldots,\pi_k)$

and

$\pi_j=\dfrac{\lambda_j}{\lambda_1+\lambda_2+\cdots+\lambda_k}$

That is, π is simply the vector of λj's normalized to sum to one.
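This identity can be checked numerically. The sketch below uses illustrative rates $\lambda = (2, 3, 5)$ (not values from the text) and one arbitrary outcome; it computes the conditional probability directly from the joint Poisson pmf and compares it with the multinomial pmf at the same point:

```python
from math import exp, factorial, prod, isclose

def poisson_pmf(x, lam):
    """P(X = x) for X ~ Poisson(lam): e^{-lam} * lam^x / x!"""
    return exp(-lam) * lam**x / factorial(x)

lam = [2.0, 3.0, 5.0]             # illustrative rates λ1, λ2, λ3
pi = [l / sum(lam) for l in lam]  # normalized probabilities π_j

x = [1, 2, 4]                     # one particular outcome
n = sum(x)                        # its total, n = 7

# Conditional probability P(X = x | X1 + ... + Xk = n):
# joint Poisson pmf divided by the pmf of the Poisson total
joint = prod(poisson_pmf(xi, li) for xi, li in zip(x, lam))
cond = joint / poisson_pmf(n, sum(lam))

# Multinomial pmf at the same point: n!/(x1!...xk!) * prod(pi_j^{x_j})
mult = factorial(n) * prod(p**xi / factorial(xi) for p, xi in zip(pi, x))

print(cond, mult)  # the two probabilities agree
```

The agreement holds for any outcome vector and any positive rates, since the $e^{-\lambda_j}$ terms and the powers of the total rate cancel exactly in the ratio.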

This fact is important, because it implies that the unconditional distribution of (X1, . . . , Xk) can be factored into the product of two distributions: a Poisson distribution for the overall total,

$n\sim P(\lambda_1+\lambda_2+\cdots+\lambda_k)$

and a multinomial distribution for X = (X1, X2, . . . , Xk) given n,

$X\sim \text{Mult}(n,\pi)$

The likelihood factors into two independent pieces, one involving only $\sum\limits_{j=1}^k \lambda_j$ and the other involving only π. The total n carries no information about π, and vice versa.

Therefore, likelihood-based inferences about π are the same whether we regard X1, . . ., Xk as sampled from k independent Poissons or from a single multinomial. That is, any estimates, tests, etc. for π or functions of π will be the same whether we regard n as random or fixed.
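This equivalence is easy to see for the maximum-likelihood estimates. A small sketch with hypothetical counts (the numbers below are illustrative): under the Poisson model each rate's MLE is $\hat{\lambda}_j = x_j$, and normalizing those gives exactly the multinomial MLE $\hat{\pi}_j = x_j/n$:

```python
x = [12, 7, 21]  # hypothetical observed counts x_1, ..., x_k
n = sum(x)

# Poisson model: MLE of each rate is λ̂_j = x_j,
# so the estimated proportions are λ̂_j / Σ λ̂
lam_hat = x
pi_poisson = [l / sum(lam_hat) for l in lam_hat]

# Multinomial model (n treated as fixed): MLE is π̂_j = x_j / n
pi_mult = [xi / n for xi in x]

print(pi_poisson == pi_mult)  # True: identical estimates either way
```

Since $\sum_j \hat{\lambda}_j = n$, the two computations are the same arithmetic, which is the factorization argument in miniature.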

#### Example

Suppose that you wait at a busy intersection for one hour and record the color of each vehicle as it drives by. Let

X1 = number of white vehicles
X2 = number of black vehicles
X3 = number of silver vehicles
X4 = number of red vehicles
X5 = number of blue vehicles
X6 = number of green vehicles
X7 = number of any other color

In this experiment, the total number of vehicles observed,

$n=X_1+X_2+\cdots+X_7$

is random. (It would have been fixed if, for example, we had decided to classify the first n = 500 vehicles we see. But because we decided to wait for one hour, n is random.)

In this case, it's reasonable to regard the Xj's as independent Poisson random variables with means λ1, λ2, . . . , λ7. But if our interest lies not in the λj's but in the proportions of various colors in the vehicle population, inferences about these proportions will be the same whether we regard the sample size n as random or fixed. That is, we can proceed as if

$X=(X_1,\ldots,X_7)\sim \text{Mult}(n,\pi)$

where π = (π1, . . . , π7), even though n is actually random.
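The two sampling schemes can be compared by simulation. The sketch below uses assumed hourly rates for the seven color classes (illustrative values, not from the text): drawing seven independent Poisson counts and drawing a Poisson total n followed by a multinomial split produce the same average counts per class:

```python
import math
import random

random.seed(0)

lam = [5, 4, 3, 2, 2, 1, 1]       # assumed hourly rates, one per color class
pi = [l / sum(lam) for l in lam]  # implied color proportions π_j
reps = 5000

def poisson_draw(mean):
    """Draw one Poisson variate (Knuth's multiplication method)."""
    threshold, k, p = math.exp(-mean), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

# Scheme 1: seven independent Poisson counts each hour
means1 = [sum(poisson_draw(l) for _ in range(reps)) / reps for l in lam]

# Scheme 2: draw a Poisson total n, then assign each of the n
# vehicles a color class with probabilities π
totals = [0] * len(lam)
for _ in range(reps):
    n = poisson_draw(sum(lam))
    for j in random.choices(range(len(lam)), weights=pi, k=n):
        totals[j] += 1
means2 = [t / reps for t in totals]

print(means1)  # both sets of means approximate λ = (5, 4, 3, 2, 2, 1, 1)
print(means2)
```

Either simulation yields per-class means near the λ's, illustrating why inferences about π can proceed as if n were fixed.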