7.1.3 - Hotelling’s T-Square

A preferable test statistic is Hotelling’s \(T^2\), and we will focus on this test.

To motivate Hotelling's \(T^2\), consider the square of the t-statistic for testing a hypothesis regarding a univariate mean. Recall that under the null hypothesis, t has a t-distribution with n-1 degrees of freedom. Now consider squaring this test statistic as shown below:

\[t^2 = \frac{(\bar{x}-\mu_0)^2}{s^2/n} = n(\bar{x}-\mu_0)\left(\frac{1}{s^2}\right)(\bar{x}-\mu_0) \sim F_{1, n-1}\]

When you square a t-distributed random variable with n-1 degrees of freedom, the result is an F-distributed random variable with 1 and n-1 degrees of freedom. We reject \(H_{0}\) at level \(\alpha\) if \(t^2\) is greater than the critical value from the F-table with 1 and n-1 degrees of freedom, evaluated at level \(\alpha\).

\(t^2 > F_{1, n-1,\alpha}\)
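Below is a minimal sketch of this univariate version of the test, assuming NumPy and SciPy are available; the data and the hypothesized mean are simulated purely for illustration.

```python
# Squared one-sample t statistic compared with an F(1, n-1) critical value.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.normal(loc=5.2, scale=1.0, size=20)      # hypothetical sample
mu0 = 5.0                                        # hypothesized mean
n = x.size

t2 = n * (x.mean() - mu0) ** 2 / x.var(ddof=1)   # squared t statistic
alpha = 0.05
crit = stats.f.ppf(1 - alpha, dfn=1, dfd=n - 1)  # F critical value

print(f"t^2 = {t2:.3f}, F critical value = {crit:.3f}, reject H0: {t2 > crit}")
```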

Hotelling's T-Square

Consider the rightmost expression for \(t^2\) above. In the expression for Hotelling's \(T^2\), the difference between the sample mean and \(\mu_{0}\) is replaced with the difference between the sample mean vector and the hypothesized mean vector \(\boldsymbol{\mu _{0}}\). The inverse of the sample variance is replaced by the inverse of the sample variance-covariance matrix S, yielding the expression below:

\(T^2 = n\mathbf{(\overline{X}-\mu_0)'S^{-1}(\overline{X}-\mu_0)}\)
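The following sketch computes \(T^2\) from an n x p data matrix, assuming NumPy is available; the function name and arguments are illustrative, not part of any particular library.

```python
# Hotelling's T^2 for an n x p data matrix X and hypothesized mean vector mu0.
import numpy as np

def hotelling_t2(X, mu0):
    n, p = X.shape
    xbar = X.mean(axis=0)            # sample mean vector
    S = np.cov(X, rowvar=False)      # sample variance-covariance matrix
    diff = xbar - mu0
    # T^2 = n (xbar - mu0)' S^{-1} (xbar - mu0)
    return n * diff @ np.linalg.solve(S, diff)
```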

Notes on \(\mathbf{T^2}\)

For large n, \(T^2\) is approximately chi-square distributed with p degrees of freedom.

If we replace the sample variance-covariance matrix, S, with the population variance-covariance matrix, \(\Sigma\), to obtain

\(n\mathbf{(\overline{X}-\mu_0)'\Sigma^{-1}(\overline{X}-\mu_0)},\)

then the resulting test statistic is exactly chi-square distributed with p degrees of freedom when the data are normally distributed.

For small samples, the chi-square approximation for \(T^2\) does not take into account variation due to estimating \(\Sigma\) with the sample variance-covariance matrix S.
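A small simulation sketch (NumPy/SciPy assumed, with illustrative values n = 10 and p = 3) shows the consequence: if we use the chi-square critical value with a small sample, the test rejects a true null hypothesis noticeably more often than the nominal level.

```python
# Simulate T^2 under H0 for a small sample and apply the chi-square cutoff.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, p, alpha, reps = 10, 3, 0.05, 20000
chi2_crit = stats.chi2.ppf(1 - alpha, df=p)

rejections = 0
for _ in range(reps):
    X = rng.standard_normal((n, p))            # data generated with true mean 0
    xbar = X.mean(axis=0)
    S = np.cov(X, rowvar=False)
    t2 = n * xbar @ np.linalg.solve(S, xbar)   # T^2 for H0: mu = 0
    rejections += t2 > chi2_crit

# The empirical type I error rate is noticeably above the nominal 0.05.
print(f"empirical rejection rate: {rejections / reps:.3f}")
```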

Better results can be obtained by transforming the Hotelling \(T^2\) statistic as shown below:

\[F = \frac{n-p}{p(n-1)}T^2 \sim F_{p,n-p}\]

Under the null hypothesis, \(H_{0}\colon \boldsymbol{\mu} = \boldsymbol{\mu_{0}}\), this will have an F distribution with p and n-p degrees of freedom. We reject the null hypothesis, \(H_{0}\), at level \(\alpha\) if the test statistic F is greater than the critical value from the F-table with p and n-p degrees of freedom, evaluated at level \(\alpha\).

\(F > F_{p, n-p, \alpha}\)
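The sketch below puts the pieces together, assuming NumPy and SciPy; it computes \(T^2\), converts it to the F statistic defined above, and compares with the F critical value. The data are simulated purely for illustration.

```python
# Full test: T^2, the F transformation, the critical value, and a p-value.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n, p, alpha = 30, 4, 0.05
X = rng.normal(loc=1.0, scale=2.0, size=(n, p))   # hypothetical sample
mu0 = np.zeros(p)                                 # hypothesized mean vector

xbar = X.mean(axis=0)
S = np.cov(X, rowvar=False)
diff = xbar - mu0
t2 = n * diff @ np.linalg.solve(S, diff)          # Hotelling's T^2

f_stat = (n - p) / (p * (n - 1)) * t2             # F transformation
f_crit = stats.f.ppf(1 - alpha, dfn=p, dfd=n - p)
p_value = stats.f.sf(f_stat, dfn=p, dfd=n - p)

print(f"T^2 = {t2:.3f}, F = {f_stat:.3f}, "
      f"F critical value = {f_crit:.3f}, p-value = {p_value:.4f}")
```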

To illustrate Hotelling's \(T^2\) test, we will return to the USDA Women’s Health Survey data.

