13.2 - Two Factor Factorial with Random Factors
Imagine that we have two factors, say A and B, each with a large number of levels that are of interest. We choose a random levels of factor A and b random levels of factor B, and make n observations at each treatment combination. The corresponding linear model for this case and the respective variance components are
\(y_{ijk}=\mu+\tau_i+\beta_j+(\tau\beta)_{ij}+\varepsilon_{ijk}
\left\{\begin{array}{c}
i=1,2,\ldots,a \\
j=1,2,\ldots,b \\
k=1,2,\ldots,n
\end{array}\right. \)
\(V(\tau_i)=\sigma^2_\tau,\ V(\beta_j)=\sigma^2_\beta,\ V((\tau\beta)_{ij})=\sigma^2_{\tau\beta},\ V(\varepsilon_{ijk})=\sigma^2\)
\(V(y_{ijk})=\sigma^2_\tau+\sigma^2_\beta+\sigma^2_{\tau\beta}+\sigma^2\)
where \(\tau_i\), \(\beta_j\), \((\tau\beta)_{ij}\), and \(\varepsilon_{ijk}\) are all NID random variables with mean zero and the variances shown above. The relevant hypotheses that we are interested in testing are:
\(H_0 \colon \sigma^2_{\tau}=0\quad H_0 \colon \sigma^2_{\beta}=0\quad H_0 \colon \sigma^2_{\tau\beta}=0\)
\(H_1 \colon \sigma^2_{\tau}>0\quad H_1 \colon \sigma^2_{\beta}>0\quad H_1 \colon \sigma^2_{\tau\beta}>0\)
The numerical calculations for the analysis of variance are exactly as in the fixed-effects case. However, to form the test statistics, the expected mean squares must be taken into account. We state the expected mean squares (EMS) here and form each F statistic so that, under the null hypothesis, its numerator and denominator have the same expectation. Note that the tests for the main effects are no longer the same as in the fixed-factor situation.
\(E(MS_A)=\sigma^2+n\sigma^2_{\tau\beta}+bn\sigma^2_\tau \Longrightarrow F_0=\frac{MS_A}{MS_{AB}}\)
\(E(MS_B)=\sigma^2+n\sigma^2_{\tau\beta}+an\sigma^2_\beta \Longrightarrow F_0=\frac{MS_B}{MS_{AB}}\)
\(E(MS_{AB})=\sigma^2+n\sigma^2_{\tau\beta}\qquad\quad \Longrightarrow F_0=\frac{MS_{AB}}{MS_{E}}\)
\(E(MS_E)=\sigma^2\)
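As a minimal sketch, the F ratios above can be formed directly from the mean squares reported later in Table 13.4 for the gauge R&R example (a = 20 parts, b = 3 operators, n = 2 measurements); no new data is assumed:

```python
# Mean squares computed from the SS and DF in Table 13.4.
ms_a  = 1185.425 / 19   # MS_A  (part)
ms_b  = 2.617 / 2       # MS_B  (operator)
ms_ab = 27.050 / 38     # MS_AB (part*operator)
ms_e  = 59.500 / 60     # MS_E  (error)

# Both main effects are tested against MS_AB, because E(MS_A) and E(MS_B)
# both contain sigma^2 + n*sigma^2_{tau beta}.
f_a  = ms_a / ms_ab    # tests H0: sigma^2_tau = 0
f_b  = ms_b / ms_ab    # tests H0: sigma^2_beta = 0
f_ab = ms_ab / ms_e    # tests H0: sigma^2_{tau beta} = 0

print(round(f_a, 2), round(f_b, 2), round(f_ab, 2))  # 87.65 1.84 0.72
```

Only the interaction is tested against the error mean square, since \(E(MS_{AB})=\sigma^2+n\sigma^2_{\tau\beta}\) reduces to \(\sigma^2\) under its null hypothesis.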
Furthermore, variance components can again be estimated using the analysis of variance method by equating the expected mean squares to their observed values.
\({\hat{\sigma}}^2_{\tau}=\frac{MS_A-MS_{AB}}{bn}\)
\({\hat{\sigma}}^2_{\beta}=\frac{MS_B-MS_{AB}}{an}\)
\({\hat{\sigma}}^2_{\tau\beta}=\frac{MS_{AB}-MS_E}{n}\)
\({\hat{\sigma}}^2=MS_E\)
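The ANOVA-method estimators above can be checked numerically against the mean squares reported in Table 13.4 (a = 20 parts, b = 3 operators, n = 2 measurements); a short sketch using only numbers from the table:

```python
a, b, n = 20, 3, 2  # levels of A (part), B (operator), replicates

# Mean squares from Table 13.4 (SS / DF).
ms_a  = 1185.425 / 19   # MS_A  (part)
ms_b  = 2.617 / 2       # MS_B  (operator)
ms_ab = 27.050 / 38     # MS_AB (part*operator)
ms_e  = 59.500 / 60     # MS_E  (error)

# Equate observed mean squares to their expectations and solve.
var_tau      = (ms_a - ms_ab) / (b * n)   # sigma^2_tau       ~ 10.2798
var_beta     = (ms_b - ms_ab) / (a * n)   # sigma^2_beta      ~  0.0149
var_tau_beta = (ms_ab - ms_e) / n         # sigma^2_{tau beta}~ -0.1399 (negative!)
var_error    = ms_e                       # sigma^2           ~  0.9917
```

These match the variance components Minitab reports in the lower part of Table 13.4, including the negative interaction estimate.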
Example 13.2 in the textbook discusses a two-factor factorial with random effects in a measurement system capability study. Such studies are often called gauge capability studies or gauge repeatability and reproducibility (R&R) studies. In this example, three randomly selected operators each measure twenty randomly selected parts, each part twice. The data obtained from the experiment are shown in Table 13.3. The variance components are
\(\sigma^2_y=\sigma^2_\tau+\sigma^2_\beta+\sigma^2_{\tau\beta}+\sigma^2\)
Typically, \(\sigma^2\) is called gauge repeatability because it reflects the variation in repeated measurements of the same part by the same operator, while \(\sigma^{2}_{\beta} + \sigma^{2}_{\tau \beta}\), which reflects the variation attributable to operators, is called gauge reproducibility. Table 13.4 shows the analysis using Minitab's Balanced ANOVA command.
Table 13-4 Analysis of variance (Minitab Balanced ANOVA) for Example 13-2
Analysis of Variance (Balanced Design)

Factor | Type | Levels | Values
---|---|---|---
part | random | 20 | 1, 2, ..., 20
operator | random | 3 | 1, 2, 3
Analysis of Variance for y

Source | DF | SS | MS | F | P
---|---|---|---|---|---
part | 19 | 1185.425 | 62.391 | 87.65 | 0.000
operator | 2 | 2.617 | 1.308 | 1.84 | 0.173
part*operator | 38 | 27.050 | 0.712 | 0.72 | 0.861
Error | 60 | 59.500 | 0.992 | |
Total | 119 | 1274.592 | | |
Source | Variance Component | Error Term | Expected Mean Square for Each Term (using unrestricted model)
---|---|---|---
1 part | 10.2798 | 3 | (4) + 2(3) + 6(1)
2 operator | 0.0149 | 3 | (4) + 2(3) + 40(2)
3 part*operator | -0.1399 | 4 | (4) + 2(3)
4 Error | 0.9917 | | (4)

Table 13.4 (Design and Analysis of Experiments, Douglas C. Montgomery, 7th Edition)
As can be seen, the only significant effect is part. Estimates of the variance components and the expected mean square for each term are given in the lower part of the table. Notice that the estimated variance for the interaction term part*operator is negative. The large p-value for the interaction term, together with the negative estimate of its variance, is a good indication that the interaction term is actually zero. Therefore, we can proceed to fit a reduced model without the part*operator term. The analysis of variance for the reduced model is given in Table 13.5.
Table 13-5 Analysis of Variance for the Reduced Model, Example 13-2
Analysis of Variance (Balanced Design)

Factor | Type | Levels | Values
---|---|---|---
part | random | 20 | 1, 2, ..., 20
operator | random | 3 | 1, 2, 3
Analysis of Variance for y

Source | DF | SS | MS | F | P
---|---|---|---|---|---
part | 19 | 1185.425 | 62.391 | 70.64 | 0.000
operator | 2 | 2.617 | 1.308 | 1.48 | 0.232
Error | 98 | 86.550 | 0.883 | |
Total | 119 | 1274.592 | | |
Source | Variance Component | Error Term | Expected Mean Square for Each Term (using unrestricted model)
---|---|---|---
1 part | 10.2513 | 3 | (3) + 6(1)
2 operator | 0.0106 | 3 | (3) + 40(2)
3 Error | 0.8832 | | (3)

Table 13.5 (Design and Analysis of Experiments, Douglas C. Montgomery, 7th Edition)
Since the interaction term is zero, both main effects are tested against the error term. Estimates of the variance components are given in the lower part of the table. Furthermore, as mentioned before, an estimate of the gauge variance is obtained as
\(\hat{\sigma}^2_{gauge}=\hat{\sigma}^2+\hat{\sigma}^2_{\beta}=0.88+0.01=0.89\)
which is relatively small compared to the variability of the product.
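The reduced-model numbers in Table 13.5 can be reproduced from Table 13.4 by pooling the part*operator interaction into the error term; a short sketch using only the sums of squares reported above:

```python
# Interaction and error terms from the full model (Table 13.4).
ss_ab, df_ab = 27.050, 38   # part*operator
ss_e,  df_e  = 59.500, 60   # error

# Pool the interaction SS and DF into the error term.
ss_e_red = ss_ab + ss_e          # 86.550
df_e_red = df_ab + df_e          # 98
ms_e_red = ss_e_red / df_e_red   # pooled error MS, ~0.8832

# In the reduced model both main effects are tested against the pooled error.
f_part = (1185.425 / 19) / ms_e_red   # ~70.64
f_oper = (2.617 / 2) / ms_e_red       # ~1.48

# Gauge variance = repeatability + reproducibility (no interaction term).
var_beta  = ((2.617 / 2) - ms_e_red) / 40   # sigma^2_beta, ~0.0106
var_gauge = ms_e_red + var_beta             # ~0.89
```

This matches the pooled error line, the F ratios, and the gauge-variance estimate given above.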