# Chapter 6. Supplemental Text Material


## S6-1. Factor Effect Estimates Are Least Squares Estimates

In the textbook we give heuristic or intuitive explanations of how the estimates of the factor effects are obtained. We also point out that in the regression-model representation of the $2^k$ factorial, the regression coefficients are exactly one-half the effect estimates. It is straightforward to show that the model coefficients (and hence the effect estimates) are least squares estimates. Consider a $2^2$ factorial. The regression model is

$$y_i = \beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2} + \beta_{12} x_{i1} x_{i2} + \varepsilon_i$$

The data for the $2^2$ experiment are shown in the following table:

| Run, $i$ | $x_{i1}$ | $x_{i2}$ | $x_{i1}x_{i2}$ | Response total |
| --- | --- | --- | --- | --- |
| 1 | $-1$ | $-1$ | $+1$ | $(1)$ |
| 2 | $+1$ | $-1$ | $-1$ | $a$ |
| 3 | $-1$ | $+1$ | $-1$ | $b$ |
| 4 | $+1$ | $+1$ | $+1$ | $ab$ |

The least squares estimates of the model parameters are chosen to minimize the sum of the squares of the model errors:

$$L = \sum_{i=1}^{4}\bigl(y_i - \beta_0 - \beta_1 x_{i1} - \beta_2 x_{i2} - \beta_{12} x_{i1} x_{i2}\bigr)^2$$

It is straightforward to show that the least squares normal equations are

$$
\begin{aligned}
4\hat\beta_0 + \hat\beta_1\sum_{i=1}^{4} x_{i1} + \hat\beta_2\sum_{i=1}^{4} x_{i2} + \hat\beta_{12}\sum_{i=1}^{4} x_{i1}x_{i2} &= (1) + a + b + ab \\
\hat\beta_0\sum_{i=1}^{4} x_{i1} + \hat\beta_1\sum_{i=1}^{4} x_{i1}^2 + \hat\beta_2\sum_{i=1}^{4} x_{i1}x_{i2} + \hat\beta_{12}\sum_{i=1}^{4} x_{i1}^2 x_{i2} &= -(1) + a - b + ab \\
\hat\beta_0\sum_{i=1}^{4} x_{i2} + \hat\beta_1\sum_{i=1}^{4} x_{i1}x_{i2} + \hat\beta_2\sum_{i=1}^{4} x_{i2}^2 + \hat\beta_{12}\sum_{i=1}^{4} x_{i1}x_{i2}^2 &= -(1) - a + b + ab \\
\hat\beta_0\sum_{i=1}^{4} x_{i1}x_{i2} + \hat\beta_1\sum_{i=1}^{4} x_{i1}^2 x_{i2} + \hat\beta_2\sum_{i=1}^{4} x_{i1}x_{i2}^2 + \hat\beta_{12}\sum_{i=1}^{4} x_{i1}^2 x_{i2}^2 &= (1) - a - b + ab
\end{aligned}
$$

Now, because the design is orthogonal,

$$\sum_{i=1}^{4} x_{i1} = \sum_{i=1}^{4} x_{i2} = \sum_{i=1}^{4} x_{i1}x_{i2} = \sum_{i=1}^{4} x_{i1}^2 x_{i2} = \sum_{i=1}^{4} x_{i1} x_{i2}^2 = 0$$

while each squared column sums to 4, so all of the off-diagonal terms in the normal equations vanish.
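The orthogonality argument can be checked numerically. The sketch below (plain Python; the response values are hypothetical, chosen only for illustration) builds the $\pm 1$ model matrix for the $2^2$ factorial, confirms that $X'X = 4I$, and solves the normal equations as $\hat\beta = X'y/4$, recovering the regression coefficients as one-half the usual contrast-based effect estimates:

```python
# Model matrix for the 2^2 factorial in standard order (1), a, b, ab.
# Columns: intercept, x1, x2, x1*x2.
X = [
    [1, -1, -1,  1],  # run 1: treatment combination (1)
    [1,  1, -1, -1],  # run 2: a
    [1, -1,  1, -1],  # run 3: b
    [1,  1,  1,  1],  # run 4: ab
]

# X'X is diagonal because every pair of distinct columns is orthogonal.
XtX = [[sum(X[i][p] * X[i][q] for i in range(4)) for q in range(4)]
       for p in range(4)]
assert XtX == [[4 if p == q else 0 for q in range(4)] for p in range(4)]

# Hypothetical response totals y = [(1), a, b, ab] (not from the text).
y = [-4, 1, -1, 5]

# With X'X = 4I, the normal equations solve to beta_hat = X'y / 4.
beta_hat = [sum(X[i][j] * y[i] for i in range(4)) / 4 for j in range(4)]

# Contrast-based effect estimates, e.g. A = [a + ab - b - (1)] / 2.
A  = (y[1] + y[3] - y[2] - y[0]) / 2
B  = (y[2] + y[3] - y[1] - y[0]) / 2
AB = (y[0] + y[3] - y[1] - y[2]) / 2

# Each regression coefficient is exactly half the corresponding effect.
assert [2 * b for b in beta_hat[1:]] == [A, B, AB]
```

The same diagonal structure holds for any full $2^k$ design with $\pm 1$ coding, which is why the closed-form solution below drops out with no matrix inversion.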

The normal equations therefore reduce to the very simple form

$$
\begin{aligned}
4\hat\beta_0 &= (1) + a + b + ab \\
4\hat\beta_1 &= -(1) + a - b + ab \\
4\hat\beta_2 &= -(1) - a + b + ab \\
4\hat\beta_{12} &= (1) - a - b + ab
\end{aligned}
$$

The solution is

$$
\begin{aligned}
\hat\beta_0 &= \tfrac{1}{4}\left[(1) + a + b + ab\right] \\
\hat\beta_1 &= \tfrac{1}{4}\left[-(1) + a - b + ab\right] \\
\hat\beta_2 &= \tfrac{1}{4}\left[-(1) - a + b + ab\right] \\
\hat\beta_{12} &= \tfrac{1}{4}\left[(1) - a - b + ab\right]
\end{aligned}
$$

These regression model coefficients are exactly one-half the factor effect estimates. Therefore, the effect estimates are least squares estimates. We will show this in a more general manner in Chapter 10.

## S6-2. Yates's Method for Calculating Effect Estimates

While we typically use a computer program for the statistical analysis of a $2^k$ design, there is a very simple technique devised by Yates (1937) for estimating the effects and determining the sums of squares in a $2^k$ factorial design. The procedure is occasionally useful for manual calculations, and is best learned through the study of a numerical example.

Consider the data for the $2^3$ design in Example 6-1. These data have been entered in Table 1 below. The treatment combinations are always written down in standard order, and the column labeled "Response" contains the corresponding observation (or total of all observations) at that treatment combination. The first half of column (1) is obtained by adding the responses in adjacent pairs. The second half of column (1) is obtained by changing the sign of the first entry in each pair in the Response column and adding the adjacent pairs. For example, the fifth entry in column (1) is $5 = -(-4) + 1$, the sixth entry is $6 = -(-1) + 5$, and so on.
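The sums-and-differences procedure just described is easy to mechanize. Below is a minimal sketch (the function name `yates` is mine, not from the text): one pass per factor, where the first half of each new column holds pairwise sums and the second half holds pairwise differences (sign of the first entry changed, then added). As a self-contained check it is run on a $2^2$ design built from the four response values that appear in the worked column-(1) entries above; treating them as a complete $2^2$ experiment is purely for illustration.

```python
def yates(responses):
    """Yates's algorithm for a full 2^k factorial.

    `responses` are the treatment totals in standard order:
    (1), a, b, ab, c, ac, bc, abc, ...

    Returns the final column of contrasts. For entry j > 0, dividing the
    contrast by n * 2^(k-1) gives the effect estimate, and
    contrast^2 / (n * 2^k) gives its sum of squares, where n is the
    number of replicates; entry 0 is the grand total.
    """
    m = len(responses)
    assert m > 0 and m & (m - 1) == 0, "length must be a power of two"
    k = m.bit_length() - 1
    col = list(responses)
    for _ in range(k):  # one pass per factor
        sums = [col[i] + col[i + 1] for i in range(0, m, 2)]
        # "change the sign of the first entry and add": -first + second
        diffs = [col[i + 1] - col[i] for i in range(0, m, 2)]
        col = sums + diffs
    return col

# Illustration on a 2^2 design (hypothetical experiment; the four values
# are the ones appearing in the worked column-(1) entries in the text).
print(yates([-4, 1, -1, 5]))  # -> [1, 11, 7, 1]: total, A, B, AB contrasts
```

Note that the A contrast, 11, matches $a + ab - b - (1) = 1 + 5 + 1 + 4$, so the algorithm reproduces the usual contrast formulas after $k$ passes.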

## This note was uploaded on 03/20/2011 for the course STATISTIC 101 taught by Professor Fandia during the Spring '10 term at UCLA.
