lec3 - The contributions of distinct sets of explanatory...

The contributions of distinct sets of explanatory variables to the model are typically captured by breaking up the overall regression (or model) sum of squares into distinct components. This is useful quite generally in linear models, but especially in ANOVA models, where the response is modeled in terms of one or more class variables, or factors. In such cases, the model sum of squares is decomposed into sums of squares for each of the distinct sets of dummy, or indicator, variables necessary to capture each of the factors in the model.

For example, the following model is appropriate for a randomized complete block design (RCBD):
$$y_{ij} = \mu + \beta_j + \tau_i + e_{ij},$$
where $y_{ij}$ is the response from the $i$th treatment in the $j$th block, and $\beta_j$ and $\tau_i$ are block and treatment effects, respectively. This model can also be written as
$$\mathbf{y} = \mu \mathbf{j}_N + \beta_1 \mathbf{b}_1 + \cdots + \beta_b \mathbf{b}_b + \tau_1 \mathbf{t}_1 + \cdots + \tau_a \mathbf{t}_a + \mathbf{e}. \qquad (*)$$

In this context, the notation $SS(\tau \mid \mu, \beta)$ denotes the extra regression sum of squares due to fitting the $\tau_i$'s after fitting $\mu$ and the $\beta_j$'s, and is given by
$$SS(\tau \mid \mu, \beta) = \mathbf{y}^T \big(P_{C(\mathbf{X})} - P_{C(\mathbf{X}_1)}\big)\mathbf{y},$$
where $\mathbf{X}_1 = (\mathbf{j}_N, \mathbf{b}_1, \ldots, \mathbf{b}_b)$ and $\mathbf{X} = (\mathbf{X}_1, \mathbf{t}_1, \ldots, \mathbf{t}_a)$.

Sums of squares like this one, which can be computed by fitting successively more complex models and taking the difference in regression/model sum of squares at each step, are called sequential sums of squares. They represent the contribution of each successive group of explanatory variables above and beyond those explanatory variables already in the model.

Any model that can be written as
$$\mathbf{y} = \mathbf{X}\boldsymbol{\beta} + \mathbf{e} = \mathbf{X}_1 \boldsymbol{\beta}_1 + \mathbf{X}_2 \boldsymbol{\beta}_2 + \mathbf{X}_3 \boldsymbol{\beta}_3 + \cdots + \mathbf{e}$$
has a sequential sum of squares decomposition.
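As a numerical sketch (not part of the original notes): assuming a small balanced RCBD with $a = 3$ treatments and $b = 4$ blocks and simulated responses, the extra sum of squares $SS(\tau \mid \mu, \beta)$ can be computed directly from the projectors onto $C(\mathbf{X}_1)$ and $C(\mathbf{X})$. The helper `proj` and all variable names are illustrative choices, not notation from the notes; `pinv` is used because the overparameterized dummy coding makes the design matrices rank deficient.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical RCBD layout: a = 3 treatments, b = 4 blocks (illustrative sizes).
a, b = 3, 4
n = a * b

# One observation per treatment-block combination, treatment varying
# fastest within each block.
block = np.repeat(np.arange(b), a)   # block index j for each observation
trt = np.tile(np.arange(a), b)       # treatment index i for each observation

jn = np.ones((n, 1))                 # j_N: column of ones for the grand mean
B = np.eye(b)[block]                 # n x b block indicator columns b_1..b_b
T = np.eye(a)[trt]                   # n x a treatment indicator columns t_1..t_a

X1 = np.hstack([jn, B])              # X_1 = (j_N, b_1, ..., b_b)
X = np.hstack([X1, T])               # X   = (X_1, t_1, ..., t_a)

def proj(M):
    """Orthogonal projector onto C(M); pinv handles rank deficiency
    of the overparameterized dummy coding."""
    return M @ np.linalg.pinv(M)

y = rng.normal(size=(n, 1))          # simulated response, for illustration only

# Extra (sequential) sum of squares for treatments after mu and blocks:
# SS(tau | mu, beta) = y' (P_C(X) - P_C(X1)) y
SS_tau_after = float(y.T @ (proj(X) - proj(X1)) @ y)
```

Equivalently, `SS_tau_after` is the difference between the model sums of squares of the full and reduced fits, which is exactly the "fit successively more complex models and difference the model SS" recipe described above.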
That is, the regression or model sum of squares
$$SS_{\text{Model}} = \mathbf{y}^T P_{C(\mathbf{X})} \mathbf{y} = \|P_{C(\mathbf{X})}\mathbf{y}\|^2$$
can always be decomposed as follows:
$$SS_{\text{Model}} = \|P_{C(\mathbf{X})}\mathbf{y}\|^2 = \|P_{C(\mathbf{X}_1)}\mathbf{y}\|^2 + \|\big(P_{C(\mathbf{X}_1,\mathbf{X}_2)} - P_{C(\mathbf{X}_1)}\big)\mathbf{y}\|^2 + \|\big(P_{C(\mathbf{X}_1,\mathbf{X}_2,\mathbf{X}_3)} - P_{C(\mathbf{X}_1,\mathbf{X}_2)}\big)\mathbf{y}\|^2 + \cdots,$$
or
$$SS_{\text{Model}} = SS(\boldsymbol{\beta}_1) + SS(\boldsymbol{\beta}_2 \mid \boldsymbol{\beta}_1) + SS(\boldsymbol{\beta}_3 \mid \boldsymbol{\beta}_1, \boldsymbol{\beta}_2) + \cdots.$$

Note that, by construction, the projections and squared lengths of projections in such a decomposition are independent, because the spaces onto which we are projecting are mutually orthogonal. Such a decomposition can be extended to any number of terms.

Consider the RCBD model (*). This model can be written as
$$\mathbf{y} = \mathbf{X}_1 \boldsymbol{\beta}_1 + \mathbf{X}_2 \boldsymbol{\beta}_2 + \mathbf{X}_3 \boldsymbol{\beta}_3 + \mathbf{e},$$
where $\mathbf{X}_1 = \mathbf{j}_N$, $\mathbf{X}_2 = (\mathbf{b}_1, \ldots, \mathbf{b}_b)$, $\mathbf{X}_3 = (\mathbf{t}_1, \ldots, \mathbf{t}_a)$, and $\boldsymbol{\beta}_1 = \mu$, $\boldsymbol{\beta}_2 = (\beta_1, \ldots, \beta_b)^T$, $\boldsymbol{\beta}_3 = (\tau_1, \ldots, \tau_a)^T$. The sequential breakdown of the model sum of squares here is
$$SS_{\text{Model}} = SS(\mu) + SS(\beta \mid \mu) + SS(\tau \mid \mu, \beta). \qquad (**)$$

Consider the null hypothesis $H_0\colon \tau_1 = \cdots = \tau_a = 0$. The null model corresponding to this hypothesis is
$$y_{ij} = \mu + \beta_j + e_{ij}.$$
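The three-term decomposition (**) and the mutual orthogonality of the successive projection spaces can be checked numerically. The following sketch (again with illustrative names, an assumed small balanced layout, and simulated data, not anything from the notes) telescopes the projectors $P_{C(\mathbf{X}_1)}$, $P_{C(\mathbf{X}_1,\mathbf{X}_2)} - P_{C(\mathbf{X}_1)}$, and $P_{C(\mathbf{X}_1,\mathbf{X}_2,\mathbf{X}_3)} - P_{C(\mathbf{X}_1,\mathbf{X}_2)}$ and confirms they sum to the full model projector.

```python
import numpy as np

rng = np.random.default_rng(1)
a, b = 3, 4                           # assumed treatments and blocks
n = a * b
block = np.repeat(np.arange(b), a)
trt = np.tile(np.arange(a), b)

X1 = np.ones((n, 1))                  # X_1 = j_N       (beta_1 = mu)
X2 = np.eye(b)[block]                 # X_2 = blocks    (beta_2 = (beta_1..beta_b)')
X3 = np.eye(a)[trt]                   # X_3 = treatments(beta_3 = (tau_1..tau_a)')
X12 = np.hstack([X1, X2])
X123 = np.hstack([X12, X3])

def proj(M):
    """Orthogonal projector onto C(M)."""
    return M @ np.linalg.pinv(M)

y = rng.normal(size=(n, 1))           # simulated response

# The three mutually orthogonal pieces of P_C(X1,X2,X3):
P1 = proj(X1)                         # onto C(j_N)
D2 = proj(X12) - P1                   # onto C(X1,X2) minus C(X1)
D3 = proj(X123) - proj(X12)           # onto C(X1,X2,X3) minus C(X1,X2)

SS_mu = float(y.T @ P1 @ y)           # SS(mu)
SS_beta_g_mu = float(y.T @ D2 @ y)    # SS(beta | mu)
SS_tau_g_mb = float(y.T @ D3 @ y)     # SS(tau | mu, beta)
SS_model = float(y.T @ proj(X123) @ y)
```

Because $P_1$, $D_2$, and $D_3$ project onto mutually orthogonal subspaces, the three quadratic forms add exactly to $SS_{\text{Model}}$, which is the content of (**).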
This note was uploaded on 11/13/2011 for the course STAT 8260 taught by Professor Hall during the Summer '10 term at University of Georgia Athens.
