# IE 410 Notes for Lecture 7


Today's topics are:

- Unequal sample sizes
- Estimation of model parameters:
  - Least squares
  - Estimable functions
- A start on model adequacy checking

UNEQUAL SAMPLE SIZES

What happens if we have a different number of observations in each treatment?

Example:

FACTOR A

|  1  |  2  |  3  |  4  |
|-----|-----|-----|-----|
| 3.3 | 4.5 | 5.4 | 8.3 |
| 2.1 | 2.4 | 3.7 | 6.7 |
| 3.4 | 4.7 | 4.7 | 7.6 |
| 3.3 | 4.8 |     | 8.1 |
|     | 4.4 |     | 6.6 |
|     | 3.9 | 5.6 | 7.7 |
|     |     |     | 6.3 |

There are three relevant points:

1. In the case of one-way ANOVA, the design is still orthogonal: the sum of squares still decomposes, and the F test still applies. This is an acceptable situation; the formulas must only be changed a bit to account for the differing sample sizes $n_i$.
2. A "balanced design," in which the same number of observations is allocated to each treatment, is better. Why? Because it provides a more powerful test. The proof is beyond our scope at the moment.
3. In multifactor experiments, we will need balanced designs in order to maintain orthogonality.

PARAMETER ESTIMATION

Recall our basic model:

$$Y_{ij} = \mu + \tau_i + \varepsilon_{ij}, \qquad \text{s.t.} \;\; \sum_{i=1}^{a} \tau_i = 0, \qquad \varepsilon_{ij} \sim \mathrm{NID}(0, \sigma^2)$$

Question: what are the model parameters?

Answer: $\mu, \tau_1, \ldots, \tau_a, \sigma^2$.

How can we estimate these parameters? We will show how to derive estimates using the method of least squares. It will reveal some things that are apparent, and some things that are rather strange (unestimable functions).

Next we will derive the least squares normal equations and the least squares parameter estimates. Recall the basic model with unknown parameters $\mu$ and $\tau_i$:

$$Y_{ij} = \mu + \tau_i + \varepsilon_{ij}$$

We want the estimates of $\mu$ and $\tau_i$ that minimize the squared errors $\varepsilon_{ij}$. The model prediction of the data points and the associated prediction errors are:

$$\hat{Y}_{ij} = \hat{\mu} + \hat{\tau}_i, \qquad \hat{\varepsilon}_{ij} = Y_{ij} - \hat{Y}_{ij}$$

so least squares minimizes

$$L = \sum_i \sum_j \hat{\varepsilon}_{ij}^{\,2} = \sum_i \sum_j \left( Y_{ij} - \hat{\mu} - \hat{\tau}_i \right)^2$$

Setting the partial derivatives of $L$ with respect to each estimate equal to zero gives the least squares normal equations:

$$\frac{\partial L}{\partial \hat{\mu}} = -2 \sum_i \sum_j \left( Y_{ij} - \hat{\mu} - \hat{\tau}_i \right) = 0$$

$$\frac{\partial L}{\partial \hat{\tau}_i} = -2 \sum_j \left( Y_{ij} - \hat{\mu} - \hat{\tau}_i \right) = 0, \qquad i = 1, \ldots, a$$
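The unequal-sample-size F test can be sketched in plain Python (no libraries). The data below come from the Factor A example; note that the assignment of the shorter table rows to treatment columns is inferred from the magnitudes, so the exact per-treatment sample sizes here are an assumption:

```python
# Minimal sketch: one-way ANOVA with unequal sample sizes n_i.
# Each treatment mean is weighted by its own n_i in the treatment SS.

def one_way_anova(groups):
    """Return (SS_treat, SS_error, F) for a one-way layout with
    possibly unequal sample sizes."""
    N = sum(len(g) for g in groups)       # total observations
    a = len(groups)                       # number of treatments
    grand_mean = sum(sum(g) for g in groups) / N
    # Treatment SS: each squared deviation of a group mean is weighted by n_i
    ss_treat = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Error SS: deviations of observations from their own group mean
    ss_error = sum((y - sum(g) / len(g)) ** 2 for g in groups for y in g)
    ms_treat = ss_treat / (a - 1)
    ms_error = ss_error / (N - a)
    return ss_treat, ss_error, ms_treat / ms_error

# Factor A example data; the column grouping of short rows is inferred.
data = [
    [3.3, 2.1, 3.4, 3.3],                 # treatment 1 (n1 = 4)
    [4.5, 2.4, 4.7, 4.8, 4.4, 3.9],       # treatment 2 (n2 = 6)
    [5.4, 3.7, 4.7, 5.6],                 # treatment 3 (n3 = 4)
    [8.3, 6.7, 7.6, 8.1, 6.6, 7.7, 6.3],  # treatment 4 (n4 = 7)
]
ss_treat, ss_error, F = one_way_anova(data)
print(ss_treat, ss_error, F)
```

Even with unequal $n_i$, the identity $SS_T = SS_{Treat} + SS_E$ still holds exactly, which is the orthogonality claimed in point 1.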
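The normal equations can be checked numerically. Under the side condition $\sum_i \hat{\tau}_i = 0$ used in the notes, the solution is $\hat{\mu} =$ the average of the treatment means and $\hat{\tau}_i = \bar{Y}_{i\cdot} - \hat{\mu}$; the small data set below is hypothetical and used only for illustration:

```python
# Sketch: least-squares estimates under the side condition sum(tau_hat_i) = 0.
# Hypothetical data: three treatments with (possibly unequal) sample sizes.
data = {
    1: [3.3, 2.1, 3.4],
    2: [4.5, 2.4, 4.7, 4.8],
    3: [8.3, 6.7, 7.6],
}

group_means = {i: sum(ys) / len(ys) for i, ys in data.items()}
mu_hat = sum(group_means.values()) / len(group_means)   # from sum(tau_hat) = 0
tau_hat = {i: m - mu_hat for i, m in group_means.items()}

# Verify dL/dtau_hat_i = -2 * sum_j (Y_ij - mu_hat - tau_hat_i) = 0 for each i
for i, ys in data.items():
    assert abs(sum(y - mu_hat - tau_hat[i] for y in ys)) < 1e-9

print(mu_hat, tau_hat)
```

A side condition is needed at all because the $a + 1$ normal equations are not linearly independent (the equation for $\hat{\mu}$ is the sum of the equations for the $\hat{\tau}_i$), which is the source of the "strange" non-estimable functions mentioned above.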

*This note was uploaded on 08/06/2008 for the course IE 410, taught by Professor Storer during the Fall '04 term at Lehigh University.*
