
# PDB_Stat_100_Lecture_24_Printable - STA 100 Lecture 24, Paul Baines


STA 100 Lecture 24
Paul Baines, Department of Statistics, University of California, Davis
March 4th, 2011

## Admin for the Day

- No more homework!
- Project Proposals feedback completed
- Office Hours today - project questions please!
- Extra office hours Monday: 9.30-11.30am, 3.00-4.30pm
- Please stop by on Monday with any questions!

References for Today: Rosner, Ch 11 (7th Ed.)
References for Monday: Rosner, Ch 11 (7th Ed.)
## Linear Regression

Last time we saw that we can investigate the relationship between two variables using linear regression:

Y_i ~ N(α + βx_i, σ²)

Another way of writing this is:

Y_i = (α + βx_i) + ε_i,   ε_i iid N(0, σ²)

- Given explanatory variable x_i, the expected value of Y_i is α + βx_i.
- As we vary x_i, this traces out a straight-line relationship between x and y.
- Variations above and below the straight line should be normally distributed with variance σ².
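This model can be simulated and fit with ordinary least squares. A minimal sketch in plain Python, using hypothetical values for α, β, and σ (any specific numbers below are illustrative, not from the lecture):

```python
import random

random.seed(42)

# Hypothetical true parameters (illustrative choices)
alpha, beta, sigma = 2.0, 1.5, 1.0

# Simulate Y_i = (alpha + beta * x_i) + eps_i, with eps_i iid N(0, sigma^2)
n = 500
x = [i / 50 for i in range(n)]                      # x_i on [0, 10)
y = [alpha + beta * xi + random.gauss(0, sigma) for xi in x]

# Closed-form least-squares estimates:
#   beta_hat = Sxy / Sxx,  alpha_hat = ybar - beta_hat * xbar
xbar = sum(x) / n
ybar = sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
beta_hat = sxy / sxx
alpha_hat = ybar - beta_hat * xbar

print(f"alpha_hat = {alpha_hat:.3f}, beta_hat = {beta_hat:.3f}")
```

With this much data the estimates land close to the true α and β, which is the sense in which the fitted line recovers the straight-line relationship in the model.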

## Key Assumptions

There are five fundamental assumptions we make:

1. The mean of Y is a linear function of X: this describes the core of the relationship between X and Y, i.e., the straight line.
2. The variability of the data points around the line follows a normal distribution: this describes one aspect of how the points are allowed to vary around the line.
3. The variability of the data points around the line does not change as X changes (i.e., it is just σ²): this describes another aspect of how the points are allowed to vary around the line.
4. Conditional on X, the Y's are independent: this describes another aspect of how the points are allowed to vary around the line.
5. The explanatory variable X is observed without error: everything we do here is conditional on the X's.
## Diagnosing Problems

The more complex the model we build, the more things there are that can go wrong. For linear regression, we really need to know what our assumptions mean, so that we know what warning signs to look for. Last time we looked at some examples to see if we could tell when assumptions were wrong, and when they were reasonable.
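Most warning signs live in the residuals e_i = y_i - (α̂ + β̂x_i). A sketch of two simple numeric checks on hypothetical simulated data that does satisfy the assumptions (all parameter values below are made up for illustration):

```python
import random

random.seed(0)

# Hypothetical simulated data satisfying the linear-model assumptions
n = 200
x = [i / 20 for i in range(n)]                      # x_i on [0, 10)
y = [1.0 + 2.0 * xi + random.gauss(0, 0.5) for xi in x]

# Least-squares fit
xbar, ybar = sum(x) / n, sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
b = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
a = ybar - b * xbar

# Residuals: the "variation around the line" that the assumptions constrain
resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]

# Check 1: least squares forces the residuals to average to zero
mean_resid = sum(resid) / n

# Check 2: constant variance -- compare residual spread in the lower
# and upper halves of the x range (should be roughly equal here)
half = n // 2
var_lo = sum(e ** 2 for e in resid[:half]) / half
var_hi = sum(e ** 2 for e in resid[half:]) / half

print(f"mean residual = {mean_resid:.2e}, variance ratio = {var_lo / var_hi:.2f}")
```

When assumption 3 fails, the variance ratio drifts far from 1; in practice a residuals-versus-fitted plot shows the same thing visually.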

## Example 1

[Figure: scatterplot of y against x, an example where linearity is not true; mean of y = a + bx + cx^2 + dx^3]
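The warning sign in a case like this is a systematic pattern in the residuals. A sketch with made-up cubic coefficients (the slide does not give a, b, c, d): fitting a straight line to data whose true mean is cubic leaves residuals that are, here, negative in the middle of the x range and positive at the ends.

```python
import random

random.seed(1)

# Hypothetical coefficients for the cubic mean a + b*x + c*x^2 + d*x^3
a, b, c, d = 1.0, 1.0, -2.0, 0.3
n = 200
x = [1 + 5 * i / (n - 1) for i in range(n)]         # x_i on [1, 6]
y = [a + b * xi + c * xi**2 + d * xi**3 + random.gauss(0, 0.5) for xi in x]

# Fit a straight line anyway (the wrong model)
xbar, ybar = sum(x) / n, sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
slope = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
intercept = ybar - slope * xbar
resid = [yi - (intercept + slope * xi) for xi, yi in zip(x, y)]

# The curvature shows up as a pattern: residuals in the middle of the
# x range differ systematically from residuals at the two ends
q = n // 4
mid_mean = sum(resid[q:3 * q]) / (2 * q)
end_mean = (sum(resid[:q]) + sum(resid[3 * q:])) / (2 * q)

print(f"middle residual mean = {mid_mean:.2f}, end residual mean = {end_mean:.2f}")
```

Under assumption 1 both averages would sit near zero; the large gap between them is exactly the kind of diagnostic signal the lecture is pointing at.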


## This note was uploaded on 03/09/2011 for the course STAT 100 taught by Professor Drake during the Spring '10 term at UC Davis.


