PDB_Stat_100_Lecture_24_Printable

STA 100 Lecture 24 Paul Baines Department of Statistics University of California, Davis March 4th, 2011
Admin for the Day

- No more homework!
- Project Proposals feedback completed
- Office Hours today – project questions please!
- Extra office hours Monday – 9.30-11.30am, 3.00-4.30pm
- Please stop by on Monday with any questions!

References for Today: Rosner, Ch 11 (7th Ed.)
References for Monday: Rosner, Ch 11 (7th Ed.)
Linear Regression

Last time we saw that we can investigate the relationship between two variables using linear regression:

Y_i ~ N(α + βx_i, σ²)

Another way of writing this is:

Y_i = (α + βx_i) + ε_i,   ε_i iid N(0, σ²)

- Given explanatory variable x_i, the expected value of Y_i is α + βx_i.
- As we vary x_i, this traces out a straight-line relationship between x and y.
- Variations above and below the straight line should be normally distributed with variance σ².
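As a minimal sketch of this model (not from the lecture; the values of α, β, and σ below are made up purely for illustration), we can simulate data from Y_i = α + βx_i + ε_i and recover the line by least squares:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical true values of alpha, beta, sigma, chosen for illustration
alpha, beta, sigma = 2.0, 0.5, 1.0

# Fixed x_i's; Y_i = alpha + beta*x_i + eps_i with eps_i iid N(0, sigma^2)
x = np.linspace(0.0, 10.0, 200)
y = alpha + beta * x + rng.normal(0.0, sigma, size=x.size)

# Least-squares straight-line fit; polyfit returns [slope, intercept]
beta_hat, alpha_hat = np.polyfit(x, y, deg=1)
print(alpha_hat, beta_hat)  # estimates should land near the true 2.0 and 0.5
```

With 200 points and moderate noise, the fitted intercept and slope land close to the values used to generate the data.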
Key Assumptions

There are five fundamental assumptions we make:

1. The mean of Y is a linear function of X: this describes the core of the relationship between X and Y, i.e., the straight line.
2. The variability of the data points around the line follows a normal distribution: this describes one aspect of how the points are allowed to vary around the line.
3. The variability of the data points around the line does not change as X changes (i.e., it is just σ²): this describes another aspect of how the points are allowed to vary around the line.
4. Conditional on X, the Y's are independent: this describes another aspect of how the points are allowed to vary around the line.
5. The explanatory variable X is observed without error: everything we do here is conditional on the X's.
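To make assumption 3 concrete, here is a small sketch (the data are simulated with made-up values, not taken from the lecture) in which all five assumptions hold, so the residual spread looks roughly the same across the range of X:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated data satisfying the assumptions (all values hypothetical)
x = np.linspace(0.0, 10.0, 300)
y = 1.0 + 2.0 * x + rng.normal(0.0, 1.5, size=x.size)

slope, intercept = np.polyfit(x, y, deg=1)
residuals = y - (intercept + slope * x)

# Assumption 3 says the residual spread does not change with x, so the
# spread in the lower and upper halves of the x range should be similar
spread_lo = residuals[x < 5.0].std()
spread_hi = residuals[x >= 5.0].std()
print(spread_lo, spread_hi)
```

Both spreads sit near the true σ = 1.5 used in the simulation; a large gap between them would be a warning sign for assumption 3.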
Diagnosing Problems

The more complex the model we build, the more things there are that can go wrong. For linear regression, we really need to know what our assumptions mean, so that we know what warning signs to look for. . .

Last time we looked at some examples to see if we could tell when assumptions were wrong, and when they were reasonable.
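One concrete warning sign is residual spread that grows with X, which violates the constant-variance assumption. A minimal sketch, using made-up data where the error standard deviation increases with x:

```python
import numpy as np

rng = np.random.default_rng(2)

# Error standard deviation grows with x: constant variance is violated
x = np.linspace(1.0, 10.0, 300)
y = 1.0 + 2.0 * x + rng.normal(0.0, 0.3 * x)

slope, intercept = np.polyfit(x, y, deg=1)
residuals = y - (intercept + slope * x)

# Compare residual spread in the upper vs lower half of the x range
ratio = residuals[x >= 5.5].std() / residuals[x < 5.5].std()
print(ratio)  # well above 1, flagging the fanning-out pattern
```

In a residual plot this shows up as a funnel shape: points fan out as x increases instead of staying in a band of constant width.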
Example 1

[Figure: scatterplot of y against x, titled "Example where linearity is not true"; x runs roughly 1 to 6 and y roughly -8 to 6. The true mean of y is a + bx + cx² + dx³.]
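A rough reconstruction of this example (the cubic coefficients below are invented; the slide only gives the form a + bx + cx² + dx³) shows how badly a straight line fits when the true mean is cubic:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical cubic mean a + b*x + c*x^2 + d*x^3; coefficients made up
a, b, c, d = -8.0, 0.5, -1.2, 0.3
x = np.linspace(1.0, 6.0, 200)
y = a + b * x + c * x**2 + d * x**3 + rng.normal(0.0, 0.5, size=x.size)

# A straight line leaves large systematic lack-of-fit; a cubic does not
rmse_line = np.sqrt(np.mean((y - np.polyval(np.polyfit(x, y, 1), x)) ** 2))
rmse_cubic = np.sqrt(np.mean((y - np.polyval(np.polyfit(x, y, 3), x)) ** 2))
print(rmse_line, rmse_cubic)  # the line's error is several times larger
```

The linear fit's residuals are not random scatter: they swing systematically below and then above the line, which is exactly the violation of assumption 1 the slide illustrates.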