STAT503 — Fall 2009 Lecture Notes

Chapter 12: Linear Regression
November 18, 2009

12.1 Introduction

In linear regression, we explain values of a continuous response variable Y using a continuous explanatory variable X. We have pairs of observations of the two numerical variables (X, Y):

$$(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n).$$

Examples:
- X = concentration, Y = rate of reaction;
- X = weight, Y = height;
- X = total Homework score to date, Y = total score on Tests to date.

The pairs are represented by points on the scatterplot.

Two Contexts

1. Y is an observed variable and the values of X are specified by the experimenter.
2. Both X and Y are observed variables.

If the experimenter controls one variable, it is usually labeled X and called the explanatory variable; the response variable is then Y. When X and Y are both only observed, the distinction between explanatory and response variables is somewhat arbitrary, but it must be made, since their roles differ in what follows.

12.2 The Fitted Regression Line

Equation for the Fitted Regression Line

This is the "closest" line to the points of the scatterplot. We consider Y a linear function of X plus a random error. We first need some notation to describe the influence of X on Y. The following quantities are as usual:

$$SS_x = \sum_{i=1}^{n} (x_i - \bar{x})^2, \qquad SS_y = \sum_{i=1}^{n} (y_i - \bar{y})^2,$$

$$s_x = \sqrt{\frac{SS_x}{n-1}}, \qquad s_y = \sqrt{\frac{SS_y}{n-1}}.$$

One new quantity, built from the sum of products of the standardized deviations, is the correlation coefficient:

$$r = \frac{1}{n-1} \sum_{i=1}^{n} \left(\frac{x_i - \bar{x}}{s_x}\right)\left(\frac{y_i - \bar{y}}{s_y}\right) = \frac{\left(\sum_{i=1}^{n} x_i y_i\right) - n\,\bar{x}\,\bar{y}}{(n-1)\, s_x\, s_y}.$$

We consider a linear model:

$$Y = \beta_0 + \beta_1 X + \text{random error}.$$

Here $\beta_0$ is called the intercept and $\beta_1$ is called the slope. We only have a sample, so we estimate $\beta_0$ and $\beta_1$:

- We estimate $\beta_1$ by $b_1 = r\,\dfrac{s_y}{s_x}$.
- We estimate $\beta_0$ by $b_0 = \bar{y} - b_1 \bar{x}$.
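As an illustration of these formulas (not part of the notes), here is a minimal Python sketch that computes $r$, $b_1$, and $b_0$ from paired data, using a made-up data set whose points lie exactly on a line:

```python
import math

def fit_line(xs, ys):
    """Least-squares estimates b0, b1 and correlation r for paired data."""
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    ss_x = sum((x - xbar) ** 2 for x in xs)
    ss_y = sum((y - ybar) ** 2 for y in ys)
    s_x = math.sqrt(ss_x / (n - 1))
    s_y = math.sqrt(ss_y / (n - 1))
    # r = [sum(x_i * y_i) - n * xbar * ybar] / [(n-1) * s_x * s_y]
    r = (sum(x * y for x, y in zip(xs, ys)) - n * xbar * ybar) / ((n - 1) * s_x * s_y)
    b1 = r * s_y / s_x        # slope estimate
    b0 = ybar - b1 * xbar     # intercept estimate
    return b0, b1, r

# Made-up points lying exactly on y = 2 + 3x, so b0 = 2, b1 = 3, r = 1:
b0, b1, r = fit_line([1.0, 2.0, 3.0, 4.0], [5.0, 8.0, 11.0, 14.0])
print(b0, b1, r)
```

For points that fall exactly on a line the correlation is $r = 1$ and the fitted line recovers the true slope and intercept, up to floating-point error.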
The line $y = b_0 + b_1 x$ is the "best" straight line through the data. It is also known as the "least-squares line". (Explanations will be given later.) We will call it the fitted regression line.

Example: Let X be the total score on our Homeworks to date (in points) and Y be the total score on Tests (in points). The following summary statistics were obtained:

$$n = 99, \quad \bar{x} = 546.76, \quad \bar{y} = 117.07, \quad SS_x = 990098.2, \quad SS_y = 62442.5,$$
$$s_x = 100.5, \quad s_y = 25.2, \quad r = 0.8.$$

We obtain

$$b_1 = r s_y / s_x = 0.2012 \quad \text{and} \quad b_0 = \bar{y} - b_1 \bar{x} = 117.07 - 0.2012 \cdot 546.76 = 7.065.$$

Note: use many significant digits of $b_1$ to calculate $b_0$.

The fitted regression line is

$$\text{Tests} = 7.065 + 0.2012 \cdot \text{Homeworks}.$$

[Figure: scatterplot of the data with the predicted values ("predicteds") and the fitted regression line.]
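The note about significant digits matters: recomputing $b_1$ and $b_0$ in Python from the rounded summary statistics above gives slightly different values than the notes' 0.2012 and 7.065, which come from unrounded intermediate values. This check is an addition, not part of the notes:

```python
# Summary statistics from the Homeworks/Tests example, as rounded in the notes.
n = 99
xbar, ybar = 546.76, 117.07
s_x, s_y = 100.5, 25.2
r = 0.8

b1 = r * s_y / s_x      # about 0.2006 with these rounded inputs
b0 = ybar - b1 * xbar   # about 7.39 with these rounded inputs
print(b1, b0)
```

The small rounding of $r$, $s_x$, and $s_y$ shifts $b_0$ by a few tenths of a point, which is exactly why the notes warn to carry many significant digits of $b_1$ when computing $b_0$.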

[Discussion] How do we interpret the slope and intercept in linear equations? Consider, e.g., $F = 32 + 1.8\,C$, and the equation above.

Predicteds and Residual Sum of Squares

For each value $x_i$ in the sample there is a value of $y$ predicted by the fitted regression line.
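A minimal sketch (with made-up data and a made-up line, not from the notes) of how the predicted values $\hat{y}_i = b_0 + b_1 x_i$, the residuals $y_i - \hat{y}_i$, and the residual sum of squares are computed:

```python
def predicted_and_rss(xs, ys, b0, b1):
    """Predicted values, residuals, and residual sum of squares for a line."""
    y_hat = [b0 + b1 * x for x in xs]                    # predicteds
    residuals = [y - yh for y, yh in zip(ys, y_hat)]     # observed minus predicted
    rss = sum(e ** 2 for e in residuals)                 # residual sum of squares
    return y_hat, residuals, rss

# Tiny illustration with the hypothetical line y = 1 + 2x:
xs = [0.0, 1.0, 2.0]
ys = [1.5, 2.5, 5.5]
y_hat, res, rss = predicted_and_rss(xs, ys, b0=1.0, b1=2.0)
print(y_hat)  # [1.0, 3.0, 5.0]
print(res)    # [0.5, -0.5, 0.5]
print(rss)    # 0.75
```

The least-squares line is precisely the choice of $b_0$ and $b_1$ that makes this residual sum of squares as small as possible.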