Section 16: Regression (Statistics: An Introduction, Fifth Edition)

A. Introduction

Regression – use scores on one variable to predict scores on a second variable.

Examples:
  Predict income from education
  Predict college grades from ACT test scores
  Predict college grades from high school grades
  Predict reading test scores from math test scores

Example data from Section 14 (the Reading scores for students 5, 7, 8, and 9 are not legible in the source):

  Student   Math   Reading
     1       50      45
     2       45      50
     3       60      55
     4       70      60
     5       60      --
     6       30      35
     7       40      --
     8       55      --
     9       45      --
    10       40      60
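
The section goes on to show these scores as a scatter diagram. Below is a minimal matplotlib sketch of that plot, using only the rows whose Reading scores are legible in the table above; matplotlib is an assumption made here for illustration, not something the course specifies.

    import matplotlib.pyplot as plt

    # Rows of the Section 14 data with both scores legible (students 1-4, 6, 10).
    math_scores    = [50, 45, 60, 70, 30, 40]
    reading_scores = [45, 50, 55, 60, 35, 60]

    plt.scatter(math_scores, reading_scores)
    plt.xlabel("Math score")
    plt.ylabel("Reading score")
    plt.title("Reading vs. Math (Section 14 data)")
    plt.show()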

Show the data as a scatter diagram. Students with high math scores tend to have high reading scores, but the relationship is not perfect.

Referring to the data, one option is to predict scores using the mean Reading score for each particular Math score:
  What is the average Reading score for people with a Math score of 40?  (40 + 35 + 60)/3 = 45
  What is the average Math score for people with a Reading score of 50?  (45 + 55)/2 = 50
We could use these averages as predicted values (a small sketch of this grouping appears below). Problem: each prediction is based on a small number of scores.

Another option is to draw a line and use the line to make predictions:
  Look at the scatter diagram and draw a line. What is the predicted value for a math score of 40? Of 50? Of 55?
  How would we have obtained a predicted score for a math score of 55 by taking averages?
Problem: drawing a line by hand is subject to error; it depends on the person drawing the line.
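
A minimal sketch of the conditional-mean idea described above: group criterion scores by predictor score and use each group's mean as the predicted value. Only the two groups the slide works out explicitly are shown; the full data set is not reproduced here.

    from collections import defaultdict

    def conditional_means(pairs):
        """Group the second score by the first and return each group's mean."""
        groups = defaultdict(list)
        for x, y in pairs:
            groups[x].append(y)
        return {x: sum(ys) / len(ys) for x, ys in groups.items()}

    # Reading scores of the three students with Math = 40: (40 + 35 + 60) / 3 = 45
    print(conditional_means([(40, 40), (40, 35), (40, 60)]))   # {40: 45.0}
    # Math scores of the two students with Reading = 50: (45 + 55) / 2 = 50
    print(conditional_means([(50, 45), (50, 55)]))             # {50: 50.0}
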
B. Linear Regression – Optimal Process

Notation for person i:
  X_i  is the predictor (e.g., Math)
  Y_i  is the criterion as observed (e.g., Reading)
  Y_i' is the predicted value (e.g., predicted Reading)

The prediction error is the difference between the observed and predicted score for person i:

  e_i = Y_i - Y_i'

Illustration of linear prediction error: many lines are possible. How do we pick the best one?

C. Use the Line of Best Fit – Least-Squares Prediction Line

Find the line that makes the sum of squared prediction errors

  G = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} (Y_i - Y_i')^2

as small as possible. In linear regression, we assume the relationship is linear. Look at a scatter plot to help decide whether or not the relationship is linear.
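
A minimal sketch of the criterion G: the sum of squared prediction errors for a candidate line Y' = a + bX. The X and Y scores are taken from the Simple Computational Example that follows; the two candidate intercepts and slopes are made up purely for illustration, and the least-squares line is the one that makes G smallest.

    def G(a, b, xs, ys):
        """Sum of squared prediction errors for the line Y' = a + b*X."""
        return sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

    xs = [13, 9, 7, 5, 1]   # X scores from the Simple Computational Example below
    ys = [6, 9, 7, 3, 5]    # Y scores

    # Two arbitrary candidate lines give different values of G.
    print(G(a=3.0, b=0.50, xs=xs, ys=ys))   # 23.25
    print(G(a=4.0, b=0.25, xs=xs, ys=ys))   # 16.3125
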
The equation for a line is

  Y_i' = a + b X_i

It turns out that the line of best fit occurs when

  b = \frac{\sum_{i=1}^{n} (X_i - \bar{X})(Y_i - \bar{Y})}{\sum_{i=1}^{n} (X_i - \bar{X})^2}    (Slope)

  a = \bar{Y} - b\bar{X}    (Intercept)

Simple Computational Example

  Student    X    Y
     1      13    6
     2       9    9
     3       7    7
     4       5    3
     5       1    5

  \bar{X} = 7,  \bar{Y} = 6
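
A minimal sketch applying the slope and intercept formulas above to this example data. For these five pairs the deviation cross-products sum to 18 and the squared X deviations sum to 80, so b = 18/80 = 0.225 and a = 6 - (0.225)(7) = 4.425; the code below just automates that arithmetic.

    def least_squares_line(xs, ys):
        """Slope b and intercept a of the least-squares line Y' = a + bX,
        computed from deviation scores as in the formulas above."""
        n = len(xs)
        x_bar = sum(xs) / n
        y_bar = sum(ys) / n
        b = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
             / sum((x - x_bar) ** 2 for x in xs))
        a = y_bar - b * x_bar
        return a, b

    xs = [13, 9, 7, 5, 1]   # X scores for the five students
    ys = [6, 9, 7, 3, 5]    # Y scores

    a, b = least_squares_line(xs, ys)
    print(f"Y' = {a:.3f} + {b:.3f}X")   # Y' = 4.425 + 0.225X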




