# Lecture 9: Simple and Multiple Linear Regression


Simple and Multiple Linear Regression
Artin Armagan, Sta. 113
Chapters 12 and 13 of Devore
March 31, 2009

## Table of Contents

1. Simple Linear Regression
   - Estimation
   - Inference
2. Multiple Linear Regression
## Model

A simple linear regression model is given by

$$Y = \beta_0 + \beta_1 x + \epsilon$$

where

- $Y$ is the response,
- $x$ is the predictor,
- $\beta_0$ is the unknown intercept of the line,
- $\beta_1$ is the unknown slope of the line,
- $\epsilon \sim N(0, \sigma^2)$ is the noise with unknown variance $\sigma^2$.
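To make the model concrete, it can be simulated; the parameter values below ($\beta_0 = 2$, $\beta_1 = 0.5$, $\sigma = 1$) are hypothetical, chosen only for this sketch:

```python
import numpy as np

# Hypothetical parameter values, for illustration only.
beta0, beta1, sigma = 2.0, 0.5, 1.0

rng = np.random.default_rng(0)
n = 100
x = rng.uniform(0, 10, size=n)       # predictor values
eps = rng.normal(0, sigma, size=n)   # noise, drawn from N(0, sigma^2)
y = beta0 + beta1 * x + eps          # response generated by the model
```

All of the randomness in each simulated $Y$ enters through the noise term $\epsilon$.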

Notice that $Y$ is a random quantity due only to $\epsilon$:

$$E(Y) = \beta_0 + \beta_1 x, \qquad V(Y) = \sigma^2$$

so that

$$Y \sim N(\beta_0 + \beta_1 x, \sigma^2)$$
## Assumptions

Notice that the model assumes:

- a linear underlying relationship between the response and the predictor,
- normality of the random noise,
- constant variance of the random noise throughout the data,
- independence of the random noise terms.
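These assumptions are commonly checked informally through the residuals of a fitted line. A minimal sketch, using simulated data (the data and seed here are hypothetical):

```python
import numpy as np

# Simulated data for illustration; in practice use the observed (x, y).
rng = np.random.default_rng(3)
x = rng.uniform(0, 10, size=100)
y = 2.0 + 0.5 * x + rng.normal(0, 1.0, size=100)

# Fit a degree-1 polynomial (a line) by least squares.
slope, intercept = np.polyfit(x, y, 1)
residuals = y - (intercept + slope * x)

# Residuals should center on zero (linearity) and show roughly
# constant spread across x (constant variance); plotting residuals
# against x is the usual visual check.
```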

## Least Squares Estimation

Find the line passing through the data points such that the sum of squared vertical distances from this line to the data points is minimized:

$$\min_{b_0, b_1} \sum_{i=1}^{n} (y_i - b_0 - b_1 x_i)^2$$

Since this is a minimization problem, taking the derivatives with respect to $b_0$ and $b_1$ and setting them equal to zero results in two equations, called the *normal equations*:

$$n b_0 + \Big(\sum x_i\Big) b_1 = \sum y_i$$

$$\Big(\sum x_i\Big) b_0 + \Big(\sum x_i^2\Big) b_1 = \sum x_i y_i$$
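The normal equations form a 2×2 linear system that can be solved directly. A sketch with simulated data (the data are hypothetical):

```python
import numpy as np

# Simulated data for illustration.
rng = np.random.default_rng(1)
x = rng.uniform(0, 10, size=50)
y = 2.0 + 0.5 * x + rng.normal(0, 1.0, size=50)

n = len(x)
# Normal equations in matrix form:
#   [ n        sum(x)   ] [b0]   [ sum(y)   ]
#   [ sum(x)   sum(x^2) ] [b1] = [ sum(x*y) ]
A = np.array([[n, x.sum()],
              [x.sum(), (x ** 2).sum()]])
rhs = np.array([y.sum(), (x * y).sum()])
b0, b1 = np.linalg.solve(A, rhs)
```

The solution agrees with any standard least squares routine (e.g. `np.polyfit(x, y, 1)`).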
If we solve this system we obtain

$$b_1 = \hat{\beta}_1 = \frac{\sum (x_i - \bar{x})(y_i - \bar{y})}{\sum (x_i - \bar{x})^2}$$

and, from the first normal equation,

$$b_0 = \hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}$$
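The closed-form estimates can be computed in a few lines; a sketch, again with hypothetical simulated data:

```python
import numpy as np

# Simulated data for illustration.
rng = np.random.default_rng(2)
x = rng.uniform(0, 10, size=50)
y = 2.0 + 0.5 * x + rng.normal(0, 1.0, size=50)

xbar, ybar = x.mean(), y.mean()
# Slope: sum of cross-deviations over sum of squared x-deviations.
beta1_hat = np.sum((x - xbar) * (y - ybar)) / np.sum((x - xbar) ** 2)
# Intercept: forces the fitted line through (xbar, ybar).
beta0_hat = ybar - beta1_hat * xbar
```

These closed-form values match the solution of the normal equations and, numerically, the output of `np.polyfit(x, y, 1)`.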

