STAT 563: Regression Methods Simple Linear Regression

What is a Regression Model?
- Essentially a mathematical formula relating one variable, referred to as the response variable, to one or more other variables, called predictor variables.
- Often used in exploratory studies and generally not intended for causal relationships.
- Uses of regression models include:
  - Data description
  - Parameter estimation
  - Prediction and estimation
  - Control
- Example: predict average annual salary ($Y$) using time ($t$) as a predictor variable.
- A regression model is a model of the conditional expectation $E(\text{annual salary} \mid \text{time})$.
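
As an illustration of this conditional-expectation view, here is a minimal Python sketch that simulates a hypothetical salary-versus-time data set in which $E(\text{salary} \mid \text{time})$ is linear in time; the coefficients 30000 and 1500 and the noise level are made-up values chosen only for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical model: E(salary | time) = 30000 + 1500 * time (made-up numbers)
n = 200
time = rng.uniform(0, 20, size=n)            # years of experience
noise = rng.normal(0, 3000, size=n)          # individual variation around the mean
salary = 30000 + 1500 * time + noise         # observed responses

# The regression function is the conditional mean of salary given time;
# averaging responses near a fixed time t approximates E(salary | time = t).
t0 = 10.0
near_t0 = np.abs(time - t0) < 2.0
print("empirical mean near t = 10:", salary[near_t0].mean())
print("model conditional mean    :", 30000 + 1500 * t0)
```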

Simple Linear Regression
- Suppose we have $n$ pairs $(Y_i, X_i)$ of data on (salary, time).
- Specify the model for $(Y_1, Y_2, \ldots, Y_n)$:
  $$Y_i = \beta_0 + \beta_1 X_i + \varepsilon_i, \qquad \varepsilon \sim N(0, \sigma^2 I_{n \times n})$$
- Normality is not needed for estimation, but it is required for inference.
- Fit the model: estimate $(\beta_0, \beta_1)$ by least squares. Computation starts with finding
  $$(\hat\beta_0, \hat\beta_1) = \arg\min_{\beta_0, \beta_1} \sum_{i=1}^{n} \left( y_i - \beta_0 - \beta_1 X_i \right)^2$$
- Inference: draw conclusions about $\beta_1$ based on the $n$ observations.
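
A minimal simulation sketch of this setup, with made-up true values $\beta_0 = 2$, $\beta_1 = 0.5$, $\sigma = 1$, using numpy's least-squares solver to compute the arg-min defined above:

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed (made-up) true parameters for this sketch
beta0_true, beta1_true, sigma = 2.0, 0.5, 1.0

n = 100
x = rng.uniform(0, 10, size=n)
eps = rng.normal(0, sigma, size=n)        # epsilon ~ N(0, sigma^2 I)
y = beta0_true + beta1_true * x + eps     # Y_i = beta0 + beta1 * X_i + eps_i

# Least squares: minimize sum_i (y_i - b0 - b1 * x_i)^2 over (b0, b1)
A = np.column_stack([np.ones(n), x])      # design matrix [1, X]
(beta0_hat, beta1_hat), *_ = np.linalg.lstsq(A, y, rcond=None)
print(beta0_hat, beta1_hat)               # should be close to 2.0 and 0.5
```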

Least Squares
- Least squares regression finds the line that minimizes
  $$S(\beta_0, \beta_1) = \sum_{i=1}^{n} \left( y_i - \beta_0 - \beta_1 X_i \right)^2$$
- Normal equations:
  $$\frac{\partial S}{\partial \beta_0} = -2 \sum_{i=1}^{n} \left( y_i - \beta_0 - \beta_1 X_i \right) = 0$$
  $$\frac{\partial S}{\partial \beta_1} = -2 \sum_{i=1}^{n} \left( y_i - \beta_0 - \beta_1 X_i \right) X_i = 0$$
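
As a numerical check on made-up simulated data, the two normal equations can be rearranged into a 2 by 2 linear system in $(\beta_0, \beta_1)$ and solved directly:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, size=100)
y = 2.0 + 0.5 * x + rng.normal(0, 1.0, size=100)   # made-up data for the sketch

n = len(x)
# Dividing each normal equation by -2 and rearranging gives:
#   n * b0       + (sum x)   * b1 = sum y
#   (sum x) * b0 + (sum x^2) * b1 = sum x*y
lhs = np.array([[n,       x.sum()],
                [x.sum(), (x ** 2).sum()]])
rhs = np.array([y.sum(), (x * y).sum()])
beta0_hat, beta1_hat = np.linalg.solve(lhs, rhs)
print(beta0_hat, beta1_hat)
```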

Least Squares
- Simplifying and solving for $(\hat\beta_0, \hat\beta_1)$ (substituting the estimates for the parameters):
  $$\hat\beta_1 = \frac{S_{yx}}{S_{xx}}, \qquad \hat\beta_0 = \bar{y} - \hat\beta_1 \bar{X}$$
  where
  $$S_{yx} = \sum_{i=1}^{n} (y_i - \bar{y})(X_i - \bar{X}), \quad S_{xx} = \sum_{i=1}^{n} (X_i - \bar{X})^2, \quad \bar{y} = \frac{1}{n}\sum_{i=1}^{n} y_i, \quad \bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i$$
- The fitted model is:
  $$\hat{y} = \hat\beta_0 + \hat\beta_1 X$$
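
A short sketch on made-up simulated data computing $\hat\beta_1 = S_{yx}/S_{xx}$ and $\hat\beta_0 = \bar{y} - \hat\beta_1 \bar{X}$ directly, with numpy's polyfit used only as an independent cross-check:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(0, 10, size=100)
y = 2.0 + 0.5 * x + rng.normal(0, 1.0, size=100)   # made-up data for the sketch

xbar, ybar = x.mean(), y.mean()
S_yx = np.sum((y - ybar) * (x - xbar))
S_xx = np.sum((x - xbar) ** 2)

beta1_hat = S_yx / S_xx                  # slope estimate
beta0_hat = ybar - beta1_hat * xbar      # intercept estimate
y_fitted = beta0_hat + beta1_hat * x     # fitted values

# Cross-check against numpy's degree-1 polynomial least-squares fit
slope_np, intercept_np = np.polyfit(x, y, deg=1)
print(beta1_hat, slope_np)               # should agree
print(beta0_hat, intercept_np)           # should agree
```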

Geometry of Least Squares
- The least squares fit is the orthogonal projection of the data vector $y$ onto the subspace spanned by the independent variable (together with the constant vector $\mathbf{1}$).

Geometry of Least Squares
- For each pair $(\beta_0, \beta_1)$, define the vector $P_{(\beta_0, \beta_1)}$ with components
  $$P_{(\beta_0, \beta_1), i} = \beta_0 + \beta_1 X_i$$
- The criterion $S$ can now be expressed as
  $$S(\beta_0, \beta_1) = \sum_{i=1}^{n} \left( y_i - P_{(\beta_0, \beta_1), i} \right)^2 = \left\| y - P_{(\beta_0, \beta_1)} \right\|^2$$
- Minimizing this length over $(\beta_0, \beta_1)$ finds the vector in the plane $L$ spanned by $\mathbf{1}$ and $X$ that is closest to $y$. In other words, least squares projects the vector $y$ onto the plane $L$, giving $P_{(\hat\beta_0, \hat\beta_1)}$.
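
A numerical sketch of this projection picture, again on made-up data: build the matrix $A = [\mathbf{1}, X]$ whose columns span $L$, form the projection matrix $H = A(A^{T}A)^{-1}A^{T}$, and check that $Hy$ reproduces the least-squares fitted values.

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.uniform(0, 10, size=50)
y = 2.0 + 0.5 * x + rng.normal(0, 1.0, size=50)    # made-up data for the sketch

A = np.column_stack([np.ones_like(x), x])          # columns span the plane L = span{1, X}
H = A @ np.linalg.inv(A.T @ A) @ A.T               # orthogonal projection onto L

y_proj = H @ y                                     # projection of y onto L
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
y_fitted = A @ coef                                # least-squares fitted values

print(np.allclose(y_proj, y_fitted))               # True: the projection equals the fitted values
print(np.allclose(H @ H, H), np.allclose(H, H.T))  # idempotent and symmetric, as a projection should be
```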

Residuals
- The residuals $e_i$ are defined as $e_i = y_i - \hat{y}_i$, or in vector notation $e = y - \hat{y}$.
- Equivalently, $e$ is the projection of $y$ onto the orthogonal complement $L^{\perp}$ of the plane $L$ spanned by $\mathbf{1}$ and $X$. This implies
  $$\sum_{i=1}^{n} e_i = \mathbf{1}^{T} e = 0, \qquad \sum_{i=1}^{n} e_i X_i = X^{T} e = 0, \qquad \sum_{i=1}^{n} e_i \hat{y}_i = \hat{y}^{T} e = 0$$
- The vector of residuals $e$ is independent of $\hat{y}$.
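
The three identities can be verified numerically; this sketch, on made-up simulated data, checks that the residual vector is orthogonal to $\mathbf{1}$, $X$, and $\hat{y}$.

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.uniform(0, 10, size=80)
y = 2.0 + 0.5 * x + rng.normal(0, 1.0, size=80)    # made-up data for the sketch

A = np.column_stack([np.ones_like(x), x])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
y_hat = A @ coef                                   # fitted values
e = y - y_hat                                      # residual vector

# Orthogonality of e to the plane span{1, X} (and hence to y_hat, which lies in it)
print(np.isclose(e.sum(), 0.0))                    # 1'e = 0
print(np.isclose(np.dot(x, e), 0.0))               # X'e = 0
print(np.isclose(np.dot(y_hat, e), 0.0))           # y_hat'e = 0
```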

Estimating σ²
- An estimate of $\sigma^2$ is required to
  - Test hypotheses
  - Construct confidence intervals
- Ideally, we would like an estimate that does not depend on the adequacy of the fitted model.
- This is only possible if we have multiple observations on $y$ for at least one value of $X$, or if prior information on $\sigma^2$ is available.
- In the absence of either, we will use the error (or residual) sum of squares.
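
One way such a model-free estimate can be formed when $X$ values are replicated is a pure-error-style calculation: within each group of replicates, the spread of $y$ reflects $\sigma^2$ regardless of the assumed mean function. The sketch below illustrates this on made-up data with four replicates at each of five $x$ levels.

```python
import numpy as np

rng = np.random.default_rng(5)

# Made-up data with replicated X values: 4 observations of y at each of 5 x levels
sigma_true = 1.0
x_levels = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x = np.repeat(x_levels, 4)
y = 2.0 + 0.5 * x + rng.normal(0, sigma_true, size=x.size)

ss_within = 0.0   # within-group (pure error) sum of squares
df_within = 0     # its degrees of freedom
for level in x_levels:
    yg = y[x == level]
    ss_within += np.sum((yg - yg.mean()) ** 2)
    df_within += yg.size - 1

sigma2_estimate = ss_within / df_within
print(sigma2_estimate)   # near sigma_true**2 = 1.0, whatever the true mean function is
```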

Estimating σ²
- If $(\beta_0, \beta_1)$ is known, then $\varepsilon_i = y_i - \beta_0 - \beta_1 X_i$ and
  $$SSE(\beta_0, \beta_1) = \sum_{i=1}^{n} \varepsilon_i^2 = \| \varepsilon \|^2 \sim \sigma^2 \chi^2_{n}$$
- Now, an unbiased estimate can be obtained from
  $$E\!\left( \frac{1}{n} \sum_{i=1}^{n} \varepsilon_i^2 \right) = \sigma^2,$$
  so $\frac{1}{n} \sum_{i=1}^{n} \varepsilon_i^2$ is an unbiased estimate of $\sigma^2$.
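
A small Monte Carlo sketch, with made-up known parameters, illustrating that when $(\beta_0, \beta_1)$ is known the average of $\frac{1}{n}\sum_i \varepsilon_i^2$ over repeated samples is close to $\sigma^2$:

```python
import numpy as np

rng = np.random.default_rng(6)

# Assumed (made-up) known parameters for this sketch
beta0, beta1, sigma = 2.0, 0.5, 1.5
n, n_reps = 30, 5000

x = rng.uniform(0, 10, size=n)
estimates = np.empty(n_reps)
for r in range(n_reps):
    eps = rng.normal(0, sigma, size=n)
    y = beta0 + beta1 * x + eps
    eps_observed = y - beta0 - beta1 * x        # with known (beta0, beta1) the errors are observable
    estimates[r] = np.mean(eps_observed ** 2)   # (1/n) * sum of eps_i^2

print(estimates.mean())   # Monte Carlo average of the estimator, close to sigma**2
print(sigma ** 2)         # 2.25
```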