STP452topic3

STAT 512: Applied Regression Analysis, Topic 3
Spring 2008

Chapter 5: Linear Regression in Matrix Form

The SLR Model in Scalar Form

$$Y_i = \beta_0 + \beta_1 X_i + \varepsilon_i, \qquad \varepsilon_i \overset{\text{iid}}{\sim} N(0, \sigma^2).$$

Consider now writing an equation for each observation:

$$\begin{aligned}
Y_1 &= \beta_0 + \beta_1 X_1 + \varepsilon_1 \\
Y_2 &= \beta_0 + \beta_1 X_2 + \varepsilon_2 \\
&\;\;\vdots \\
Y_n &= \beta_0 + \beta_1 X_n + \varepsilon_n
\end{aligned}$$

The SLR Model in Matrix Form

$$\begin{bmatrix} Y_1 \\ Y_2 \\ \vdots \\ Y_n \end{bmatrix}
= \begin{bmatrix} \beta_0 + \beta_1 X_1 \\ \beta_0 + \beta_1 X_2 \\ \vdots \\ \beta_0 + \beta_1 X_n \end{bmatrix}
+ \begin{bmatrix} \varepsilon_1 \\ \varepsilon_2 \\ \vdots \\ \varepsilon_n \end{bmatrix}
= \begin{bmatrix} 1 & X_1 \\ 1 & X_2 \\ \vdots & \vdots \\ 1 & X_n \end{bmatrix}
\begin{bmatrix} \beta_0 \\ \beta_1 \end{bmatrix}
+ \begin{bmatrix} \varepsilon_1 \\ \varepsilon_2 \\ \vdots \\ \varepsilon_n \end{bmatrix}$$

where (we use bold symbols for matrices and vectors)
$\mathbf{X}$ is called the design matrix,
$\boldsymbol{\beta}$ is the vector of parameters,
$\boldsymbol{\varepsilon}$ is the error vector,
$\mathbf{Y}$ is the response vector.
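The matrix form can be checked numerically. Below is a minimal NumPy sketch (the data values are made up for illustration, not taken from the notes): it assembles the $n \times 2$ design matrix and verifies that $\mathbf{X}\boldsymbol{\beta} + \boldsymbol{\varepsilon}$ reproduces $\mathbf{Y}$ for any choice of parameter vector, since the errors are defined as whatever is left over.

```python
import numpy as np

# Hypothetical toy data, n = 5 observations (for illustration only)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
n = len(Y)

# Design matrix: a column of ones (for beta_0) next to the predictor (for beta_1)
X = np.column_stack([np.ones(n), x])   # shape (n, 2)

# Pick an example parameter vector beta = (beta_0, beta_1); the error vector
# is then the residual, so Y = X beta + eps holds exactly by construction.
beta = np.array([0.5, 1.9])
eps = Y - X @ beta

print(X.shape)   # (5, 2): n rows, 2 columns, as in the notes
```

The single matrix product `X @ beta` computes $\beta_0 + \beta_1 X_i$ for all $n$ observations at once, which is exactly what the stacked scalar equations express.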

The design matrix

$$\mathbf{X} = \begin{bmatrix} 1 & X_1 \\ 1 & X_2 \\ \vdots & \vdots \\ 1 & X_n \end{bmatrix} \in \mathbb{R}^{n \times 2}$$

Vector of parameters

$$\boldsymbol{\beta} = \begin{bmatrix} \beta_0 \\ \beta_1 \end{bmatrix} \in \mathbb{R}^{2 \times 1}$$

Vector of error terms

$$\boldsymbol{\varepsilon} = \begin{bmatrix} \varepsilon_1 \\ \varepsilon_2 \\ \vdots \\ \varepsilon_n \end{bmatrix} \in \mathbb{R}^{n \times 1}$$

Vector of responses

$$\mathbf{Y} = \begin{bmatrix} Y_1 \\ Y_2 \\ \vdots \\ Y_n \end{bmatrix} \in \mathbb{R}^{n \times 1}$$

And we have $\mathbf{Y} = \mathbf{X}\boldsymbol{\beta} + \boldsymbol{\varepsilon}$.

Variance-Covariance Matrix

In general, for any set of variables $U_1, U_2, \ldots, U_n$, their variance-covariance matrix is defined to be

$$\sigma^2\{\mathbf{U}\} = \begin{bmatrix}
\sigma^2\{U_1\} & \sigma\{U_1, U_2\} & \cdots & \sigma\{U_1, U_n\} \\
\sigma\{U_2, U_1\} & \sigma^2\{U_2\} & \ddots & \vdots \\
\vdots & \ddots & \ddots & \sigma\{U_{n-1}, U_n\} \\
\sigma\{U_n, U_1\} & \cdots & \sigma\{U_n, U_{n-1}\} & \sigma^2\{U_n\}
\end{bmatrix}$$

where $\sigma^2\{U_i\}$ is the variance of $U_i$, and $\sigma\{U_i, U_j\}$ is the covariance of $U_i$ and $U_j$. When variables are uncorrelated, their covariance is 0, so the variance-covariance matrix of uncorrelated variables is a diagonal matrix.
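As a quick numerical illustration of this definition (a sketch, not course code; the standard deviations are arbitrary choices): draw samples of uncorrelated variables with known variances and inspect their sample variance-covariance matrix, which should be close to diagonal.

```python
import numpy as np

rng = np.random.default_rng(0)

# U1, U2, U3: independent normals with standard deviations 1, 2, 3,
# so the true variance-covariance matrix is diag(1, 4, 9).
sds = np.array([1.0, 2.0, 3.0])
U = rng.normal(size=(3, 200_000)) * sds[:, None]

S = np.cov(U)   # 3 x 3 sample variance-covariance matrix
# Diagonal entries estimate sigma^2{U_i}; off-diagonal entries estimate
# sigma{U_i, U_j}, which are 0 here because the variables are uncorrelated.
print(np.round(S, 2))
```

With this many samples, the diagonal of `S` lands close to (1, 4, 9) and the off-diagonal entries close to 0, matching the diagonal-matrix claim above.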
Note: Variables that are independent are also uncorrelated, so variables that are correlated are automatically dependent. The converse fails, however: variables can be dependent yet uncorrelated, because correlation measures only linear dependence. Normally distributed random variables are a convenient special case: if they are uncorrelated, they are also independent.

Covariance matrix of $\boldsymbol{\varepsilon}$

$$\sigma^2\{\boldsymbol{\varepsilon}\} = \operatorname{Cov}\begin{bmatrix} \varepsilon_1 \\ \varepsilon_2 \\ \vdots \\ \varepsilon_n \end{bmatrix} = \sigma^2 \mathbf{I}_{n \times n} = \begin{bmatrix}
\sigma^2 & 0 & \cdots & 0 \\
0 & \sigma^2 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & \sigma^2
\end{bmatrix}$$

Covariance matrix of $\mathbf{Y}$

$$\sigma^2\{\mathbf{Y}\} = \operatorname{Cov}\begin{bmatrix} Y_1 \\ Y_2 \\ \vdots \\ Y_n \end{bmatrix} = \sigma^2 \mathbf{I}_{n \times n}.$$

Distributional Assumptions in Matrix Form

$$\boldsymbol{\varepsilon} \sim N(\mathbf{0}, \sigma^2 \mathbf{I})$$

where $\mathbf{I}$ is the $n \times n$ identity matrix. Ones in the diagonal elements specify that the variance of each $\varepsilon_i$ is $\sigma^2$, and zeros in the off-diagonal elements specify that the covariances between different error terms are 0.
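The assumption $\boldsymbol{\varepsilon} \sim N(\mathbf{0}, \sigma^2 \mathbf{I})$ can likewise be illustrated by simulation (a sketch with an assumed $\sigma^2 = 4$, chosen for illustration and not taken from the notes): the sample covariance matrix of many simulated error vectors should be close to $\sigma^2$ times the identity.

```python
import numpy as np

rng = np.random.default_rng(1)

sigma2 = 4.0   # assumed error variance, for illustration only
n = 5          # length of the error vector

# 200,000 independent draws of eps ~ N(0, sigma2 * I_n): each row is one
# error vector whose components are iid N(0, sigma2).
eps = rng.normal(scale=np.sqrt(sigma2), size=(200_000, n))

S = np.cov(eps, rowvar=False)   # n x n sample covariance matrix
# S should be close to sigma2 * I: sigma2 on the diagonal (each eps_i has
# variance sigma2) and 0 off the diagonal (the eps_i are uncorrelated).
print(np.round(S, 2))
```

The same check applies to $\sigma^2\{\mathbf{Y}\}$: since $\mathbf{X}\boldsymbol{\beta}$ is a constant shift, adding it to each simulated error vector would leave the covariance matrix unchanged.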
