# Lecture 31: Statistical inference in simple linear regression


## 31.1 Statistical inference in simple linear regression

Let us first summarize what we proved in the last two lectures. We considered a simple linear regression model

$$Y = \beta_0 + \beta_1 X + \varepsilon,$$

where the noise $\varepsilon$ has distribution $N(0, \sigma^2)$, and given the sample $(X_1, Y_1), \ldots, (X_n, Y_n)$ we found the maximum likelihood estimates of the parameters of the model and showed that their joint distribution is described by

$$\hat\beta_1 \sim N\!\left(\beta_1,\ \frac{\sigma^2}{n\left(\overline{X^2} - \bar X^2\right)}\right), \qquad \hat\beta_0 \sim N\!\left(\beta_0,\ \left(\frac{1}{n} + \frac{\bar X^2}{n\left(\overline{X^2} - \bar X^2\right)}\right)\sigma^2\right),$$

$$\operatorname{Cov}\bigl(\hat\beta_0, \hat\beta_1\bigr) = -\,\frac{\bar X \sigma^2}{n\left(\overline{X^2} - \bar X^2\right)},$$

and that $\hat\sigma^2$ is independent of $\hat\beta_0$ and $\hat\beta_1$, with

$$\frac{n\hat\sigma^2}{\sigma^2} \sim \chi^2_{n-2}.$$

Suppose now that we want to find confidence intervals for the unknown parameters of the model, $\beta_0$, $\beta_1$ and $\sigma^2$. This is straightforward and very similar to the confidence intervals for the parameters of the normal distribution. For example, using that $n\hat\sigma^2/\sigma^2 \sim \chi^2_{n-2}$, if we find the constants $c_1$ and $c_2$ such that $\chi^2_{n-2}$…
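The sampling distributions and the chi-square fact stated above can be checked numerically. The following is a minimal simulation sketch, not part of the lecture: the true parameter values, the uniform design for the $X_i$, and the use of NumPy/SciPy are all my own assumptions. It fits the maximum likelihood estimates on many simulated samples, compares the empirical variance of $\hat\beta_1$ with the theoretical formula, and checks the coverage of the 95% confidence interval for $\sigma^2$ built from the quantiles of $\chi^2_{n-2}$, anticipating the construction the text begins.

```python
# Minimal simulation sketch (not from the lecture): checks the stated
# sampling distribution of beta1-hat and the chi-square-based confidence
# interval for sigma^2. True parameters and the design are assumed.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(0)

beta0, beta1, sigma = 1.0, 2.0, 0.5   # assumed "true" parameters
n = 50
X = rng.uniform(0.0, 1.0, size=n)     # fixed design, reused in every replication

def mle(X, Y):
    """Maximum likelihood estimates (beta0-hat, beta1-hat, sigma2-hat)."""
    xbar, ybar, x2bar = X.mean(), Y.mean(), (X**2).mean()
    b1 = ((X * Y).mean() - xbar * ybar) / (x2bar - xbar**2)
    b0 = ybar - b1 * xbar
    s2 = ((Y - b0 - b1 * X)**2).mean()  # MLE divides by n, not n - 2
    return b0, b1, s2

reps = 20000
est = np.array([mle(X, beta0 + beta1 * X + sigma * rng.standard_normal(n))
                for _ in range(reps)])

# Var(beta1-hat) should match sigma^2 / (n * (mean(X^2) - mean(X)^2)).
denom = n * ((X**2).mean() - X.mean()**2)
print("empirical var:", est[:, 1].var(), " theoretical:", sigma**2 / denom)

# 95% CI for sigma^2: from c1 <= n*sigma2hat/sigma^2 <= c2 we get
# n*sigma2hat/c2 <= sigma^2 <= n*sigma2hat/c1.
c1, c2 = chi2.ppf(0.025, n - 2), chi2.ppf(0.975, n - 2)
lower, upper = n * est[:, 2] / c2, n * est[:, 2] / c1
coverage = np.mean((lower <= sigma**2) & (sigma**2 <= upper))
print("CI coverage:", coverage)  # should be close to 0.95
```

Note that $\hat\sigma^2$ divides by $n$ rather than $n-2$, which is exactly why the pivot $n\hat\sigma^2/\sigma^2$ has $n-2$ degrees of freedom rather than $n$.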

## This note was uploaded on 10/11/2009 for the course STATISTICS 18.443, taught by Professor Dmitry Panchenko during the Spring '09 term at MIT.
