# note08 - Summary of Least Squares Estimator


## 1 Summary of the Least Squares Estimator

The general linear model is

$$y_i = \beta_1 + \beta_2 x_{i2} + \cdots + \beta_K x_{iK} + e_i, \qquad i = 1, \ldots, N.$$

### Model Specification

- Some regressors can be squares of another variable (e.g., $x_{i3} = x_{i2}^2$).
- The dependent variable and the regressors can be the levels of the original variables, or some function of the original variables (e.g., $\ln(y_i)$).
- Some regressors can be dummy variables to allow different intercepts or slopes across different groups.
- These factors must be taken into consideration when the marginal effects of regressors are analyzed.

### How to Estimate

The OLS (ordinary least squares) estimators $b_1, \ldots, b_K$ of the coefficients minimize the sum of squared errors

$$S(\beta_1, \ldots, \beta_K) = \sum_{i=1}^{N} (y_i - \beta_1 - \beta_2 x_{i2} - \cdots - \beta_K x_{iK})^2.$$

The OLS estimator of the variance of the error term is

$$\hat{\sigma}^2 = \frac{\sum_{i=1}^{N} \hat{e}_i^2}{N - K}, \qquad \text{where } \hat{e}_i = y_i - b_1 - b_2 x_{i2} - \cdots - b_K x_{iK}.$$

### Properties of OLS Estimators

The OLS estimators of the coefficients are BLUE (best linear unbiased estimators) if

1. $E(e_i) = 0$ for all $i$
2. $\mathrm{var}(e_i) = \sigma^2$ for all $i$ (homoskedasticity)
3. $\mathrm{cov}(e_i, e_j) = 0$ for all $i \neq j$

The OLS estimator $\hat{\sigma}^2$ is the best quadratic unbiased estimator under the same assumptions.

### Statistical Inferences - Hypothesis, Test Statistic, and Rejection Region

We need an additional assumption about the distribution of the error terms:

4. $e_i \sim N(0, \sigma^2)$

#### Specification of Hypotheses

Specify the null hypothesis $H_0$ and the alternative hypothesis $H_1$. For example:

(a) $H_0: \beta_k = c$ vs. $H_1: \beta_k \neq c$ (two-tailed test of a single coefficient)
(b) $H_0: \beta_k \leq c$ vs. $H_1: \beta_k > c$ (one-tailed test of a single coefficient)
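The least-squares formulas described above can be sketched in code. The following is a minimal illustration for the simple ($K = 2$) regression $y_i = \beta_1 + \beta_2 x_i + e_i$, using only the Python standard library; the function name and the data are illustrative, not part of the original notes.

```python
# Minimal OLS for the simple regression y_i = b1 + b2*x_i + e_i.
# b2 = sum((x_i - xbar)(y_i - ybar)) / sum((x_i - xbar)^2),  b1 = ybar - b2*xbar,
# and sigma2_hat = sum(ehat_i^2) / (N - K) with K = 2 coefficients.

def ols_simple(x, y):
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    sxx = sum((xi - xbar) ** 2 for xi in x)
    b2 = sxy / sxx
    b1 = ybar - b2 * xbar
    resid = [yi - b1 - b2 * xi for xi, yi in zip(x, y)]
    sigma2_hat = sum(e ** 2 for e in resid) / (n - 2)  # divide by N - K
    return b1, b2, sigma2_hat

# Data lying exactly on y = 1 + 2x: residuals are zero,
# so b1 = 1, b2 = 2, and sigma2_hat = 0.
x = [1.0, 2.0, 3.0, 4.0]
y = [3.0, 5.0, 7.0, 9.0]
b1, b2, s2 = ols_simple(x, y)
```

With $K > 2$ regressors the same idea applies, but the minimization is solved with matrix algebra rather than the two closed-form sums used here.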

(c) $H_0: \beta_2 + \beta_3 = c$ vs. $H_1: \beta_2 + \beta_3 \neq c$ (two-tailed test of a linear combination of coefficients)
(d) $H_0: \beta_2 = 0 \text{ and } \beta_3 = 0$ vs. $H_1: \beta_2 \neq 0 \text{ or } \beta_3 \neq 0$ (joint test)

#### Test Statistic

Choose a statistic that is computable and that can be used to test the hypothesis. For example,

(a) & (b) $t = \dfrac{b_k - c}{\mathrm{se}(b_k)} \sim t_{(N-K)}$ under $H_0$

(c) $t = \dfrac{(b_2 + b_3) - c}{\mathrm{se}(b_2 + b_3)} \sim t_{(N-K)}$ under $H_0$

(d) $F = \dfrac{(SSE_R - SSE_U)/q}{SSE_U/(N-K)} \sim F_{(q,\, N-K)}$ under $H_0$

where $q$ is the number of restrictions under $H_0$ ($q = 2$ in hypothesis (d)), and $SSE_R$ and $SSE_U$ are the sums of squared errors of the restricted and unrestricted models.

#### Decision Rule - Choice of Rejection Region

The decision rule specifies when to reject $H_0$ and when to accept $H_0$.

(a) & (c) Reject $H_0$ if $t \geq t_c$ or $t \leq -t_c$ (i.e., $|t| \geq t_c$)
(b) Reject $H_0$ if $t \geq t_c$
(d) Reject $H_0$ if $F \geq F_c$

Consequences of the decision rule:

|                | $H_0$ is correct   | $H_0$ is false     |
|----------------|--------------------|--------------------|
| reject $H_0$   | Type I error       | (correct decision) |
| accept $H_0$   | (correct decision) | Type II error      |

#### How to Choose the Critical Values?

Choose the critical values such that the probability of Type I error is equal to the prespecified level of significance $\alpha$:

(a) & (c) $P(|t| \geq t_c \mid H_0) = \alpha$
(b) $P(t \geq t_c \mid H_0) = \alpha$
(d) $P(F \geq F_c \mid H_0) = \alpha$
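The two-tailed decision rule for case (a) can be sketched as follows. The numbers are made up for illustration, and the critical value is treated as a given input; in practice it would be read from a $t$ table for the chosen $\alpha$ and $N - K$ degrees of freedom.

```python
# Two-tailed t test of H0: beta_k = c (case (a)).
# Compute t = (b_k - c) / se(b_k) and reject H0 if |t| >= t_c.

def t_test_two_tailed(b_k, c, se_bk, t_c):
    t = (b_k - c) / se_bk
    reject = abs(t) >= t_c
    return t, reject

# Illustrative values: estimate b_k = 0.72 with se(b_k) = 0.30,
# testing H0: beta_k = 0 against H1: beta_k != 0.
# Assumed critical value t_c = 2.024 (5% level, 38 df, from a t table).
t, reject = t_test_two_tailed(0.72, 0.0, 0.30, 2.024)
# t = 2.4, which exceeds 2.024, so H0 is rejected at the 5% level.
```

The one-tailed rule in (b) differs only in dropping the absolute value (reject if $t \geq t_c$), and the $F$ rule in (d) compares $F$ against an upper critical value $F_c$.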
#### p-value of a Test

The p-value is the smallest level of significance at which $H_0$ is rejected for a given value of the test statistic; $H_0$ is rejected whenever the p-value is less than or equal to $\alpha$. (The full notes include figures showing how to find the p-value when the value of the test statistic is 2.41; the figures are not reproduced in this preview.)
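As a rough check of the worked value, the two-tailed p-value for a test statistic of 2.41 can be computed with the standard normal distribution as a large-sample approximation to the $t$ distribution (the exact $t$ p-value would be slightly larger for small $N - K$):

```python
import math

# Two-tailed p-value for a test statistic of 2.41, using the standard
# normal as a large-sample approximation to the t distribution:
# p = P(|Z| >= 2.41) = 2 * (1 - Phi(2.41)) = erfc(2.41 / sqrt(2)).
t_value = 2.41
p_value = math.erfc(t_value / math.sqrt(2.0))
# p is about 0.016, so H0 is rejected at alpha = 0.05
# but not at alpha = 0.01.
```

This matches the decision-rule logic above: rejection at any $\alpha$ at or above the p-value, acceptance below it.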

## Chapter 8. Heteroskedasticity

In this chapter we consider a case in which the assumption of homoskedastic errors is violated. Assumption (2) says that the variances of the error terms are all identical:

(2) $\mathrm{var}(e_i) = \sigma^2$ for all $i$

If the variance is not the same for all individuals, the errors are called heteroskedastic. What we would like to learn in this chapter is:

(i) What are the properties of the OLS estimator if homoskedasticity is not valid?
(ii) What do we do if we know that the errors are heteroskedastic?
(iii) How do we detect heteroskedasticity?

### Consequences of Heteroskedasticity

(a) The OLS estimator of the coefficients is still unbiased, because unbiasedness requires only assumption (1).
(b) The OLS estimator is no longer the best (not BLUE).
(c) The OLS estimator of the error variance is incorrect.
(d) The OLS estimator of the variance of the coefficient estimators is incorrect, so the usual interval estimates and hypothesis tests based on it are unreliable.
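A violation of assumption (2) is easy to simulate. The sketch below, using only the Python standard library, generates errors whose standard deviation grows with the regressor (a common textbook form, $\mathrm{var}(e_i) = \sigma^2 x_i^2$, chosen here for illustration) and shows that the error variance differs across observations:

```python
import random

# Simulate heteroskedastic errors with sd(e_i) proportional to x_i,
# i.e. var(e_i) = (0.5 * x_i)^2, which violates assumption (2).
random.seed(0)
n = 2000
x = [1.0 + i / n for i in range(n)]             # x grows from 1 toward 2
e = [random.gauss(0.0, 0.5 * xi) for xi in x]   # error sd grows with x_i

def sample_var(v):
    m = sum(v) / len(v)
    return sum((vi - m) ** 2 for vi in v) / (len(v) - 1)

var_low = sample_var(e[: n // 2])    # errors where x is small
var_high = sample_var(e[n // 2 :])   # errors where x is large
# var_high clearly exceeds var_low: the error variance is not constant,
# which is exactly the pattern the detection tests in this chapter look for.
```

Under homoskedasticity the two halves would have roughly equal sample variances; the systematic gap here is the signature that the usual OLS variance formulas no longer apply.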

## This note was uploaded on 08/22/2011 for the course ECON 7436 taught by Professor Su during the Three '11 term at University of Adelaide.
