ECMT 1020 Summer School 09 Multiple Regression Fundamentals
Regression

Regression modelling attempts to represent the empirical behaviour of a variable of interest (the dependent variable) as a function of some explanatory variables (the independent variables) and a random error term, as in the model below.

Y_i = f(X_1i, X_2i, X_3i, ..., X_ki) + ε_i

The first part of this representation, f(X_1i, ..., X_ki), is the systematic component; the error term ε_i is the unsystematic component. The exact functional form and the particular independent variables to include in the model are matters of judgment.
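To make the decomposition concrete, here is a minimal simulation sketch in Python (not from the original notes; the functional form, coefficients and error distribution are all assumed purely for illustration). The systematic component is a chosen function of two explanatory variables, and the unsystematic component is a random error added to it.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100

    # Two explanatory (independent) variables, generated arbitrarily
    x1 = rng.uniform(0, 10, n)
    x2 = rng.uniform(0, 5, n)

    # Systematic component: an assumed functional form f(X1, X2)
    systematic = 2.0 + 0.5 * x1 - 1.2 * x2

    # Unsystematic component: random error term epsilon
    epsilon = rng.normal(0, 1.0, n)

    # Observed dependent variable = systematic part + unsystematic part
    y = systematic + epsilon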
Choosing Variables and Functional Form

Choosing appropriate independent variables relies heavily on economic theory, logic, the observed time series and the situational experience of the modeller. Typically, the modeller considers these factors and selects a candidate group of variables, most of which may end up as the independent variables in the final regression model. The functional form is a separate issue. Once again, logic, theory, experience and the observed time series suggest the functional form of the model. How the predictive model will be used and what information is required may also influence the functional form.
The Multiple Regression Model

The relationship between one dependent variable and two or more independent variables may be a linear function.

Population model:  Y_i = β_0 + β_1 X_1i + β_2 X_2i + ... + β_p X_pi + ε_i

where β_0 is the population Y-intercept, β_1, ..., β_p are the population slopes, Y_i is the dependent (response) variable, X_1i, ..., X_pi are the independent (explanatory) variables, and ε_i is the random error.

Sample model:  Y_i = b_0 + b_1 X_1i + b_2 X_2i + ... + b_p X_pi + e_i

where b_0, b_1, ..., b_p are the sample estimates of the population coefficients and e_i = Y_i − Ŷ_i is the residual, the difference between the observed value and the fitted value Ŷ_i.
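A short illustration in Python (the data and the coefficient values b_0, b_1, b_2 are assumed purely for this example, not taken from the notes) of how the sample model with two regressors produces fitted values and residuals:

    import numpy as np

    # Tiny illustrative data set: two explanatory variables, five observations
    x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    x2 = np.array([0.5, 1.0, 0.5, 2.0, 1.5])
    y  = np.array([3.1, 3.9, 5.2, 4.6, 6.0])

    # Hypothetical sample estimates b0, b1, b2
    b0, b1, b2 = 2.0, 0.8, -0.3

    y_hat = b0 + b1 * x1 + b2 * x2   # fitted values from the sample model
    e = y - y_hat                    # residuals e_i = Y_i - fitted Y_i
    print(e)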
Estimation

Suppose the population regression model is

Y_i = β_0 + β_1 X_1i + β_2 X_2i + ε_i

A joint sample of observations on Y_i, X_1i and X_2i is collected. We need to determine sample estimates of the model coefficients β_0, β_1 and β_2 (these sample estimates are b_0, b_1 and b_2). Estimation of regression model coefficients is typically via a technique called Ordinary Least Squares (OLS). OLS minimises the sum of squared deviations between the actual Y_i values and the predicted Y_i values by choosing appropriate values of b_0, b_1 and b_2.
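A minimal OLS sketch in Python (the simulated data and "true" coefficients are assumptions made for this example, not part of the notes). np.linalg.lstsq chooses b_0, b_1 and b_2 so that the sum of squared deviations between the actual and predicted Y_i values is as small as possible:

    import numpy as np

    # Simulate data from a known two-regressor model (coefficients assumed for illustration)
    rng = np.random.default_rng(1)
    n = 200
    x1 = rng.uniform(0, 10, n)
    x2 = rng.uniform(0, 5, n)
    y = 1.5 + 0.7 * x1 - 0.4 * x2 + rng.normal(0, 1.0, n)

    # OLS: choose b0, b1, b2 to minimise the sum of squared residuals
    X = np.column_stack([np.ones(n), x1, x2])       # design matrix with an intercept column
    b, ssr, *_ = np.linalg.lstsq(X, y, rcond=None)  # least-squares solution

    print("estimates b0, b1, b2:", b)
    print("sum of squared residuals:", ssr)

With a reasonably large sample the estimates should lie close to the values used to simulate the data, which is one way to sanity-check the estimation step.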
Why Use OLS?