Applied Econometrics
William Greene
Department of Economics, Stern School of Business
2. Regression and Projection
Statistical Relationship

Objective: characterize the stochastic relationship between a variable and a set of 'related' variables.

Context: an inverse demand equation, P = α + βQ + γY, where Y is income. Q and P are two obviously related random variables; we are interested in studying the relationship between P and Q. By 'relationship' we mean (usually) covariation. (Cause and effect is problematic.)

Distinguish between the Bayesian and classical views of how this study would proceed. β is the 'parameter' of interest: a 'true parameter' (frequentist) or a characteristic of the state of the world that can only be described in probabilistic terms (Bayesian).

The end result of the study: an 'estimate of β' (classical) or an estimated distribution of β (Bayesian). The Bayesian counterpart to the classical point estimate is an estimate of the mean of the posterior distribution.

From this point forward, with only minor exceptions, we will focus on the classical methods.
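The classical estimate of β can be sketched numerically. The following is a minimal illustration, not part of the original notes: we simulate data from the inverse demand equation P = α + βQ + γY and recover point estimates by least squares. All parameter values and sample sizes are assumptions chosen for the example.

```python
import numpy as np

# Illustrative "true" parameters (assumed, not from the text).
alpha, beta, gamma = 10.0, -0.5, 0.2

rng = np.random.default_rng(0)
n = 10_000
Q = rng.uniform(1, 20, n)              # quantity
Y = rng.uniform(20, 80, n)             # income
P = alpha + beta * Q + gamma * Y + rng.normal(0, 1.0, n)

# Classical least squares: regress P on a constant, Q, and Y.
X = np.column_stack([np.ones(n), Q, Y])
coef, *_ = np.linalg.lstsq(X, P, rcond=None)
alpha_hat, beta_hat, gamma_hat = coef
print(beta_hat)   # a point estimate close to -0.5 in large samples
```

A Bayesian treatment of the same data would instead report a posterior distribution for β; the point estimate above is the frequentist counterpart.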
Bivariate Distribution: A Model for a Relationship Between Two Variables

We might posit a bivariate distribution for Q and P, f(Q,P). How does variation in P arise? With variation in Q, and with random variation in its distribution. There exists a conditional distribution f(P|Q) and a conditional mean function, E[P|Q]. Variation in P arises because of:
- variation in the mean,
- variation around the mean,
- (possibly) variation in a covariate, Y.
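The conditional mean function can be seen directly in simulated data. As a sketch (the joint distribution below is an illustrative assumption), averaging P within narrow bins of Q approximates E[P|Q]:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
# Assumed joint distribution: E[P|Q] = 5 + 0.8 Q by construction.
Q = rng.normal(10, 2, n)
P = 5 + 0.8 * Q + rng.normal(0, 1, n)

# Average P within each Q-bin to approximate the conditional mean.
bins = np.linspace(6, 14, 17)
idx = np.digitize(Q, bins)
cond_mean = np.array([P[idx == k].mean() for k in range(1, len(bins))])
bin_mid = 0.5 * (bins[:-1] + bins[1:])

max_err = np.max(np.abs(cond_mean - (5 + 0.8 * bin_mid)))
print(max_err)   # small: the binned means track E[P|Q]
```

The two sources of variation in P are visible here: movement along E[P|Q] as Q varies, and scatter around the conditional mean within each bin.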
Implications

- Structure is the theory.
- Regression is the conditional mean.
- There is always a conditional mean. It may not equal the structure. It may be linear in the same variables.

What is the implication for least squares estimation?
- LS estimates regressions.
- LS does not necessarily estimate structures.
- Structures may not be estimable: they may not be identified.
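The gap between regression and structure can be made concrete with a simultaneous-equations sketch. The supply/demand system below is an illustrative assumption (not from the notes): P and Q are jointly determined, so least squares recovers the regression of P on Q, which here equals neither structural slope.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
eps_d = rng.normal(0, 1, n)            # demand shock (assumed)
eps_s = rng.normal(0, 1, n)            # supply shock (assumed)

# Structural system (illustrative):
#   demand: Q = 20 - P + eps_d   (inverse demand: P = 20 - Q + eps_d)
#   supply: Q =  2 + P + eps_s
# Market clearing determines P and Q simultaneously:
P = (18 + eps_d - eps_s) / 2.0
Q = 2 + P + eps_s

# Least squares slope of P on Q: the regression, not the structure.
slope = np.cov(P, Q)[0, 1] / np.var(Q)
print(slope)   # near 0, far from the inverse-demand slope of -1
```

With the symmetric shocks assumed here, cov(P, Q) is zero, so the regression slope is zero even though the structural inverse-demand slope is -1: the structure is not identified from the regression alone.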
The conditional mean function is the regression function.

P = E[P|Q] + (P - E[P|Q]) = E[P|Q] + ε

E[ε|Q] = 0 = E[ε]. Proof? Any takers? (Law of iterated expectations.)

The variance of the conditional random variable is the conditional variance, or the scedastic function.

A "trivial relationship" may be written as P = h(Q) + ε, where the random variable ε = P - h(Q) has zero mean by construction. This looks like a regression "model" of sorts, but h(Q) equals E[P|Q] only for one specific function h.

An extension: Can we carry …
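The decomposition above can be checked numerically. In this sketch (the joint distribution is an illustrative assumption, with a deliberately nonlinear regression function), ε = P - E[P|Q] has mean near zero within every Q-bin, hence near zero unconditionally, as the law of iterated expectations implies:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500_000
Q = rng.normal(0, 1, n)
# Assumed model: E[P|Q] = exp(Q), a nonlinear conditional mean.
P = np.exp(Q) + rng.normal(0, 0.5, n)

# The regression residual: deviation of P from its conditional mean.
eps = P - np.exp(Q)

# E[eps|Q] should be (near) zero in every bin of Q ...
bins = np.digitize(Q, np.linspace(-2, 2, 9))
cond_means = np.array([eps[bins == k].mean() for k in np.unique(bins)])
# ... and therefore E[eps] is (near) zero overall.
print(np.abs(cond_means).max(), abs(eps.mean()))
```

Note that regressing P linearly on Q here would estimate a different function from E[P|Q] = exp(Q); only h(Q) = E[P|Q] makes the residual mean-zero conditional on Q, not just unconditionally.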