Econometrics-I-15

Shephard's Lemma states that the cost-minimizing factor demands are given by Xm = ∂C(·)/∂Pm. Taking logs gives the factor share equations:

    ∂log C(·)/∂log Pm = (Pm/C) · ∂C(·)/∂Pm = Pm·Xm/C,

which is the proportion of total cost spent on factor m.
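The lemma is easy to verify numerically. The sketch below is an illustrative check, not from the slides: it uses a hypothetical Cobb-Douglas cost function whose cost shares are known constants (0.4 and 0.6) and confirms that the log-price derivative of log cost recovers exactly those shares.

```python
import math

# Hypothetical Cobb-Douglas cost function C(y, pk, pl) = y * pk^0.4 * pl^0.6,
# chosen so the true cost shares are the known constants 0.4 and 0.6.
def cost(y, pk, pl):
    return y * pk**0.4 * pl**0.6

def dlogC_dlogP(f, y, pk, pl, factor, h=1e-6):
    """Central-difference estimate of d log C / d log P_m."""
    def logC(scale_k, scale_l):
        return math.log(f(y, pk * scale_k, pl * scale_l))
    if factor == "k":
        return (logC(math.exp(h), 1.0) - logC(math.exp(-h), 1.0)) / (2 * h)
    return (logC(1.0, math.exp(h)) - logC(1.0, math.exp(-h))) / (2 * h)

# Shephard's lemma: X_m = dC/dP_m, hence P_m X_m / C = d log C / d log P_m.
sk = dlogC_dlogP(cost, y=2.0, pk=3.0, pl=5.0, factor="k")
sl = dlogC_dlogP(cost, y=2.0, pk=3.0, pl=5.0, factor="l")
print(sk, sl)  # both very close to the true shares 0.4 and 0.6
```

The same check applies to any cost function; for the translog, the log-price derivative is linear in the logs of output and prices, which is what makes the share equations estimable as linear regressions.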
Part 15: Generalized Regression Applications

Translog
Restrictions
Data: Christensen and Greene (C&G), N = 123
Least Squares Estimate of Cost Function

Ordinary least squares regression
LHS=C       Mean                =   -.38339
            Standard deviation  =   1.53847
            Number of observs.  =       123
Model size  Parameters          =        10
            Degrees of freedom  =       113
Residuals   Sum of squares      =   2.32363
            Standard error of e =    .14340
Fit         R-squared           =    .99195
            Adjusted R-squared  =    .99131
Model test  F[9, 113] (prob)    = 1547.7 (.0000)
--------+-------------------------------------------------------------
Variable| Coefficient   Standard Error  t-ratio  P[|T|>t]   Mean of X
--------+-------------------------------------------------------------
Constant|  -7.79653        6.28338      -1.241     .2172
       Y|    .42610***      .14318       2.976     .0036      8.17947
      YY|    .05606***      .00623       8.993     .0000      35.1125
      PK|   2.80754        2.11625       1.327     .1873       .88666
      PL|   -.02630 (!)    2.54421       -.010     .9918      5.58088
     PKK|    .69161         .43475       1.591     .1144       .43747
     PLL|    .10325         .51197        .202     .8405      15.6101
     PKL|   -.48223         .41018      -1.176     .2422      5.00507
      YK|   -.07676**       .03659      -2.098     .0381      7.25281
      YL|    .01473         .02888        .510     .6110      45.6830
--------+-------------------------------------------------------------
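The regression above can be reproduced mechanically once the ten translog regressors are built. The sketch below uses simulated stand-in data, not the actual Christensen and Greene sample, and the one-half scaling on the squared terms is an assumption consistent with the reported regressor means (YY ≈ ½Y², with mean 35.1 against a Y mean of 8.18).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 123                              # same sample size as on the slide
lny  = rng.normal(8.2, 1.0, n)       # simulated stand-ins for log output
lnpk = rng.normal(0.9, 0.3, n)       # and the two log factor prices
lnpl = rng.normal(5.6, 0.4, n)

# Translog regressors in the slide's order:
# Constant, Y, YY, PK, PL, PKK, PLL, PKL, YK, YL
X = np.column_stack([
    np.ones(n), lny, 0.5 * lny**2,
    lnpk, lnpl, 0.5 * lnpk**2, 0.5 * lnpl**2, lnpk * lnpl,
    lny * lnpk, lny * lnpl,
])

# Hypothetical "true" coefficients, loosely in the range of the OLS column
beta_true = np.array([-7.8, .43, .056, 2.8, -.03, .69, .10, -.48, -.077, .015])
lnc = X @ beta_true + rng.normal(0.0, 0.14, n)   # log cost with noise

beta_hat, *_ = np.linalg.lstsq(X, lnc, rcond=None)
```

With a residual standard deviation of 0.14 against a log-cost standard deviation above 1, the fit is nearly perfect, which mirrors the .99 R-squared on the slide.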
FGLS

Criterion function for GLS is the log-likelihood.
Iteration 0, GLS = 514.2530
Iteration 1, GLS = 519.8472
Iteration 2, GLS = 519.9199

Estimates for equation: C
Generalized least squares regression
LHS=C       Mean                =   -.38339
Residuals   Sum of squares      =   2.24766
            Standard error of e =    .14103
Fit         R-squared           =    .99153
            Adjusted R-squared  =    .99085
Model test  F[9, 113] (prob)    = 1469.3 (.0000)
--------+-------------------------------------------------------------
Variable| Coefficient   Standard Error  b/St.Er.  P[|Z|>z]   Mean of X
--------+-------------------------------------------------------------
Constant|  -9.51337**      4.26900      -2.228      .0258
       Y|    .48204***      .09725       4.956      .0000      8.17947
      YY|    .04449***      .00423      10.521      .0000      35.1125
      PK|   2.48099*       1.43621       1.727      .0841       .88666
      PL|    .61358        1.72652        .355      .7223      5.58088
     PKK|    .65620**       .29491       2.225      .0261       .43747
     PLL|   -.03048         .34730       -.088      .9301      15.6101
     PKL|   -.42610         .27824      -1.531      .1257      5.00507
      YK|   -.06761***      .02482      -2.724      .0064      7.25281
      YL|    .01779         .01959        .908      .3640      45.6830
--------+-------------------------------------------------------------
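The iterations above alternate between estimating the cross-equation error covariance and re-solving the GLS problem. The same idea can be sketched for a generic two-equation SUR system; the code below is an illustration of iterated FGLS on simulated data, not the cost-and-shares system that produced the slide.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 123
x1 = np.column_stack([np.ones(n), rng.normal(size=n)])   # regressors, eq. 1
x2 = np.column_stack([np.ones(n), rng.normal(size=n)])   # regressors, eq. 2
sigma = np.array([[1.0, 0.7], [0.7, 1.5]])               # true error covariance
e = rng.multivariate_normal([0.0, 0.0], sigma, size=n)
y1 = x1 @ [1.0, 2.0] + e[:, 0]
y2 = x2 @ [-1.0, 0.5] + e[:, 1]

# Stack the system: y = X beta + eps with Var(eps) = Sigma kron I_n
X = np.zeros((2 * n, 4))
X[:n, :2], X[n:, 2:] = x1, x2
y = np.concatenate([y1, y2])

beta = np.linalg.lstsq(X, y, rcond=None)[0]   # start at equation-by-equation OLS
for _ in range(5):                            # iterate Sigma-hat and GLS
    resid = (y - X @ beta).reshape(2, n)
    S = resid @ resid.T / n                   # 2x2 residual covariance estimate
    Oinv = np.kron(np.linalg.inv(S), np.eye(n))
    beta = np.linalg.solve(X.T @ Oinv @ X, X.T @ Oinv @ y)
```

Iterating FGLS to convergence this way yields the maximum likelihood estimates under normality, which is why the slide labels the GLS criterion a log-likelihood.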
Maximum Likelihood Estimates

Constrained MLE for Multivariate Regression Model
First iteration: 0   F =  -48.2305   log|W| =  -7.72939   g'inv(H)g = 2.0977
Last iteration:  5   F =  508.8056   log|W| = -16.78689   g'inv(H)g =  .0000
Number of observations used in estimation = 123

Model (cross-equation restrictions link the three equations):
  C  = B0 + BK·PK + BL·PL + CKK·PKK + CLL·PLL + CKL·PKL + CY·Y + CYY·YY + CYK·YK + CYL·YL
  SK = BK + CKK·PK + CKL·PL + CYK·Y
  SL = BL + CKL·PK + CLL·PL + CYL·Y
--------+------------------------------------------------------------------
Variable| Coefficient  Standard Error  b/St.Er.  P[|Z|>z]    (FGLS)    (OLS)
--------+------------------------------------------------------------------
      B0|  -6.71218***     .21594      -31.084    .0000    -9.51337  -7.79653
      CY|    .58239***     .02737       21.282    .0000      .48204    .42610
     CYY|    .05016***     .00371       13.528    .0000      .04449    .05606
      BK|    .22965***     .06757        3.399    .0007     2.48099   2.80754
      BL|   -.13562*       .07948       -1.706    .0879      .61358   -.02630
     CKK|    .11603***     .01817        6.385    .0000      .65620    .69161
     CLL|    .07801***     .01563        4.991    .0000     -.03048    .10325
     CKL|   -.01200        .01343        -.894    .3713     -.42610   -.48223
     CYK|   -.00473*       .00250       -1.891    .0586     -.06761   -.07676
     CYL|   -.01792***     .00211       -8.477    .0000      .01779    .01473
--------+------------------------------------------------------------------
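With the cross-equation restrictions imposed, the share equations reuse the cost-function coefficients. As a quick illustration (my own calculation, not shown on the slide), plugging the restricted ML estimates and the sample means of the regressors into the share equations gives the implied average cost shares:

```python
# Restricted ML coefficients as reported on the slide
bk, bl = 0.22965, -0.13562
ckk, cll, ckl = 0.11603, 0.07801, -0.01200
cyk, cyl = -0.00473, -0.01792

# Sample means of lnPK, lnPL, lnY (the "Mean of X" column of the OLS output)
lnpk, lnpl, lny = 0.88666, 5.58088, 8.17947

# Share equations implied by Shephard's lemma:
#   SK = BK + CKK lnPK + CKL lnPL + CYK lnY
#   SL = BL + CKL lnPK + CLL lnPL + CYL lnY
sk = bk + ckk * lnpk + ckl * lnpl + cyk * lny
sl = bl + ckl * lnpk + cll * lnpl + cyl * lny
print(round(sk, 3), round(sl, 3))  # roughly 0.227 and 0.143
```

Since cost shares sum to one, the remainder of expenditure is the share of the omitted factor.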
Vector Autoregression

The vector autoregression (VAR) model is one of the most successful, flexible, and easy-to-use models for the analysis of multivariate time series. It is a natural extension of the univariate autoregressive model to dynamic multivariate time series. The VAR model has proven especially useful for describing the dynamic behavior of economic and financial time series and for forecasting. It often provides forecasts superior to those from univariate time series models and from elaborate theory-based simultaneous equations models. Forecasts from VAR models are quite flexible because they can be …
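A minimal VAR(1) can be estimated by running OLS equation by equation, since every equation of an unrestricted VAR shares the same regressors. The sketch below fits y_t = c + A y_{t-1} + e_t on simulated (hypothetical) data and produces a one-step-ahead forecast.

```python
import numpy as np

rng = np.random.default_rng(2)
A = np.array([[0.5, 0.1], [0.0, 0.8]])   # hypothetical true lag matrix
c = np.array([1.0, -0.5])
T = 500
y = np.zeros((T, 2))
for t in range(1, T):                    # simulate a stationary bivariate VAR(1)
    y[t] = c + A @ y[t - 1] + rng.normal(0.0, 0.3, 2)

# Equation-by-equation OLS: regress y_t on [1, y_{t-1}]
X = np.column_stack([np.ones(T - 1), y[:-1]])
B, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
c_hat, A_hat = B[0], B[1:].T             # B rows: intercept, then lag coefficients

y_forecast = c_hat + A_hat @ y[-1]       # one-step-ahead forecast
```

Multi-step forecasts follow by iterating the fitted recursion forward, which is the flexibility the passage above refers to.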