GR5205_Fall2019_Sep20_annotated.pdf


DEFINITION 2.2 The residual, denoted $e_i$, is the difference between the observed value $y_i$ and its corresponding fitted value $\hat{y}_i$:
$$e_i = y_i - \hat{y}_i.$$
The fitted value is given by $\hat{y}_i = \hat{\beta}_0 + \hat{\beta}_1 x_i$, where $\hat{\beta}_0$ and $\hat{\beta}_1$ are the least squares estimators of $\beta_0$ and $\beta_1$.
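As a quick numeric illustration of Definition 2.2 (a minimal sketch; the x and y values below are made up for this example, not taken from the notes), the fitted values and residuals follow directly from the least squares estimates:

```python
import numpy as np

# Illustrative data (not from the notes)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 3.6, 5.2, 5.8])

# Least squares estimators of beta_0 and beta_1
beta1_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
beta0_hat = y.mean() - beta1_hat * x.mean()

# Fitted values yhat_i = beta0_hat + beta1_hat * x_i, residuals e_i = y_i - yhat_i
y_hat = beta0_hat + beta1_hat * x
e = y - y_hat

print("beta0_hat =", beta0_hat, "beta1_hat =", beta1_hat)
print("residuals:", e)
print("sum of residuals (approximately 0):", e.sum())
```

That the residuals sum to (numerically) zero is a consequence of the least squares normal equations, not a coincidence of this particular data set.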
2.4 Maximum Likelihood Estimation of the Simple Linear Regression Model

Consider a random sample $X_1, X_2, \ldots, X_n$, each having common probability density function (or probability mass function) $f(x_i \mid \theta)$, where $\theta$ is a generic parameter of that distribution; $\theta$ could also be a vector of parameters. The joint probability density function (or joint probability mass function) is
$$f(x_1, x_2, \ldots, x_n \mid \theta) = f(x_1 \mid \theta) f(x_2 \mid \theta) \cdots f(x_n \mid \theta).$$
Define the likelihood function as
$$L(\theta; x_1, x_2, \ldots, x_n) = f(x_1, x_2, \ldots, x_n \mid \theta). \tag{2.4}$$
It is often convenient to work with the log-likelihood function
$$\log L(\theta; x_1, x_2, \ldots, x_n) = \log f(x_1, x_2, \ldots, x_n \mid \theta).$$

DEFINITION 2.3 The maximum likelihood estimate $\hat{\theta}$ is the value of $\theta$ that maximizes the likelihood function (2.4), so that
$$L(\hat{\theta}; x_1, x_2, \ldots, x_n) \geq L(\theta; x_1, x_2, \ldots, x_n).$$

Example 5. Let $X_1, X_2, \ldots, X_n$ be a random sample from an exponential distribution, each having common probability density function
$$f(x_i \mid \mu) = \frac{1}{\mu} \exp\!\left(-\frac{x_i}{\mu}\right), \qquad x_i \geq 0.$$
Find the maximum likelihood estimator of $\mu$.
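For Example 5, the log-likelihood is $\log L(\mu) = -n \log \mu - \frac{1}{\mu}\sum_i x_i$; setting its derivative to zero gives $\hat{\mu} = \bar{x}$, the sample mean. The sketch below checks this numerically on simulated data (the sample size, seed, and true scale are arbitrary choices for illustration):

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Simulated exponential sample, purely for illustration
rng = np.random.default_rng(0)
x = rng.exponential(scale=3.0, size=200)

# Negative log-likelihood: n*log(mu) + sum(x_i)/mu
def neg_log_likelihood(mu):
    return len(x) * np.log(mu) + x.sum() / mu

result = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 100.0), method="bounded")

print("numerical MLE:", result.x)
print("sample mean  :", x.mean())  # closed-form MLE: mu_hat = xbar
```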
Maximum likelihood estimators of $\beta_0$, $\beta_1$, and $\sigma^2$. Let $Y_1, Y_2, \ldots, Y_n$ follow the simple linear regression model $Y_i = \beta_0 + \beta_1 x_i + \varepsilon_i$, where the errors $\varepsilon_i$ are independent $N(0, \sigma^2)$.
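A sketch of where this goes (not the notes' derivation, and all data values below are simulated purely for illustration): maximizing the normal log-likelihood over $(\beta_0, \beta_1, \sigma^2)$ reproduces the least squares estimates of $\beta_0$ and $\beta_1$, while the MLE of $\sigma^2$ is $\mathrm{SSE}/n$ rather than the unbiased $\mathrm{SSE}/(n-2)$:

```python
import numpy as np
from scipy.optimize import minimize

# Simulated data from a simple linear regression model (illustrative values only)
rng = np.random.default_rng(1)
n = 100
x = rng.uniform(0, 10, size=n)
y = 1.0 + 2.0 * x + rng.normal(scale=1.5, size=n)

# Negative log-likelihood of the normal-error model; sigma^2 is parameterized
# on the log scale so the optimizer stays in the positive region.
def neg_log_likelihood(params):
    beta0, beta1, log_sigma2 = params
    sigma2 = np.exp(log_sigma2)
    resid = y - beta0 - beta1 * x
    return 0.5 * n * np.log(2 * np.pi * sigma2) + 0.5 * np.sum(resid ** 2) / sigma2

start = np.array([y.mean(), 0.0, np.log(y.var())])
fit = minimize(neg_log_likelihood, x0=start)
beta0_mle, beta1_mle, sigma2_mle = fit.x[0], fit.x[1], np.exp(fit.x[2])

# Least squares estimates for comparison
beta1_ls = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
beta0_ls = y.mean() - beta1_ls * x.mean()
sse = np.sum((y - beta0_ls - beta1_ls * x) ** 2)

print("MLE:", beta0_mle, beta1_mle, sigma2_mle)
print("LS :", beta0_ls, beta1_ls, sse / n)  # sigma^2 MLE equals SSE/n
```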
