Solutions Manual for Statistical Inference

Chapter 7
Point Estimation

7.1 For each value of $x$, the MLE $\hat\theta$ is the value of $\theta$ that maximizes $f(x\mid\theta)$. These values are given in the following table.

    $x$            1    2         3    4
    $\hat\theta$   1    2 or 3    3    3

At $x = 2$, $f(2\mid 2) = f(2\mid 3) = 1/4$ are both maxima, so both $\hat\theta = 2$ and $\hat\theta = 3$ are MLEs.

7.2 a. The likelihood is
$$
L(\beta\mid\mathbf{x}) = \prod_{i=1}^n \frac{1}{\Gamma(\alpha)\beta^\alpha}\, x_i^{\alpha-1} e^{-x_i/\beta}
= \frac{1}{\Gamma(\alpha)^n \beta^{n\alpha}} \Bigl[\prod_{i=1}^n x_i\Bigr]^{\alpha-1} e^{-\sum_i x_i/\beta},
$$
so
$$
\log L(\beta\mid\mathbf{x}) = -n\log\Gamma(\alpha) - n\alpha\log\beta + (\alpha-1)\log\Bigl[\prod_{i=1}^n x_i\Bigr] - \frac{\sum_i x_i}{\beta}
$$
and
$$
\frac{\partial \log L}{\partial\beta} = -\frac{n\alpha}{\beta} + \frac{\sum_i x_i}{\beta^2}.
$$
Set the partial derivative equal to 0 and solve for $\beta$ to obtain $\hat\beta = \sum_i x_i/(n\alpha)$. To check that this is a maximum, calculate
$$
\left.\frac{\partial^2 \log L}{\partial\beta^2}\right|_{\beta=\hat\beta}
= \left.\left(\frac{n\alpha}{\beta^2} - \frac{2\sum_i x_i}{\beta^3}\right)\right|_{\beta=\hat\beta}
= \frac{(n\alpha)^3}{(\sum_i x_i)^2} - \frac{2(n\alpha)^3}{(\sum_i x_i)^2}
= -\frac{(n\alpha)^3}{(\sum_i x_i)^2} < 0.
$$
Because $\hat\beta$ is the unique point where the derivative is 0 and it is a local maximum, it is a global maximum. That is, $\hat\beta$ is the MLE.

b. Now the likelihood function is
$$
L(\alpha,\beta\mid\mathbf{x}) = \frac{1}{\Gamma(\alpha)^n \beta^{n\alpha}} \Bigl[\prod_{i=1}^n x_i\Bigr]^{\alpha-1} e^{-\sum_i x_i/\beta},
$$
the same as in part (a) except that $\alpha$ and $\beta$ are both variables. There is no analytic form for the MLEs, the values $\hat\alpha$ and $\hat\beta$ that maximize $L$. One approach to finding $\hat\alpha$ and $\hat\beta$ would be to numerically maximize the function of two arguments. But it is usually best to do as much as possible analytically first, which may reduce the complexity of the numerical problem. From part (a), for each fixed value of $\alpha$, the value of $\beta$ that maximizes $L$ is $\sum_i x_i/(n\alpha)$. Substituting this into $L$, we need only maximize the function of the single variable $\alpha$ given by
$$
\frac{1}{\Gamma(\alpha)^n \bigl(\sum_i x_i/(n\alpha)\bigr)^{n\alpha}} \Bigl[\prod_{i=1}^n x_i\Bigr]^{\alpha-1} e^{-\sum_i x_i/(\sum_i x_i/(n\alpha))}
= \frac{1}{\Gamma(\alpha)^n \bigl(\sum_i x_i/(n\alpha)\bigr)^{n\alpha}} \Bigl[\prod_{i=1}^n x_i\Bigr]^{\alpha-1} e^{-n\alpha}.
$$
For the given data, $n = 14$ and $\sum_i x_i = 323.6$. Many computer programs can be used to maximize this function. From PROC NLIN in SAS we obtain $\hat\alpha = 514.219$ and, hence, $\hat\beta = \frac{323.6}{14(514.219)} = 0.0450$.
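The one-dimensional maximization in part (b) is straightforward in most environments. The sketch below (Python) maximizes the profile log-likelihood by golden-section search. The individual observations are not reproduced in this solution, so the data `xs` below are hypothetical stand-ins, chosen only so that $n = 14$ and $\sum_i x_i = 323.6$; the function names `profile_loglik` and `gamma_mle` are ours, not part of the exercise.

```python
import math

def profile_loglik(alpha, xs):
    """Profile log-likelihood of alpha after substituting the part (a)
    result beta-hat = sum(x)/(n*alpha):
    -n log Gamma(a) - n a log(sum(x)/(n a)) + (a-1) sum(log x) - n a."""
    n, s = len(xs), sum(xs)
    return (-n * math.lgamma(alpha)
            - n * alpha * math.log(s / (n * alpha))
            + (alpha - 1.0) * sum(math.log(x) for x in xs)
            - n * alpha)

def gamma_mle(xs, lo=1e-3, hi=1e4, iters=200):
    """Golden-section search for the alpha maximizing the (unimodal)
    profile log-likelihood; beta-hat then follows from part (a)."""
    gr = (math.sqrt(5.0) - 1.0) / 2.0
    a, b = lo, hi
    c, d = b - gr * (b - a), a + gr * (b - a)
    for _ in range(iters):
        if profile_loglik(c, xs) > profile_loglik(d, xs):
            b, d = d, c                # maximum lies in [a, old d]
            c = b - gr * (b - a)
        else:
            a, c = c, d                # maximum lies in [old c, b]
            d = a + gr * (b - a)
    alpha_hat = (a + b) / 2.0
    beta_hat = sum(xs) / (len(xs) * alpha_hat)
    return alpha_hat, beta_hat

# Hypothetical stand-in data: 14 values summing to 323.6 (the
# exercise's actual observations are not listed in this solution).
xs = [23.1, 22.8, 24.0, 21.5, 25.2, 23.7, 22.4,
      23.9, 24.4, 22.1, 23.3, 23.0, 21.8, 22.4]
alpha_hat, beta_hat = gamma_mle(xs)
```

Because $\hat\beta = \sum_i x_i/(n\hat\alpha)$ is available in closed form, only a one-dimensional search over $\alpha$ is needed; run on the exercise's actual observations, the same search should recover the reported $\hat\alpha = 514.219$ and $\hat\beta = 0.0450$.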
7.3 The log function is strictly increasing. Therefore, $L(\theta_1\mid\mathbf{x}) > L(\theta_2\mid\mathbf{x})$ if and only if $\log L(\theta_1\mid\mathbf{x}) > \log L(\theta_2\mid\mathbf{x})$. So the value $\hat\theta$ that maximizes $\log L(\theta\mid\mathbf{x})$ is the same as the value that maximizes $L(\theta\mid\mathbf{x})$.

7.5 a. The value $\hat z$ solves the equation
$$
(1-p)^n = \prod_i (1 - x_i z),
$$
where $0 \le z \le (\max_i x_i)^{-1}$. Let $\hat k$ = greatest integer less than or equal to $1/\hat z$. Then from Example 7.2.9, $\hat k$ must satisfy
$$
[\hat k(1-p)]^n \ge \prod_i (\hat k - x_i)
\quad\text{and}\quad
[(\hat k+1)(1-p)]^n < \prod_i (\hat k + 1 - x_i).
$$
Because the right-hand side of the defining equation, $\prod_i(1 - x_i z)$, is decreasing in $z$, and because $\hat k \le 1/\hat z$ (so $1/\hat k \ge \hat z$) and $\hat k + 1 > 1/\hat z$ (so $1/(\hat k+1) < \hat z$), $\hat k$ satisfies both inequalities. Thus $\hat k$ is the MLE.
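The defining equation for $\hat z$ in 7.5(a) is easy to solve numerically. The sketch below (Python) uses bisection, exploiting the fact that $\prod_i(1 - x_i z)$ is decreasing in $z$ on $[0, 1/\max_i x_i]$, and then takes $\hat k = \lfloor 1/\hat z \rfloor$. The data `xs` and the value of `p` are hypothetical, and the function name `mle_k` is ours.

```python
import math

def mle_k(xs, p):
    """MLE of k for x_i iid binomial(k, p) with p known (Exercise 7.5(a)):
    solve (1-p)^n = prod_i (1 - x_i*z) for z in (0, 1/max(xs)) by
    bisection (the product is decreasing in z), then return floor(1/z)."""
    n = len(xs)
    target = (1.0 - p) ** n

    def g(z):
        return math.prod(1.0 - z * x for x in xs)

    lo, hi = 0.0, 1.0 / max(xs)        # g(lo) = 1 >= target, g(hi) = 0
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if g(mid) > target:
            lo = mid
        else:
            hi = mid
    z_hat = (lo + hi) / 2.0
    return math.floor(1.0 / z_hat)

# Hypothetical data: five binomial counts with known p = 0.3.
xs = [16, 18, 22, 25, 27]
k_hat = mle_k(xs, p=0.3)
```

Checking the two inequalities from Example 7.2.9 at the returned integer confirms, for any given data set, that it is the MLE.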