Lecture 9 Addendum

Derivation of the estimating equations for GLM

Recall the general form of the log likelihood for a GLM. The $i$-th component of the log likelihood is
$$\ell_i = [y_i\theta_i - b(\theta_i)]/a(\phi) + c(y_i). \qquad (1)$$
Most often we have $a(\phi) = \phi$. Note that the function $b(\cdot)$ determines the moments of $Y_i$:
$$\mu_i = E(Y_i) = b'(\theta_i), \qquad \operatorname{Var}(Y_i) = b''(\theta_i)\,a(\phi).$$
Indeed, we know by definition that $E(\partial\ell_i/\partial\theta_i) = 0$, where $E(\partial\ell_i/\partial\theta_i) = E\{(y_i - b'(\theta_i))/a(\phi)\}$; this implies that $E(Y_i) = \mu_i = b'(\theta_i)$. We also have $-E(\partial^2\ell_i/\partial\theta_i^2) = 1/V(\theta_i) = b''(\theta_i)/a(\phi)$, thus $V(\theta_i) = a(\phi)/b''(\theta_i)$, and
$$V(Y_i) = V(\mu_i) = V(b'(\theta_i)) = b''(\theta_i)^2\,V(\theta_i) = b''(\theta_i)\,a(\phi).$$
By differentiating the log likelihood with respect to $\beta$, the likelihood estimating equations are
$$\sum_{i=1}^{N} \frac{y_i - \mu_i}{\operatorname{Var}(Y_i)}\,\frac{\partial\mu_i}{\partial\beta_j} = \sum_{i=1}^{N} \frac{(y_i - \mu_i)\,x_{ij}}{\operatorname{Var}(Y_i)}\,\frac{\partial\mu_i}{\partial\eta_i} = 0, \qquad j = 1,\ldots,p, \qquad (2)$$
where $p$ is the number of explanatory variables.

Proof. We want to compute $\partial\ell_i/\partial\beta_j$.
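As a quick numerical sanity check of the identities $\mu = b'(\theta)$ and $\operatorname{Var}(Y) = b''(\theta)\,a(\phi)$ (a sketch of my own, not part of the lecture): for the Poisson family $b(\theta) = e^{\theta}$ and $a(\phi) = 1$, so both the mean and the variance should equal $e^{\theta}$.

```python
import numpy as np

# Sanity check of mu = b'(theta) and Var(Y) = b''(theta)*a(phi) for the
# Poisson family, where b(theta) = exp(theta) and a(phi) = 1.
# (Illustrative sketch; all names below are my own, not from the lecture.)

def b(theta):
    return np.exp(theta)

theta = 0.7
h = 1e-5
b1 = (b(theta + h) - b(theta - h)) / (2 * h)               # b'(theta) by central difference
b2 = (b(theta + h) - 2 * b(theta) + b(theta - h)) / h**2   # b''(theta) by central difference

rng = np.random.default_rng(1)
y = rng.poisson(lam=np.exp(theta), size=200_000)           # Y ~ Poisson(e^theta)

print(abs(y.mean() - b1) < 0.02)   # sample mean matches b'(theta)
print(abs(y.var() - b2) < 0.05)    # sample variance matches b''(theta)
```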
Note that
$$\frac{\partial\ell_i}{\partial\beta_j} = \frac{\partial\ell_i}{\partial\mu_i} \times \frac{\partial\mu_i}{\partial\eta_i} \times \frac{\partial\eta_i}{\partial\beta_j}.$$
We first derive $\partial\ell_i/\partial\mu_i$:
$$\frac{\partial\ell_i}{\partial\mu_i} = \frac{\partial\ell_i}{\partial\theta_i} \times \frac{\partial\theta_i}{\partial\mu_i} = \frac{y_i - \mu_i}{a(\phi)} \times \left(\frac{\partial\mu_i}{\partial\theta_i}\right)^{-1},$$
where $(\partial\mu_i/\partial\theta_i)^{-1} = b''(\theta_i)^{-1} = a(\phi)/\operatorname{Var}(Y_i)$; hence
$$\frac{\partial\ell_i}{\partial\mu_i} = \frac{y_i - \mu_i}{\operatorname{Var}(Y_i)}.$$
Since $\eta_i = \sum_{j=1}^{p} x_{ij}\beta_j$, we easily derive $\partial\eta_i/\partial\beta_j = x_{ij}$, $j = 1,\ldots,p$. We can now deduce
$$\frac{\partial\ell_i}{\partial\beta_j} = \frac{y_i - \mu_i}{\operatorname{Var}(Y_i)}\,x_{ij}\,\frac{\partial\mu_i}{\partial\eta_i}.$$
The final estimating equations (2) are obtained by summing over all individuals $i = 1,\ldots,N$. The term $\partial\mu_i/\partial\eta_i$ is specific to the GLM chosen.

Let's take for example a binary response $Y_i \in \{0, 1\}$. We know that $\mu_i$ represents the probability of success $P(Y_i = 1)$, where
$$\mu_i = \exp(\eta_i)/(1 + \exp(\eta_i)).$$
Thus we have
$$\partial\mu_i/\partial\eta_i = \exp(\eta_i)/(1 + \exp(\eta_i))^2.$$
Note also that the dispersion function $a(\phi)$ drops out of the estimating equations: $\operatorname{Var}(Y_i) = b''(\theta_i)\,a(\phi)$, so $a(\phi)$ is a common factor of every term and does not affect the solution.
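The estimating equations (2) can be solved numerically by Fisher scoring (iteratively reweighted least squares). Below is a minimal sketch for the binary-response example above; the simulated data and variable names are my own choices for illustration, not from the lecture.

```python
import numpy as np

# Fisher scoring / IRLS sketch for the logistic GLM of the example above.
# Data are simulated; names (X, y, beta, ...) are illustrative choices.

rng = np.random.default_rng(0)
N = 500
X = np.column_stack([np.ones(N), rng.normal(size=N)])     # x_i1 = 1 (intercept)
beta_true = np.array([0.5, -1.0])
y = rng.binomial(1, 1 / (1 + np.exp(-(X @ beta_true))))   # Y_i in {0, 1}

beta = np.zeros(2)
for _ in range(25):
    eta = X @ beta
    mu = np.exp(eta) / (1 + np.exp(eta))                  # mu_i = P(Y_i = 1)
    dmu_deta = np.exp(eta) / (1 + np.exp(eta)) ** 2       # = mu_i * (1 - mu_i)
    var_y = mu * (1 - mu)                                 # Var(Y_i), Bernoulli
    # Score from equation (2): sum_i (y_i - mu_i)/Var(Y_i) * x_ij * dmu_i/deta_i
    score = X.T @ ((y - mu) / var_y * dmu_deta)
    # Expected information, used for the scoring update
    W = dmu_deta ** 2 / var_y
    info = X.T @ (X * W[:, None])
    beta = beta + np.linalg.solve(info, score)

# At the solution the estimating equations hold numerically. For the
# canonical logit link dmu/deta = Var(Y_i), so (2) reduces to X'(y - mu).
eta = X @ beta
mu = np.exp(eta) / (1 + np.exp(eta))
final_score = X.T @ (y - mu)
print(np.abs(final_score).max() < 1e-6)
```

Because the logit link is canonical for the Bernoulli family, the factor $\partial\mu_i/\partial\eta_i$ exactly cancels $\operatorname{Var}(Y_i)$ in (2), which is why the final check can use the simplified score $\sum_i (y_i - \mu_i) x_{ij}$.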