Section ARE213: Entropy Econometrics
Estimation Procedure by Golan, Judge and Miller (1996)
May 2006, Hendrik Wolff
Short Review on Maximum Entropy

1948, Claude Shannon, information entropy:

    H(p) = -\sum_{j=1}^{J} p_j \ln p_j

Example of a die, with 6 support points z_j: z_1 = 1, z_2 = 2, ..., z_6 = 6.

    \max_p H(p)   s.t.   3.5 = \sum_{j=1}^{6} p_j z_j ,   1.0 = \sum_{j=1}^{6} p_j

Solution: p_j = 1/6, i.e. the uniform distribution for the discrete PDF f(z).
What does f(z) look like if we do not observe the theoretical mean of 3.5? (Excel demonstration)
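That question can be checked numerically. Below is a minimal sketch (my illustration, not part of the original slides) that maximizes H(p) under the mean and adding-up constraints with scipy; the function name max_entropy_die and the choice of the SLSQP solver are assumptions. Setting y = 3.5 recovers the uniform die, while y = 4.5 tilts probability toward the high faces.

```python
import numpy as np
from scipy.optimize import minimize

def max_entropy_die(y):
    """Maximize H(p) = -sum_j p_j ln p_j  s.t.  sum_j p_j z_j = y,  sum_j p_j = 1."""
    z = np.arange(1, 7)                       # support points z_1..z_6
    neg_H = lambda p: np.sum(p * np.log(p))   # minimize -H(p)
    cons = [{"type": "eq", "fun": lambda p: p @ z - y},    # mean constraint
            {"type": "eq", "fun": lambda p: p.sum() - 1}]  # adding-up constraint
    res = minimize(neg_H, x0=np.full(6, 1 / 6), method="SLSQP",
                   bounds=[(1e-10, 1)] * 6, constraints=cons)
    return res.x

print(max_entropy_die(3.5))  # approximately [1/6, ..., 1/6]: the uniform die
print(max_entropy_die(4.5))  # probability mass shifted toward the high faces
```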
Solution to the ME Problem

    \max_p H(p) = \max_p \Big[ -\sum_{j=1}^{6} p_j \ln p_j \;\Big|\; y = \sum_{j=1}^{6} p_j z_j ,\; 1.0 = \sum_{j=1}^{6} p_j \Big]

Lagrangian:

    L = -\sum_j p_j \ln p_j + \lambda (y - Z'p) + \theta (1 - p'1)

First-order conditions:

    \partial L / \partial p_k = -\ln p_k - 1 - z_k' \lambda - \theta = 0
    \partial L / \partial \lambda = y - Z'p = 0
    \partial L / \partial \theta = 1 - p'1 = 0

Solving for p gives

    p_k = \exp(-z_k' \lambda) / \Omega(\lambda) ,   with   \Omega(\lambda) = \sum_{k=1}^{6} \exp(-z_k' \lambda)

There is no analytical solution for \lambda (this parallels the Logit model), but Newton's method works since H is globally concave.
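With a single mean constraint the problem reduces to one equation in the scalar \lambda: g(\lambda) = \sum_k z_k p_k(\lambda) - y = 0, whose derivative is minus the variance of z under p(\lambda). The following is a minimal sketch of that Newton iteration (my illustration of the idea, not code from Golan, Judge and Miller; the function name is an assumption):

```python
import numpy as np

def max_entropy_newton(z, y, tol=1e-12, max_iter=100):
    """Find lambda with sum_k z_k p_k(lambda) = y, where
    p_k(lambda) = exp(-lambda z_k) / Omega(lambda)."""
    lam = 0.0
    for _ in range(max_iter):
        w = np.exp(-lam * z)
        p = w / w.sum()              # p_k(lambda) = exp(-lam z_k) / Omega(lam)
        mean = p @ z
        var = p @ z**2 - mean**2     # g'(lam) = -Var_lambda(z)
        g = mean - y                 # residual of the mean constraint
        if abs(g) < tol:
            break
        lam += g / var               # Newton step: lam - g / g'
    return p, lam

p, lam = max_entropy_newton(np.arange(1, 7), 4.5)
print(lam)  # negative: exp(-lam z) increases in z, tilting mass upward
print(p)    # exponentially tilted distribution with mean 4.5
```

Since g'(\lambda) = -Var_\lambda(z) < 0, g is strictly decreasing and has a unique root, so the iteration is well behaved from the uniform start lam = 0.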
Normalized Entropy

Measuring the information content: the "importance of the contribution of each piece of data in reducing uncertainty."

    S(p) = \Big( -\sum_{m=1}^{M} p_m \ln p_m \Big) / \ln M

S(p) = 0: no uncertainty in the system (p_i = 1 for one i, p_j = 0 for all j \neq i)
S(p) = 1: perfect uncertainty (the uniform distribution, p_m = 1/M)
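A one-function sketch of this measure (illustrative; uses the standard convention 0 \cdot \ln 0 = 0):

```python
import numpy as np

def normalized_entropy(p):
    """S(p) = (-sum_m p_m ln p_m) / ln M, with the convention 0 * ln 0 = 0."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]                    # drop zero entries: 0 * ln 0 := 0
    return -np.sum(nz * np.log(nz)) / np.log(len(p))

print(normalized_entropy([1, 0, 0, 0, 0, 0]))  # 0.0 : no uncertainty
print(normalized_entropy([1/6] * 6))           # 1.0 : perfect uncertainty
```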
Short Review on Cross-Entropy...
