ISM_chapter16 - Chapter 16: Introduction to Bayesian Methods of Inference

16.1 Refer to Table 16.1.
a. beta(10, 30)
b. n = 25
c. beta(10, 30), n = 25
d. Yes
e. The posterior for the beta(1, 3) prior.

16.2 a.-d. Refer to Section 16.2.

16.3 a.-e. Applet exercise, so answers vary.

16.4 a.-d. Applet exercise, so answers vary.

16.5 It should take more trials with a beta(10, 30) prior.

16.6 Here,

L(y \mid p) = p(y \mid p) = \binom{n}{y} p^y (1 - p)^{n - y},

where y = 0, 1, ..., n and 0 < p < 1. So,

f(y, p) = \binom{n}{y} p^y (1 - p)^{n - y} \times \frac{\Gamma(\alpha + \beta)}{\Gamma(\alpha)\Gamma(\beta)} p^{\alpha - 1} (1 - p)^{\beta - 1},

so that

m(y) = \binom{n}{y} \frac{\Gamma(\alpha + \beta)}{\Gamma(\alpha)\Gamma(\beta)} \int_0^1 p^{y + \alpha - 1} (1 - p)^{n - y + \beta - 1} \, dp = \binom{n}{y} \frac{\Gamma(\alpha + \beta)}{\Gamma(\alpha)\Gamma(\beta)} \cdot \frac{\Gamma(y + \alpha)\Gamma(n - y + \beta)}{\Gamma(n + \alpha + \beta)}.

The posterior density of p is then

g^*(p \mid y) = \frac{\Gamma(n + \alpha + \beta)}{\Gamma(y + \alpha)\Gamma(n - y + \beta)} p^{y + \alpha - 1} (1 - p)^{n - y + \beta - 1}, 0 < p < 1.

This is the identical beta density as in Example 16.1 (recall that the sum of n i.i.d. Bernoulli random variables is binomial with n trials and success probability p).
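As a numerical sanity check on this conjugacy result, the sketch below (Python with NumPy/SciPy; the values n = 25, y = 10 and the beta(1, 3) prior are illustrative assumptions, not prescribed by the exercise) normalizes prior times likelihood on a grid and compares it with the closed-form beta(y + α, n − y + β) posterior:

```python
# Sketch: numerically verify the beta-binomial conjugacy in Ex. 16.6.
# n, y, and the prior parameters below are illustrative assumptions.
import numpy as np
from scipy import stats, integrate

n, y = 25, 10
alpha, beta = 1.0, 3.0

p = np.linspace(1e-6, 1 - 1e-6, 2001)

# Unnormalized posterior: likelihood times prior.
unnorm = stats.binom.pmf(y, n, p) * stats.beta.pdf(p, alpha, beta)
# Normalize by numerical integration (this approximates dividing by m(y)).
posterior_numeric = unnorm / integrate.trapezoid(unnorm, p)

# Closed-form conjugate posterior: beta(y + alpha, n - y + beta).
posterior_exact = stats.beta.pdf(p, y + alpha, n - y + beta)

# Should be near zero, up to grid discretization error.
print(np.max(np.abs(posterior_numeric - posterior_exact)))
```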
16.7 a. The Bayes estimator is the mean of the posterior distribution. With the beta(1, 3) prior, the posterior is beta with \alpha^* = y + 1 and \beta^* = n - y + 3, so the posterior mean is

\hat{p}_B = \frac{Y + 1}{(Y + 1) + (n - Y + 3)} = \frac{Y + 1}{n + 4}.

b. E(\hat{p}_B) = \frac{E(Y) + 1}{n + 4} = \frac{np + 1}{n + 4} and V(\hat{p}_B) = \frac{V(Y)}{(n + 4)^2} = \frac{np(1 - p)}{(n + 4)^2}.
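The mean and variance in part (b) are easy to confirm by simulation. A minimal sketch, assuming illustrative values n = 25 and p = 0.4 (not specified in the exercise):

```python
# Sketch: Monte Carlo check of E(p_B) and V(p_B) from Ex. 16.7(b).
# n, p, and the replicate count are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, p, reps = 25, 0.4, 200_000

y = rng.binomial(n, p, size=reps)
p_bayes = (y + 1) / (n + 4)  # posterior mean under the beta(1, 3) prior

print(p_bayes.mean(), (n * p + 1) / (n + 4))          # E(p_B) = (np + 1)/(n + 4)
print(p_bayes.var(), n * p * (1 - p) / (n + 4) ** 2)  # V(p_B) = np(1 - p)/(n + 4)^2
```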

16.8 a. From Ex. 16.6, the Bayes estimator for p is \hat{p}_B = E(p \mid Y) = \frac{Y + 1}{n + 2}.

b. The beta(1, 1) prior is the uniform distribution on the interval (0, 1).

c. We know that \hat{p} = Y/n is an unbiased estimator for p. However, for the Bayes estimator,

E(\hat{p}_B) = \frac{E(Y) + 1}{n + 2} = \frac{np + 1}{n + 2} and V(\hat{p}_B) = \frac{V(Y)}{(n + 2)^2} = \frac{np(1 - p)}{(n + 2)^2}.

Thus,

MSE(\hat{p}_B) = V(\hat{p}_B) + [B(\hat{p}_B)]^2 = \frac{np(1 - p)}{(n + 2)^2} + \left( \frac{np + 1}{n + 2} - p \right)^2 = \frac{np(1 - p) + (1 - 2p)^2}{(n + 2)^2}.

d. For the unbiased estimator \hat{p}, MSE(\hat{p}) = V(\hat{p}) = p(1 - p)/n. So, holding n fixed, we must determine the values of p such that

\frac{np(1 - p) + (1 - 2p)^2}{(n + 2)^2} < \frac{p(1 - p)}{n}.

The range of values of p where this is satisfied is found in Ex. 8.17(c).
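The inequality in part (d) can be explored numerically. A minimal sketch, assuming the illustrative value n = 25, that locates the grid values of p where the Bayes estimator beats the unbiased one in MSE:

```python
# Sketch: compare MSE of the Bayes estimator (Y + 1)/(n + 2) with MSE of
# the unbiased estimator Y/n across p, for a fixed n (Ex. 16.8(d)).
# n = 25 and the p grid are illustrative assumptions.
import numpy as np

n = 25
p = np.linspace(0.01, 0.99, 99)

mse_bayes = (n * p * (1 - p) + (1 - 2 * p) ** 2) / (n + 2) ** 2
mse_unbiased = p * (1 - p) / n

# Grid values of p where the Bayes estimator has the smaller MSE;
# the interval is centered near p = 1/2, consistent with Ex. 8.17(c).
better = p[mse_bayes < mse_unbiased]
print(better.min(), better.max())
```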

16.9 a. Here,

L(y \mid p) = p(y \mid p) = p(1 - p)^{y - 1},

where y = 1, 2, ... and 0 < p < 1. So,

f(y, p) = p(1 - p)^{y - 1} \times \frac{\Gamma(\alpha + \beta)}{\Gamma(\alpha)\Gamma(\beta)} p^{\alpha - 1} (1 - p)^{\beta - 1},

so that

m(y) = \frac{\Gamma(\alpha + \beta)}{\Gamma(\alpha)\Gamma(\beta)} \int_0^1 p^{\alpha} (1 - p)^{y + \beta - 2} \, dp = \frac{\Gamma(\alpha + \beta)}{\Gamma(\alpha)\Gamma(\beta)} \cdot \frac{\Gamma(\alpha + 1)\Gamma(y + \beta - 1)}{\Gamma(\alpha + \beta + y)}.

The posterior density of p is then

g^*(p \mid y) = \frac{\Gamma(\alpha + \beta + y)}{\Gamma(\alpha + 1)\Gamma(y + \beta - 1)} p^{\alpha} (1 - p)^{y + \beta - 2}, 0 < p < 1.

This is a beta density with shape parameters \alpha^* = \alpha + 1 and \beta^* = \beta + y - 1.

b. The Bayes estimators are

\hat{p}_B^{(1)} = E(p \mid Y) = \frac{\alpha + 1}{\alpha + \beta + Y},

\hat{p}_B^{(2)} = E[p(1 - p) \mid Y] = E(p \mid Y) - E(p^2 \mid Y) = \frac{\alpha + 1}{\alpha + \beta + Y} - \frac{(\alpha + 2)(\alpha + 1)}{(\alpha + \beta + Y + 1)(\alpha + \beta + Y)} = \frac{(\alpha + 1)(\beta + Y - 1)}{(\alpha + \beta + Y)(\alpha + \beta + Y + 1)},

where the second expectation was found using the result from Ex. 4.200. (Alternately, the answer could be found by evaluating E[p(1 - p) \mid Y] = \int_0^1 p(1 - p) \, g^*(p \mid Y) \, dp.)
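Both the posterior in part (a) and the two Bayes estimators in part (b) can be checked on a grid. A minimal sketch, assuming the illustrative values α = 2, β = 3, y = 5 (not given in the exercise):

```python
# Sketch: numerical check of Ex. 16.9. With a geometric likelihood and a
# beta(alpha, beta) prior, the posterior is beta(alpha + 1, beta + y - 1),
# and the posterior means of p and p(1 - p) match the closed forms in (b).
# a, b, y below are illustrative assumptions.
import numpy as np
from scipy import stats, integrate

a, b, y = 2.0, 3.0, 5

p = np.linspace(1e-6, 1 - 1e-6, 4001)
unnorm = stats.geom.pmf(y, p) * stats.beta.pdf(p, a, b)  # likelihood * prior
posterior = unnorm / integrate.trapezoid(unnorm, p)

# Posterior matches beta(a + 1, b + y - 1), up to grid error.
print(np.max(np.abs(posterior - stats.beta.pdf(p, a + 1, b + y - 1))))

# Bayes estimators from part (b): E(p | y) and E[p(1 - p) | y].
print(integrate.trapezoid(p * posterior, p), (a + 1) / (a + b + y))
print(integrate.trapezoid(p * (1 - p) * posterior, p),
      (a + 1) * (b + y - 1) / ((a + b + y) * (a + b + y + 1)))
```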