Chapter 5 Properties of a Random Sample

5.1 Let $X$ = the number of color blind people in a sample of size $n$. Then $X \sim \mathrm{binomial}(n, p)$, where $p = .01$. The probability that a sample contains a color blind person is $P(X > 0) = 1 - P(X = 0)$, where $P(X = 0) = \binom{n}{0}(.01)^0(.99)^n = .99^n$. Thus,
\[
P(X > 0) = 1 - .99^n > .95 \iff n > \log(.05)/\log(.99) \approx 298.1,
\]
so $n \geq 299$.

5.3 Note that $Y_i \sim$ Bernoulli with $p_i = P(X_i > \mu) = 1 - F(\mu)$ for each $i$. Since the $Y_i$'s are iid Bernoulli, $\sum_{i=1}^{n} Y_i \sim \mathrm{binomial}(n, p = 1 - F(\mu))$.

5.5 Let $Y = X_1 + \cdots + X_n$. Then $\bar{X} = (1/n)Y$, a scale transformation. Therefore the pdf of $\bar{X}$ is
\[
f_{\bar X}(x) = \frac{1}{1/n}\, f_Y\!\left(\frac{x}{1/n}\right) = n f_Y(nx).
\]

5.6 a. For $Z = X - Y$, set $W = X$. Then $Y = W - Z$, $X = W$, and
\[
|J| = \left|\det\begin{pmatrix} 0 & 1 \\ -1 & 1 \end{pmatrix}\right| = 1.
\]
Then $f_{Z,W}(z,w) = f_X(w) f_Y(w-z) \cdot 1$, thus $f_Z(z) = \int_{-\infty}^{\infty} f_X(w) f_Y(w-z)\,dw$.

b. For $Z = XY$, set $W = X$. Then $Y = Z/W$ and
\[
|J| = \left|\det\begin{pmatrix} 0 & 1 \\ 1/w & -z/w^2 \end{pmatrix}\right| = |1/w|.
\]
Then $f_{Z,W}(z,w) = f_X(w) f_Y(z/w)\,|1/w|$, thus $f_Z(z) = \int_{-\infty}^{\infty} |1/w|\, f_X(w) f_Y(z/w)\,dw$.

c. For $Z = X/Y$, set $W = X$. Then $Y = W/Z$ and
\[
|J| = \left|\det\begin{pmatrix} -w/z^2 & 1/z \\ 0 & 1 \end{pmatrix}\right| = |w/z^2|.
\]
Then $f_{Z,W}(z,w) = f_X(w) f_Y(w/z)\,|w/z^2|$, thus $f_Z(z) = \int_{-\infty}^{\infty} |w/z^2|\, f_X(w) f_Y(w/z)\,dw$.

5.7 It is, perhaps, easiest to recover the constants by doing the integrations. We have
\[
\int_{-\infty}^{\infty} \frac{B}{1+(w/\sigma)^2}\,dw = \sigma\pi B, \qquad
\int_{-\infty}^{\infty} \frac{D}{1+((w-z)/\tau)^2}\,dw = \tau\pi D,
\]
and
\[
\int_{-\infty}^{\infty} \left[\frac{Aw}{1+(w/\sigma)^2} - \frac{Cw}{1+((w-z)/\tau)^2}\right] dw
= \int_{-\infty}^{\infty} \left[\frac{Aw}{1+(w/\sigma)^2} - \frac{C(w-z)}{1+((w-z)/\tau)^2}\right] dw
- Cz \int_{-\infty}^{\infty} \frac{1}{1+((w-z)/\tau)^2}\,dw
\]
\[
= \left[\frac{A\sigma^2}{2}\log\left(1+\left(\frac{w}{\sigma}\right)^2\right)
- \frac{C\tau^2}{2}\log\left(1+\left(\frac{w-z}{\tau}\right)^2\right)\right]_{-\infty}^{\infty} - \tau\pi Cz.
\]
The bracketed term is finite and equal to zero if $A = 2M/\sigma^2$ and $C = 2M/\tau^2$ for some constant $M$, for then the two logarithms have the common coefficient $M$ and their difference has the same limit at both endpoints. Hence
\[
f_Z(z) = \frac{1}{\pi^2\sigma\tau}\left(\sigma\pi B + \tau\pi D - \frac{2\pi M z}{\tau}\right)
= \frac{1}{\pi(\sigma+\tau)} \cdot \frac{1}{1+\left(z/(\sigma+\tau)\right)^2},
\]
with $B$, $D$, and $M$ as given in the statement of the exercise; that is, $Z \sim \mathrm{Cauchy}(0, \sigma+\tau)$.

Solutions Manual for Statistical Inference

5.8 a.
\[
\begin{aligned}
\frac{1}{2n(n-1)} \sum_{i=1}^{n}\sum_{j=1}^{n} (X_i - X_j)^2
&= \frac{1}{2n(n-1)} \sum_{i=1}^{n}\sum_{j=1}^{n} \left(X_i - \bar{X} + \bar{X} - X_j\right)^2 \\
&= \frac{1}{2n(n-1)} \sum_{i=1}^{n}\sum_{j=1}^{n} \left[(X_i-\bar X)^2 - 2(X_i-\bar X)(X_j-\bar X) + (X_j-\bar X)^2\right] \\
&= \frac{1}{2n(n-1)} \Bigg[\sum_{i=1}^{n} n(X_i-\bar X)^2
- 2\underbrace{\sum_{i=1}^{n}(X_i-\bar X)\sum_{j=1}^{n}(X_j-\bar X)}_{=0}
+ n\sum_{j=1}^{n}(X_j-\bar X)^2\Bigg] \\
&= \frac{n}{2n(n-1)} \sum_{i=1}^{n}(X_i-\bar X)^2 + \frac{n}{2n(n-1)} \sum_{j=1}^{n}(X_j-\bar X)^2 \\
&= \frac{1}{n-1} \sum_{i=1}^{n}(X_i-\bar X)^2 = S^2.
\end{aligned}
\]
b. Although all of the calculations here are straightforward, there is a tedious amount of bookkeeping needed. It seems that induction is the easiest route. (Note: Without loss of generality we can assume $\theta_1 = 0$, so $\mathrm{E}X_i = 0$.)
(i) Prove the equation for $n = 4$. We have $S^2 = \frac{1}{24}\sum_{i=1}^{4}\sum_{j=1}^{4}(X_i - X_j)^2$, and to calculate $\mathrm{Var}(S^2)$ we need to calculate $\mathrm{E}(S^2)^2$ and $\mathrm{E}(S^2)$. The latter expectation is straightforward and we get E(...
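The sample-size bound in Exercise 5.1 ($n \geq 299$) can be double-checked with a short computation. This is a sketch, not part of the original solution; the helper name `smallest_n` is ours:

```python
import math

# Numerical check of Exercise 5.1: find the smallest sample size n for which
# P(X > 0) = 1 - 0.99**n exceeds 0.95, where X ~ binomial(n, 0.01).
def smallest_n(p=0.01, target=0.95):
    # Solve 1 - (1-p)**n > target  <=>  n > log(1-target) / log(1-p).
    return math.ceil(math.log(1 - target) / math.log(1 - p))

print(smallest_n())          # -> 299
print(1 - 0.99**299 > 0.95)  # True: n = 299 works
print(1 - 0.99**298 > 0.95)  # False: n = 298 does not
```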
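The conclusion of Exercise 5.7, that the sum of independent Cauchy$(0,\sigma)$ and Cauchy$(0,\tau)$ variables is Cauchy$(0,\sigma+\tau)$, can be checked by simulation. The scale values $\sigma = 1$, $\tau = 2$ below are arbitrary illustration choices, and the empirical CDF is compared with the Cauchy CDF $\tfrac12 + \arctan(z/(\sigma+\tau))/\pi$:

```python
import math
import random

# Monte Carlo sanity check of Exercise 5.7: if X ~ Cauchy(0, sigma) and
# Y ~ Cauchy(0, tau) are independent, then Z = X + Y ~ Cauchy(0, sigma + tau).
random.seed(1)
sigma, tau, trials = 1.0, 2.0, 200_000

def cauchy(scale):
    # Inverse-CDF sampling: scale * tan(pi * (U - 1/2)) is Cauchy(0, scale).
    return scale * math.tan(math.pi * (random.random() - 0.5))

z_samples = [cauchy(sigma) + cauchy(tau) for _ in range(trials)]

for z in (-3.0, 0.0, 1.0, 5.0):
    empirical = sum(s <= z for s in z_samples) / trials
    theoretical = 0.5 + math.atan(z / (sigma + tau)) / math.pi
    print(f"P(Z <= {z:4}): empirical {empirical:.3f}, Cauchy(0,3) {theoretical:.3f}")
```

With 200,000 draws the empirical and theoretical CDF values agree to about two decimal places at each test point.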
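The algebraic identity proved in Exercise 5.8(a) is also easy to confirm numerically on random data (a sketch; the data here are arbitrary standard normal draws):

```python
import random
import statistics

# Numerical check of the identity in Exercise 5.8(a):
# (1 / (2n(n-1))) * sum_i sum_j (X_i - X_j)^2 equals the sample variance S^2.
random.seed(0)
x = [random.gauss(0, 1) for _ in range(10)]
n = len(x)

pairwise = sum((xi - xj) ** 2 for xi in x for xj in x) / (2 * n * (n - 1))
s2 = statistics.variance(x)  # the usual 1/(n-1) sample variance

print(abs(pairwise - s2) < 1e-9)  # True: identical up to rounding error
```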
This note was uploaded on 04/18/2010 for the course STAT 622, taught by Professor Peruggia, M., during the Spring '08 term at Ohio State.