class06-2-handouts

PSTAT 120B - Probability & Statistics
Class #06-2: Consistency
Jarad Niemi, University of California, Santa Barbara
5 May 2010

Announcements

Homework: Homework 5 is up, due 10 May by 4pm in SH 5521.

Mid-term I: Question #14 had no correct answer; the answer should be $\frac{m-1}{2m}$. If you got this question marked wrong, please see me to get one more point.

Mid-term II: Friday 14 May, 11am-noon. Office hours will be short since I am teaching a class at noon.

Goals

Consistency of estimators.

Consistency: Intuition

Suppose $Y_1, \ldots, Y_n \stackrel{iid}{\sim} N(\mu, \sigma^2)$ and we use $\hat{\mu}_n = \bar{y}_n$. What does this estimate look like if we collect data sequentially?

[Figure: two panels showing the running sample mean $\bar{y}_n$ plotted against $n$ from 0 to 1000, vertical axis from -1.0 to 1.0; the sequential estimates wander for small $n$ and settle down as $n$ grows.]
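A quick way to reproduce this kind of picture is to simulate the data and track the running mean. The sketch below is my addition, not part of the handout; the values $\mu = 0$, $\sigma = 1$, the seed, and the sample size are arbitrary choices.

```python
import numpy as np

# Sequentially observe Y_1, ..., Y_n ~ iid N(mu, sigma^2) and track the
# running sample mean mu_hat_n = (1/n) * sum(Y_1, ..., Y_n).
rng = np.random.default_rng(0)
mu, sigma, n_max = 0.0, 1.0, 1000

y = rng.normal(mu, sigma, size=n_max)
running_mean = np.cumsum(y) / np.arange(1, n_max + 1)

# The running mean wanders for small n but settles near mu as n grows.
for n in (10, 100, 1000):
    print(f"n = {n:4d}   mu_hat_n = {running_mean[n - 1]: .4f}")
```

Plotting `running_mean` against `n` for a couple of simulated sequences gives plots like the two panels above.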
Consistency: Definition

Definition. A sequence of random variables $\{X_n\}$ converges in probability to a random variable $X$ if, for all $\epsilon > 0$,
$$\lim_{n \to \infty} P(|X_n - X| > \epsilon) = 0,$$
and we use the notation $X_n \stackrel{p}{\to} X$.

Definition. An estimator $\hat{\theta}_n$ is said to be a consistent estimator of $\theta$ if $\hat{\theta}_n \stackrel{p}{\to} \theta$.

Consistency: Examples

Suppose $Y_1, \ldots, Y_n \stackrel{iid}{\sim} N(\mu, \sigma^2)$ and we use $\hat{\mu}_n = \bar{y}_n$. Is $\hat{\mu}_n$ a consistent estimator for $\mu$?

What is the distribution of $\hat{\mu}_n$? $\hat{\mu}_n \sim N(\mu, \sigma^2/n)$.

What is the distribution of $\frac{\hat{\mu}_n - \mu}{\sigma/\sqrt{n}}$? $\frac{\hat{\mu}_n - \mu}{\sigma/\sqrt{n}} \sim N(0, 1)$.

So
$$\lim_{n \to \infty} P(|\hat{\mu}_n - \mu| > \epsilon)
  = \lim_{n \to \infty} 2 P(\hat{\mu}_n - \mu > \epsilon)
  = \lim_{n \to \infty} 2 P\!\left(\frac{\hat{\mu}_n - \mu}{\sigma/\sqrt{n}} > \frac{\epsilon}{\sigma/\sqrt{n}}\right)
  = \lim_{n \to \infty} 2 P\!\left(Z > \frac{\epsilon \sqrt{n}}{\sigma}\right) = 0,$$
where $Z \sim N(0, 1)$.

$P\!\left(Z > \frac{\epsilon \sqrt{n}}{\sigma}\right)$ is the black area under the curve.

[Figure: standard normal density $f(z)$ for $z$ from -4 to 4, with the upper tail beyond $\epsilon\sqrt{n}/\sigma$ shaded for $\epsilon = 0.25$, $\sigma = 1$, and $n = 1, 4, 9, 16, 25, 36, 49, 64, 81, 100$; the shaded area shrinks toward zero as $n$ increases.]

Consistency of unbiased estimators

Theorem. An unbiased estimator $\hat{\theta}_n$ for $\theta$ is a consistent estimator of $\theta$ if $\lim_{n \to \infty} V(\hat{\theta}_n) = 0$.

Proof. If $Y$ is a random variable with $E(Y) = \mu$ and $V(Y) = \sigma^2 < \infty$, then for $k > 0$, by Chebyshev's inequality,
$$P(g(Y) > r) \le \frac{E[g(Y)]}{r},$$
which implies
$$P(|Y - \mu| > k\sigma) \le \frac{1}{k^2}$$
for $g(x) = (x - \mu)^2/\sigma^2$ and $r = k^2$. Since $E(\hat{\theta}_n) = \theta$, if we denote $\sigma_n = \sqrt{V(\hat{\theta}_n)}$, then
$$P(|\hat{\theta}_n - \theta| > \epsilon)
  = P\!\left(|\hat{\theta}_n - \theta| > \frac{\epsilon}{\sigma_n}\,\sigma_n\right)
  \le \frac{\sigma_n^2}{\epsilon^2}
  = \frac{V(\hat{\theta}_n)}{\epsilon^2}.$$
Thus
$$0 \le \lim_{n \to \infty} P(|\hat{\theta}_n - \theta| > \epsilon) \le \lim_{n \to \infty} \frac{V(\hat{\theta}_n)}{\epsilon^2}.$$
So if $\lim_{n \to \infty} V(\hat{\theta}_n) = 0$, then $\hat{\theta}_n$ is consistent for $\theta$.
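To make the rate of this convergence concrete, the sketch below (my addition, not from the handout) evaluates the exact normal tail probability $2P(Z > \epsilon\sqrt{n}/\sigma)$ alongside the Chebyshev bound $V(\hat{\mu}_n)/\epsilon^2 = \sigma^2/(n\epsilon^2)$ for the slide's values $\epsilon = 0.25$ and $\sigma = 1$. Both go to zero, the exact probability much faster.

```python
import math

# P(|mu_hat_n - mu| > eps) for mu_hat_n ~ N(mu, sigma^2/n):
#   exact:     2 * P(Z > eps * sqrt(n) / sigma),  Z ~ N(0, 1)
#   Chebyshev: V(mu_hat_n) / eps^2 = sigma^2 / (n * eps^2)
eps, sigma = 0.25, 1.0

def normal_upper_tail(x):
    # P(Z > x) for a standard normal Z
    return 0.5 * math.erfc(x / math.sqrt(2.0))

for n in (1, 4, 16, 64, 100, 400):
    exact = 2.0 * normal_upper_tail(eps * math.sqrt(n) / sigma)
    cheby = sigma**2 / (n * eps**2)  # exceeds 1 (vacuous) for small n
    print(f"n = {n:4d}   exact = {exact:.4e}   Chebyshev bound = {cheby:.4e}")
```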
Consistency of unbiased estimators: Example

Suppose $Y_1, \ldots, Y_n \stackrel{iid}{\sim} Po(\lambda)$ and we use $\hat{\lambda}_n = \frac{1}{n}\sum_{i=1}^n y_i$. Is $\hat{\lambda}_n$ a consistent estimator? Is $\hat{\lambda}_n$ unbiased? Does $\lim_{n \to \infty} V(\hat{\lambda}_n) = 0$?

$$E(\hat{\lambda}_n) = E\!\left[\frac{1}{n}\sum_{i=1}^n Y_i\right] = \frac{1}{n}\sum_{i=1}^n E[Y_i] = \frac{1}{n}\, n\lambda = \lambda$$

$$V(\hat{\lambda}_n) = V\!\left[\frac{1}{n}\sum_{i=1}^n Y_i\right] = \frac{1}{n^2}\sum_{i=1}^n V[Y_i] = \frac{1}{n^2}\, n\lambda = \frac{\lambda}{n}$$

$$\lim_{n \to \infty} V(\hat{\lambda}_n) = \lim_{n \to \infty} \frac{\lambda}{n} = \lambda \lim_{n \to \infty} \frac{1}{n} = \lambda \cdot 0 = 0$$

So, yes, $\hat{\lambda}_n$ is a consistent estimator for $\lambda$.

Weak law of large numbers

Theorem. Let $Y_1, Y_2, \ldots$ be iid random variables with $E(Y_i) = \mu$ and $V(Y_i) = \sigma^2 < \infty$. Define $\bar{Y}_n = \frac{1}{n}\sum_{i=1}^n Y_i$. Then for every $\epsilon > 0$,
$$\lim_{n \to \infty} P(|\bar{Y}_n - \mu| > \epsilon) = 0,$$
that is, $\bar{Y}_n$ is a consistent estimator of $\mu$. This is known as the weak law of large numbers.

Examples:
If $Y_1, Y_2, \ldots \stackrel{iid}{\sim} Exp(\theta)$, then $\bar{Y} \stackrel{p}{\to} \theta$.
If $Y_1, Y_2, \ldots \stackrel{iid}{\sim} Geo(p)$, then $\bar{Y} \stackrel{p}{\to} \frac{1}{p}$.
If $Y_1, Y_2, \ldots \stackrel{iid}{\sim} Unif(\theta, \theta + 1)$, then $\bar{Y} \stackrel{p}{\to} \theta + \frac{1}{2}$.
If $Y_1, Y_2, \ldots \stackrel{iid}{\sim} Be(\alpha, \beta)$, then $\bar{Y} \stackrel{p}{\to} \frac{\alpha}{\alpha + \beta}$.

Consistency of biased estimators

Theorem. A biased estimator $\hat{\theta}_n$ for $\theta$ is a consistent estimator of $\theta$ if
$$0 = \lim_{n \to \infty} MSE(\hat{\theta}_n) = \lim_{n \to \infty} E[(\hat{\theta}_n - \theta)^2] = \lim_{n \to \infty}\left[ V(\hat{\theta}_n) + B(\hat{\theta}_n)^2 \right] = \lim_{n \to \infty} V(\hat{\theta}_n) + \lim_{n \to \infty} B(\hat{\theta}_n)^2.$$
So $\hat{\theta}_n$ is a consistent estimator of $\theta$ if $\lim_{n \to \infty} V(\hat{\theta}_n) = 0$ and $\lim_{n \to \infty} B(\hat{\theta}_n) = 0$, since $\lim_{n \to \infty} B(\hat{\theta}_n) = 0$ implies $\lim_{n \to \infty} B(\hat{\theta}_n)^2 = 0$.

Proof. Again by Chebyshev's inequality,
$$0 \le \lim_{n \to \infty} P[|\hat{\theta}_n - \theta| > \epsilon]
  = \lim_{n \to \infty} P[(\hat{\theta}_n - \theta)^2 > \epsilon^2]
  \le \lim_{n \to \infty} \frac{E[(\hat{\theta}_n - \theta)^2]}{\epsilon^2}
  = \frac{1}{\epsilon^2}\lim_{n \to \infty} E[(\hat{\theta}_n - \theta)^2].$$
If $\lim_{n \to \infty} E[(\hat{\theta}_n - \theta)^2] = 0$, then $\hat{\theta}_n \stackrel{p}{\to} \theta$.

Consistency of biased estimators: Example

Suppose $Y_1, Y_2, \ldots \stackrel{iid}{\sim} Ber(p)$. Is the Bayes estimator $\hat{p} = \frac{1 + \sum_{i=1}^n y_i}{2 + n}$ consistent for $p$?

$$E[\hat{p}] = E\!\left[\frac{1 + \sum_{i=1}^n y_i}{2 + n}\right] = \frac{1 + \sum_{i=1}^n E[y_i]}{2 + n} = \frac{1 + np}{2 + n}$$

$$\lim_{n \to \infty} B[\hat{p}] = \lim_{n \to \infty} \left(E[\hat{p}] - p\right) = \lim_{n \to \infty}\left(\frac{1 + np}{2 + n} - p\right) = p - p = 0$$

$$V[\hat{p}] = V\!\left[\frac{1 + \sum_{i=1}^n y_i}{2 + n}\right] = \frac{\sum_{i=1}^n V[y_i]}{(2 + n)^2} = \frac{np(1 - p)}{(2 + n)^2}$$

$$\lim_{n \to \infty} V[\hat{p}] = \lim_{n \to \infty} \frac{np(1 - p)}{(2 + n)^2} = p(1 - p) \lim_{n \to \infty} \frac{n}{(2 + n)^2} = p(1 - p) \cdot 0 = 0$$

Since both the bias and the variance go to zero as $n \to \infty$, the Bayes estimator is consistent.
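The bias and variance formulas just derived are easy to check numerically. The sketch below is my addition, not from the handout; the true value $p = 0.3$ is an arbitrary choice.

```python
# Bias and variance of the Bayes estimator p_hat = (1 + sum(y)) / (2 + n)
# for Y_i ~ iid Ber(p), using the formulas derived above:
#   bias(n)     = (1 + n*p) / (2 + n) - p
#   variance(n) = n * p * (1 - p) / (2 + n)^2
p = 0.3

for n in (10, 100, 1000, 10000):
    bias = (1 + n * p) / (2 + n) - p
    var = n * p * (1 - p) / (2 + n) ** 2
    print(f"n = {n:6d}   bias = {bias: .5f}   variance = {var:.6f}")
```

Both columns shrink toward zero as n grows, matching the limits computed above.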
Consistency of biased estimators: Example (continued)

Bayes estimator for the binomial:

[Figure: sequential values of the Bayes estimator $\hat{p}$ plotted against $n$ from 0 to 1000, vertical axis from 0.0 to 0.5, with a horizontal reference line labeled "Truth"; the sequential estimates concentrate around the true $p$ as $n$ increases.]
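A plot of this kind can be regenerated with a short simulation. The sketch below is my addition, not from the handout; the true $p$ and the seed are arbitrary.

```python
import numpy as np

# Track the Bayes estimator p_hat_n = (1 + sum_{i<=n} y_i) / (2 + n)
# as Bernoulli(p) observations arrive sequentially.
rng = np.random.default_rng(1)
p_true, n_max = 0.3, 1000

y = rng.binomial(1, p_true, size=n_max)
n = np.arange(1, n_max + 1)
p_hat = (1 + np.cumsum(y)) / (2 + n)

# Early estimates are pulled toward 1/2 by the prior; later ones hug p_true.
for k in (1, 10, 100, 1000):
    print(f"n = {k:4d}   p_hat = {p_hat[k - 1]:.4f}")
```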
Important consistent estimators: Sample variance

Theorem. If $\hat{\theta}_n$ converges in probability to $\theta$ and $\hat{\omega}_n$ converges in probability to $\omega$, then
$\hat{\theta}_n + \hat{\omega}_n$ converges in probability to $\theta + \omega$;
$\hat{\theta}_n \hat{\omega}_n$ converges in probability to $\theta\omega$;
if $\omega \ne 0$, $\hat{\theta}_n / \hat{\omega}_n$ converges in probability to $\theta / \omega$; and
if $g(\cdot)$ is a real-valued function that is continuous at $\theta$, then $g(\hat{\theta}_n)$ converges in probability to $g(\theta)$.

Corollary. Let $Y_1, Y_2, \ldots$ be iid random variables with $E(Y_i) = \mu$, $V(Y_i) = \sigma^2$, and $E(Y_i^4) < \infty$. Define
$$S_n^2 = \frac{1}{n-1}\sum_{i=1}^n (Y_i - \bar{Y}_n)^2 \quad\text{and}\quad \bar{Y}_n = \frac{1}{n}\sum_{i=1}^n Y_i.$$
Then $S_n^2$ is a consistent estimator for $\sigma^2$.

Theorem. If $U_n \stackrel{d}{\to} N(0, 1)$ and $W_n \stackrel{p}{\to} 1$, then $U_n / W_n \stackrel{d}{\to} N(0, 1)$.

Corollary. Let $Y_1, Y_2, \ldots$ be iid random variables with $E(Y_i) = \mu$ and $V(Y_i) = \sigma^2$. Define
$$S_n^2 = \frac{1}{n-1}\sum_{i=1}^n (Y_i - \bar{Y}_n)^2 \quad\text{and}\quad \bar{Y}_n = \frac{1}{n}\sum_{i=1}^n Y_i.$$
Then
$$\sqrt{n}\,\frac{\bar{Y}_n - \mu}{S_n} \stackrel{d}{\to} N(0, 1).$$
(A simulation sketch of this last result appears at the end of these notes.)

Next time

Sufficiency of estimators
Rao-Blackwell Theorem and MVUE (9.5)
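As a closing illustration of the last corollary, the sketch below (my addition, assuming exponential data with an arbitrarily chosen mean) simulates many samples, forms $\sqrt{n}(\bar{Y}_n - \mu)/S_n$, and checks that roughly 95% of the values fall within $\pm 1.96$, as the limiting $N(0,1)$ distribution predicts.

```python
import numpy as np

# Check that sqrt(n) * (Ybar_n - mu) / S_n is approximately N(0, 1)
# even for non-normal data (here Y_i ~ Exponential with mean mu = 2).
rng = np.random.default_rng(2)
mu, n, reps = 2.0, 200, 10000

y = rng.exponential(scale=mu, size=(reps, n))
ybar = y.mean(axis=1)
s = y.std(axis=1, ddof=1)            # S_n uses the 1/(n-1) divisor
t = np.sqrt(n) * (ybar - mu) / s

coverage = np.mean(np.abs(t) <= 1.96)
print(f"mean = {t.mean(): .3f}   sd = {t.std():.3f}   P(|T| <= 1.96) ~ {coverage:.3f}")
```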