Week 5, Lecture 9: A lower bound by Tsybakov

Parameter space $\Theta = \{\theta_0, \theta_1, \dots, \theta_M\}$, with

$$d(\theta_i, \theta_j) \ge 2s \quad \text{for all } 0 \le i \ne j \le M. \tag{1}$$

Usually $s$ is the rate of convergence obtained by a specific procedure, and $d$ is a distance related to the loss function.

Reduction to bounds in probability

For any estimator $\hat\theta$, which may not lie in $\Theta$, define the test $\psi^* = \arg\min_j d(\hat\theta, \theta_j)$. Since the $\theta_j$ are $2s$-separated by (1), $d(\hat\theta, \theta_j) < s$ implies $\psi^* = j$. Then, using Markov's inequality in the first step,

$$\inf_{\hat\theta} \sup_{\theta \in \Theta} \mathbb{E}\, d^2(\hat\theta, \theta) \;\ge\; s^2 \inf_{\hat\theta} \sup_{\theta \in \Theta} P\big(d(\hat\theta, \theta) \ge s\big) \;\ge\; s^2 \inf_{\hat\theta} \max_{0 \le j \le M} P_j(\psi^* \ne j) \;\ge\; s^2 \inf_{\psi} \max_{0 \le j \le M} P_j(\psi \ne j).$$

Usually we construct the parameter set in such a way that the minimax probability of error

$$p_{e,M} = \inf_{\psi} \max_{0 \le j \le M} P_j(\psi \ne j) \ge c$$

for some fixed constant $c > 0$; a lower bound $c s^2$ then follows.

Lower bound for the minimax probability of error

$$p_{e,M} \;\ge\; \sup_{\tau > 0} \frac{\tau M}{1 + \tau M}\, \bar p_\tau, \tag{2}$$

where $\bar p_\tau = \frac{1}{M} \sum_{j=1}^{M} P_j(A_j)$ with $A_j = \left\{ \frac{dP_0}{dP_j} > \tau \right\}$.
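As a numerical sanity check (not part of the lecture), bound (2) can be evaluated in a concrete model. The sketch below assumes Gaussian location hypotheses $P_0 = N(0,1)$ and $P_j = N(\theta_j, 1)$ with $\theta_j > 0$, for which the likelihood ratio event $A_j$ has the closed form $P_j(A_j) = \Phi(-\theta_j/2 - \log\tau/\theta_j)$; the function and variable names (`tsybakov_bound`, `thetas`, `taus`) are ours, chosen for illustration.

```python
# Illustrative sketch: evaluate lower bound (2) for Gaussian location
# hypotheses P_0 = N(0,1), P_j = N(theta_j, 1), theta_j > 0.
import numpy as np
from scipy.stats import norm

def tsybakov_bound(thetas, taus):
    """Evaluate sup over the grid `taus` of [tau*M/(1+tau*M)] * p_bar(tau),
    where p_bar(tau) = (1/M) sum_j P_j(dP0/dPj > tau).

    Here dP0/dPj(x) = exp(theta_j**2/2 - theta_j*x), so under P_j = N(theta_j,1)
    the event {dP0/dPj > tau} has probability Phi(-theta_j/2 - log(tau)/theta_j).
    """
    thetas = np.asarray(thetas, dtype=float)
    M = len(thetas)
    best = 0.0
    for tau in taus:
        p_bar = norm.cdf(-thetas / 2.0 - np.log(tau) / thetas).mean()
        best = max(best, tau * M / (1.0 + tau * M) * p_bar)
    return best

# Example: M = 4 hypotheses at distance 1 from the null; the bound stays
# bounded away from 0, giving the fixed constant c in the rate argument.
bound = tsybakov_bound(thetas=[1.0, 1.0, 1.0, 1.0],
                       taus=np.logspace(-2, 1, 200))
print(bound)
```

Note that the supremum over $\tau$ is only approximated on a finite grid here, so the printed value is itself a valid (slightly conservative) lower bound on $p_{e,M}$.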
Harrison H. Zhou, '09