Lecture Notes
IEOR 165, George Shanthikumar
Tu W Th 2-4:30 pm, 3113 Etcheverry
Goes over MOM & MLE example
Based on the graphical interpretation, we can decide that

    â_MLE = x_max = max{x_1, …, x_n}.

And then we solve for

    E[â_MLE] = E[X_max] = (n / (n + 1)) a.

Since E[â_MLE] = (n / (n + 1)) a ≠ a, this is a biased estimator.

So we multiply by (n + 1)/n:

    â_c = ((n + 1)/n) â_MLE = ((n + 1)/n) x_max

is an unbiased estimator of a.
To decide which is the better estimator, we check which has the smaller variance.
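The bias claim above can be checked by simulation. A minimal sketch (the values of a, n, and the trial count are illustrative choices, not from the notes):

```python
import random

# Monte Carlo check of the bias correction for Uniform(0, a).
# True a, sample size n, and trial count are made-up illustrative values.
random.seed(0)
a, n, trials = 5.0, 10, 20000

mle_sum = 0.0
corrected_sum = 0.0
for _ in range(trials):
    x_max = max(random.uniform(0, a) for _ in range(n))
    mle_sum += x_max                      # a_hat_MLE = x_max
    corrected_sum += (n + 1) / n * x_max  # a_hat_c = (n+1)/n * x_max

mle_mean = mle_sum / trials              # near n/(n+1)*a = 4.54..., biased low
corrected_mean = corrected_sum / trials  # near a = 5.0, unbiased
print(mle_mean, corrected_mean)
```

The average of the raw MLE comes out below a, while the corrected estimator averages out to a, matching the expectation computed above.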
Estimation & Hypothesis Testing of Bernoulli Population
Example: effectiveness of an ad campaign (find a statistically significant improvement).
Let X be a {0,1} r.v. (a Bernoulli r.v.) with probability of success p (= P{X = 1}).
Suppose we have a sample x_1, …, x_n and we want to estimate p.
Estimate p (use the average):

    E[X] = p, so p̂_MOM = x̄ = (1/n) Σ_{k=1}^n x_k.
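For a concrete sample the MOM estimate is just the sample mean. A quick sketch (the sample values are hypothetical, not from the notes):

```python
# Hypothetical sample of Bernoulli outcomes (illustrative only)
sample = [1, 0, 0, 1, 1, 0, 0, 0, 1]

# Method-of-moments estimate: p_hat_MOM = x_bar = (1/n) * sum of x_k
p_mom = sum(sample) / len(sample)
print(p_mom)  # 4 successes in 9 trials -> 0.444...
```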
The likelihood function is

    L(x; p) := p^(Σ_{k=1}^n x_k) · (1 - p)^(n - Σ_{k=1}^n x_k).
Understanding that

    f_X(x) = 1 - p if x = 0,    f_X(x) = p if x = 1,

we can write this as

    f_X(x) = p · I{x = 1} + (1 - p) · I{x = 0} = p^x (1 - p)^(1 - x).
So if we have a vector x = (1, 0, 0, 1, 1, 0, 0, 0, 1), then we multiply the corresponding factors:

    p · (1 - p) · (1 - p) · p · p · (1 - p) · (1 - p) · (1 - p) · p = p^4 (1 - p)^5.
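This factor-by-factor product can be checked against the closed form p^4 (1 - p)^5 at any p; a quick sketch (the value p = 0.3 is an arbitrary choice for the check):

```python
# Likelihood of the observed vector, built factor by factor
x = [1, 0, 0, 1, 1, 0, 0, 0, 1]
p = 0.3  # arbitrary value in (0, 1)

product = 1.0
for xi in x:
    product *= p if xi == 1 else 1 - p  # p for a success, 1-p for a failure

closed_form = p**4 * (1 - p)**5  # four 1's and five 0's in x
print(product, closed_form)
```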
So if we take the log of the likelihood function:

    log L(x; p) = (Σ_{k=1}^n x_k) log p + (n - Σ_{k=1}^n x_k) log(1 - p),    0 ≤ p ≤ 1.
Setting the derivative to zero:

    d/dp log L(x; p) = (1/p) Σ_{k=1}^n x_k - (1/(1 - p)) (n - Σ_{k=1}^n x_k) = 0,

which solves to

    p̂_MLE = (1/n) Σ_{k=1}^n x_k = x̄,

the same as the MOM estimate.
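As a numerical sanity check (a sketch with a made-up sample), maximizing log L over a grid of p values should land on the sample mean:

```python
import math

# Made-up Bernoulli sample for illustration
x = [1, 0, 0, 1, 1, 0, 0, 0, 1]
n, s = len(x), sum(x)  # n = 9 observations, s = 4 successes

def log_likelihood(p):
    # log L(x; p) = s*log(p) + (n - s)*log(1 - p)
    return s * math.log(p) + (n - s) * math.log(1 - p)

# Brute-force grid search over the open interval (0, 1)
grid = [i / 1000 for i in range(1, 1000)]
p_mle = max(grid, key=log_likelihood)
print(p_mle, s / n)  # both near 4/9 = 0.444...
```

The grid maximizer agrees with x̄ up to the grid spacing, as the first-order condition predicts.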
Summer '08, SHANTHIKUMAR