Lecture 6
Probability Generating Function
• Definition: The probability generating function (PGF) of a PMF $p_X(k) = P(X = k)$, $k = 0, 1, \ldots$, is defined as

$$G_X(z) = E[z^X] = \sum_{k=0}^{\infty} p_X(k)\, z^k .$$

• Theorem: Let $p_X(k)$ denote the PMF of an RV $X$ with probability generating function (PGF) $G_X(z)$ differentiable at $z = 0$. We then have

$$p_X(k) = \frac{1}{k!} \left. \frac{d^k G_X(z)}{dz^k} \right|_{z=0}.$$
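The theorem can be checked symbolically: differentiating a PGF $k$ times at $z = 0$ and dividing by $k!$ recovers each PMF value. A minimal sketch using sympy; the Binomial(3, 1/2) distribution is an illustrative choice, not from the lecture:

```python
import sympy as sp

z = sp.symbols('z')
p, n = sp.Rational(1, 2), 3
# Binomial(3, 1/2) PGF: G(z) = (1 - p + p*z)**n
G = (1 - p + p * z) ** n
# Theorem: p_X(k) = (1/k!) * d^k G / dz^k evaluated at z = 0
pmf = [sp.diff(G, z, k).subs(z, 0) / sp.factorial(k) for k in range(n + 1)]
# pmf -> [1/8, 3/8, 3/8, 1/8], the Binomial(3, 1/2) PMF
```

The list comprehension is just the theorem applied for $k = 0, \ldots, n$; the coefficients of $G_X(z)$ as a polynomial in $z$ are exactly the PMF values.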
• Note: If we differentiate and let $z = 1$, we recover the moments of the PMF:

$$\left. \frac{dG_X(z)}{dz} \right|_{z=1} = \left. \sum_k p_X(k)\, k\, z^{k-1} \right|_{z=1} = \sum_k k\, p_X(k) = E[X]$$

$$\left. \frac{d^2 G_X(z)}{dz^2} \right|_{z=1} = \left. \sum_k p_X(k)\, k(k-1)\, z^{k-2} \right|_{z=1} = E[X(X-1)] = E[X^2] - E[X]$$

• Or

$$E[X] = G_X'(1), \qquad \mathrm{Var}(X) = G_X''(1) + G_X'(1) - \left( G_X'(1) \right)^2 .$$
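These moment formulas can also be verified symbolically; a minimal sketch for the Poisson($\lambda$) distribution, whose PGF $e^{\lambda(z-1)}$ is a standard fact (the choice of Poisson is illustrative, not from the lecture):

```python
import sympy as sp

z, lam = sp.symbols('z lam', positive=True)
# Poisson(lam) PGF: G(z) = exp(lam*(z - 1))
G = sp.exp(lam * (z - 1))
Gp = sp.diff(G, z)        # G'(z)
Gpp = sp.diff(G, z, 2)    # G''(z)
mean = sp.simplify(Gp.subs(z, 1))                   # E[X] = G'(1)
var = sp.simplify(Gpp.subs(z, 1) + mean - mean**2)  # Var = G''(1) + G'(1) - G'(1)^2
# Both simplify to lam, matching the known Poisson mean and variance.
# Recover a PMF value via the theorem: p_X(2) = G''(0) / 2!
p2 = sp.simplify(Gpp.subs(z, 0) / sp.factorial(2))  # lam**2 * exp(-lam) / 2
```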
Large Deviations and tail probabilities
• Computing probabilities for an arbitrary distribution is generally difficult.
• Bounds on probabilities are sufficient in many practical cases, e.g. $P(X \ge a) < \beta$, etc.

Why are they useful?
✓ Exact computation is avoided
✓ Easier to evaluate
Markov Inequality
• Assume $X$ is an RV in $[0, \infty)$; we can get a good estimate of the probability that $X$ lies in some interval.
• Lemma: For a nonnegative RV $X$ with finite mean $E[X]$, we have, for all $a > 0$,

$$P(X \ge a) \le \frac{E[X]}{a}.$$

• Proof:

$$E[X] = \int_0^{\infty} x f_X(x)\,dx = \int_0^{a} x f_X(x)\,dx + \int_a^{\infty} x f_X(x)\,dx \ge \int_a^{\infty} x f_X(x)\,dx \ge a \int_a^{\infty} f_X(x)\,dx = a\, P(X \ge a).$$

• Or $P(X \ge a) \le \dfrac{E[X]}{a}$.
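The lemma can be sanity-checked by simulation; a minimal sketch, where the Exponential(1) distribution and the threshold $a = 3$ are illustrative choices, not from the lecture:

```python
import random

random.seed(0)
# Exponential RV with rate 1, so X >= 0 and E[X] = 1
n = 100_000
samples = [random.expovariate(1.0) for _ in range(n)]
a = 3.0
empirical = sum(x >= a for x in samples) / n  # estimate of P(X >= a)
markov_bound = 1.0 / a                        # E[X] / a
# The exact tail is exp(-3) ~ 0.05, comfortably below the bound 1/3:
# Markov is valid but often loose.
```

The gap between the empirical tail and the bound illustrates the note above: the first moment only gives an order-of-magnitude estimate.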
Probability Bounds
• Note: The first moment, as expected, provides an estimate of the order of a probability.

Chebyshev Inequality
• Variance is a measure of concentration of an RV near its mean.
• This implies that we can use such a quantity to estimate the likelihood of $X$ being in a neighborhood $(\eta - \varepsilon,\ \eta + \varepsilon)$ of its mean $\eta$.
• Theorem: For any $\varepsilon > 0$,

$$P(|X - \eta| \ge \varepsilon) \le \frac{\sigma^2}{\varepsilon^2}.$$

• Since:

$$P(|X - \eta| \ge \varepsilon) = \int_{-\infty}^{\eta - \varepsilon} f_X(x)\,dx + \int_{\eta + \varepsilon}^{\infty} f_X(x)\,dx = \int_{|x - \eta| \ge \varepsilon} f_X(x)\,dx$$

$$\sigma^2 = \int_{-\infty}^{\infty} (x - \eta)^2 f_X(x)\,dx \ge \int_{|x - \eta| \ge \varepsilon} (x - \eta)^2 f_X(x)\,dx \ge \varepsilon^2 \int_{|x - \eta| \ge \varepsilon} f_X(x)\,dx = \varepsilon^2\, P(|X - \eta| \ge \varepsilon)$$

• Or $P(|X - \eta| \ge \varepsilon) \le \dfrac{\sigma^2}{\varepsilon^2}$.
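Chebyshev's inequality can be checked by simulation in the same way; a minimal sketch, where the standard Gaussian ($\eta = 0$, $\sigma^2 = 1$) and $\varepsilon = 2$ are illustrative choices, not from the lecture:

```python
import random

random.seed(0)
# Standard Gaussian: mean eta = 0, variance sigma^2 = 1
n = 100_000
samples = [random.gauss(0.0, 1.0) for _ in range(n)]
eps = 2.0
empirical = sum(abs(x) >= eps for x in samples) / n  # P(|X - eta| >= eps)
chebyshev_bound = 1.0 / eps**2                       # sigma^2 / eps^2 = 0.25
# The true two-sided Gaussian tail at eps = 2 is about 0.046,
# so the bound 0.25 holds but, like Markov, is loose.
```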
Fall '08
Krim