Chapter 5 summary
Continuous Random Variables
• Definition of a continuous random variable: A random variable X is called continuous (it is also called absolutely continuous) if there exists a function f_X with the property that for any two numbers a, b with a < b, the probability that X lies between a and b is

P(a < X < b) = ∫_a^b f_X(x) dx.

The function f_X is called the probability density function of X, or just the density function of X. It follows from this that for any a, P(X = a) = 0, so the above probability is the same as P(a ≤ X ≤ b).
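As a quick numerical illustration (the density f_X(x) = 2x on [0, 1] below is an assumed example, not from the text), an interval probability is just the integral of the density over that interval:

```python
# Assumed example density (not from the text): f_X(x) = 2x on [0, 1], else 0.
def f(x):
    return 2.0 * x if 0.0 <= x <= 1.0 else 0.0

def integrate(g, a, b, n=100_000):
    """Composite midpoint-rule approximation of the integral of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

# P(0.25 < X < 0.75) = integral of f_X from 0.25 to 0.75;
# the exact value here is 0.75^2 - 0.25^2 = 0.5.
p = integrate(f, 0.25, 0.75)
print(round(p, 4))  # ≈ 0.5
```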
• General properties of a probability density function: f_X(x) ≥ 0 for all x ∈ R, and

∫_{-∞}^{∞} f_X(x) dx = 1.
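Both properties can be sanity-checked numerically; the density f_X(x) = 2x on [0, 1] used here is an assumed example:

```python
# Assumed example density: f_X(x) = 2x on [0, 1], else 0.
def f(x):
    return 2.0 * x if 0.0 <= x <= 1.0 else 0.0

def integrate(g, a, b, n=100_000):
    """Composite midpoint-rule approximation of the integral of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

# Property 1: the density is nonnegative at every point we probe.
assert all(f(x) >= 0 for x in [-0.5, 0.0, 0.3, 1.0, 2.0])
# Property 2: the density integrates to 1 over the whole real line
# (f is zero outside [0, 1], so integrating over [0, 1] suffices).
total = integrate(f, 0.0, 1.0)
print(round(total, 6))  # ≈ 1.0
```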
• Relation between the distribution and the density of a continuous random variable: The distribution function is still defined to be F_X(x) = P(X ≤ x). We have F′_X = f_X, and

F_X(x) = ∫_{-∞}^{x} f_X(t) dt.
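A sketch of both directions of this relation, again using the assumed example density f_X(x) = 2x on [0, 1] (whose CDF has the closed form F_X(x) = x² on [0, 1]):

```python
# Assumed example density: f_X(x) = 2x on [0, 1], else 0.
def f(x):
    return 2.0 * x if 0.0 <= x <= 1.0 else 0.0

def F(x, n=100_000):
    """CDF F_X(x) = integral of f_X(t) dt from -inf to x (f is 0 below 0)."""
    if x <= 0.0:
        return 0.0
    h = x / n
    return sum(f((i + 0.5) * h) for i in range(n)) * h

# Density -> distribution: F_X(0.5) should be 0.5^2 = 0.25.
print(round(F(0.5), 4))  # ≈ 0.25
# Distribution -> density: F'(0.5) should recover f(0.5) = 1.
eps = 1e-6
d = (F(0.5 + eps) - F(0.5 - eps)) / (2 * eps)
print(round(d, 3))  # ≈ 1.0
```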
• Expected value of a continuous random variable:

E[X] = ∫_{-∞}^{∞} x f_X(x) dx.
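Numerically, for the assumed example density f_X(x) = 2x on [0, 1], the integral gives E[X] = 2/3:

```python
# Assumed example density: f_X(x) = 2x on [0, 1], else 0.
def f(x):
    return 2.0 * x if 0.0 <= x <= 1.0 else 0.0

def integrate(g, a, b, n=100_000):
    """Composite midpoint-rule approximation of the integral of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

# E[X] = integral of x * f_X(x) dx; the exact value here is 2/3.
mean = integrate(lambda x: x * f(x), 0.0, 1.0)
print(round(mean, 4))  # ≈ 0.6667
```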
• Expected value of a function of a continuous random variable:

E[g(X)] = ∫_{-∞}^{∞} g(x) f_X(x) dx.    (1)
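For instance, taking g(x) = x² in formula (1), with the assumed example density f_X(x) = 2x on [0, 1], gives E[X²] = 1/2 without ever finding the distribution of X²:

```python
# Assumed example density: f_X(x) = 2x on [0, 1], else 0.
def f(x):
    return 2.0 * x if 0.0 <= x <= 1.0 else 0.0

def integrate(g, a, b, n=100_000):
    """Composite midpoint-rule approximation of the integral of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

# E[X^2] = integral of x^2 * f_X(x) dx; the exact value here is 1/2.
second_moment = integrate(lambda x: x ** 2 * f(x), 0.0, 1.0)
print(round(second_moment, 4))  # ≈ 0.5
```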
• Variance and standard deviation of a continuous random variable: The variance of X is defined to be

Var(X) = E[(X − E(X))²] = ∫_{-∞}^{∞} (x − E(X))² f_X(x) dx,

but it is usually calculated using the simpler formula Var(X) = E[X²] − (E(X))². The standard deviation of X is the square root of the variance.
Spring '08 — Moumen, F.