By Theorem 4.24, the conditional PDF of X given Y is
\[
f_{X|Y}(x|y) = \frac{f_{X,Y}(x,y)}{f_Y(y)} =
\begin{cases}
\dfrac{1}{1-y} & y \le x \le 1 \\
0 & \text{otherwise}
\end{cases}
\tag{4}
\]
That is, since Y ≤ X ≤ 1, X is uniform over [y, 1] when Y = y. The conditional expectation of X given Y = y can be calculated as
\[
E[X|Y=y] = \int_{-\infty}^{\infty} x f_{X|Y}(x|y)\,dx
\tag{5}
\]
\[
= \int_y^1 \frac{x}{1-y}\,dx
= \left.\frac{x^2}{2(1-y)}\right|_y^1
= \frac{1+y}{2}
\tag{6}
\]
In fact, since we know that the conditional PDF of X is uniform over [y, 1] when Y = y, it wasn't really necessary to perform the calculation.
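As a quick numerical sanity check (a sketch, not part of the original solution; the helper name `cond_mean` is illustrative), we can integrate x against the uniform conditional density 1/(1 − y) on [y, 1] by the midpoint rule and compare against (1 + y)/2:

```python
# Sketch: check E[X | Y = y] = (1 + y)/2 by midpoint-rule integration of
# x * f_{X|Y}(x|y), where f_{X|Y}(x|y) = 1/(1 - y) on [y, 1].
def cond_mean(y, n=100_000):
    """Midpoint-rule approximation of E[X | Y = y]."""
    dx = (1.0 - y) / n
    return sum((y + (i + 0.5) * dx) * (1.0 / (1.0 - y)) * dx for i in range(n))

for y in (0.0, 0.25, 0.5, 0.9):
    # Midpoint rule is exact for a linear integrand, up to rounding error.
    assert abs(cond_mean(y) - (1.0 + y) / 2.0) < 1e-9
print("E[X | Y = y] = (1 + y)/2 confirmed numerically")
```

The midpoint rule is exact for the linear integrand here, so agreement is limited only by floating-point rounding.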
Problem 4.9.5 Solution
Random variables X and Y have joint PDF

[Figure: the support region is the triangle 0 ≤ y ≤ x ≤ 1 in the x, y plane.]

\[
f_{X,Y}(x,y) =
\begin{cases}
2 & 0 \le y \le x \le 1 \\
0 & \text{otherwise}
\end{cases}
\tag{1}
\]
For 0 ≤ x ≤ 1, the marginal PDF for X satisfies
\[
f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dy = \int_0^x 2\,dy = 2x
\tag{2}
\]
Note that f_X(x) = 0 for x < 0 or x > 1. Hence the complete expression for the marginal PDF of X is
\[
f_X(x) =
\begin{cases}
2x & 0 \le x \le 1 \\
0 & \text{otherwise}
\end{cases}
\tag{3}
\]
The conditional PDF of Y given X = x is
\[
f_{Y|X}(y|x) = \frac{f_{X,Y}(x,y)}{f_X(x)} =
\begin{cases}
1/x & 0 \le y \le x \\
0 & \text{otherwise}
\end{cases}
\tag{4}
\]
Given X = x, Y has a uniform PDF over [0, x] and thus has conditional expected value E[Y|X = x] = x/2. Another way to obtain this result is to calculate \(\int_{-\infty}^{\infty} y f_{Y|X}(y|x)\,dy\).
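Both steps can be sanity-checked numerically (a sketch under the stated joint PDF; the helper name `marginal_and_cond_mean` is illustrative): integrate out y at a fixed x to recover the marginal 2x, then integrate y against the conditional density to recover x/2.

```python
# Sketch: from the joint PDF f_{X,Y}(x, y) = 2 on 0 <= y <= x <= 1, check the
# marginal f_X(x) = 2x and the conditional mean E[Y | X = x] = x/2.
def marginal_and_cond_mean(x, n=100_000):
    """Midpoint-rule integration over y at a fixed x."""
    dy = x / n
    mids = [(i + 0.5) * dy for i in range(n)]
    f_x = sum(2.0 * dy for _ in mids)               # integral of f_{X,Y}(x, y) over y
    e_y = sum(y * (2.0 / f_x) * dy for y in mids)   # integral of y * f_{Y|X}(y|x)
    return f_x, e_y

for x in (0.2, 0.5, 0.8):
    f_x, e_y = marginal_and_cond_mean(x)
    assert abs(f_x - 2.0 * x) < 1e-9
    assert abs(e_y - x / 2.0) < 1e-9
print("f_X(x) = 2x and E[Y | X = x] = x/2 confirmed numerically")
```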
Problem 4.9.9 Solution
Random variables N and K have the joint PMF
\[
P_{N,K}(n,k) =
\begin{cases}
\dfrac{100^n e^{-100}}{(n+1)!} & k = 0, 1, \ldots, n;\ n = 0, 1, \ldots \\
0 & \text{otherwise}
\end{cases}
\tag{1}
\]
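As a sanity check on this PMF (a numerical sketch, not part of the original solution): for each n the inner sum over k has n + 1 equal terms, so the sum collapses to \(\sum_n 100^n e^{-100}/n!\), the Poisson(100) total of 1. The check below works in log space via `math.lgamma` to avoid over/underflow, and truncates the sum at n = 400, where the Poisson(100) tail is negligible.

```python
import math

# Sketch: verify that P_{N,K}(n, k) = 100**n * exp(-100) / (n+1)! over
# k = 0, ..., n and n = 0, 1, ... sums to 1.  Each n contributes n+1 equal
# terms, i.e. the Poisson(100) probability 100**n * exp(-100) / n!.
total = 0.0
for n in range(400):                # tail beyond n = 400 is negligible
    # log of a single (n, k) term; lgamma(n + 2) = log((n + 1)!)
    log_p = n * math.log(100.0) - 100.0 - math.lgamma(n + 2)
    total += (n + 1) * math.exp(log_p)
assert abs(total - 1.0) < 1e-9
print("joint PMF sums to 1 (up to truncation):", total)
```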