• It is a weighted average of all possible outcomes, where the weights are the probabilities of the outcomes.
Discrete case: Suppose X can take n possible values, denoted as x_1, x_2, …, x_n (keep in mind the notational convention). Let p_i = Prob(X = x_i), with ∑_{i=1}^{n} p_i = 1 and p_i ≥ 0 for any i.
Then

E(X) ≡ μ = ∑_{i=1}^{n} p_i x_i = p_1x_1 + p_2x_2 + … + p_nx_n,

where E(·) is the expectation operator, and μ is used to denote a mean.
Example: Flip a coin; if heads you get 1 dollar, and if tails you get nothing.
If it is a fair coin, you are expected to get 0.5 × 1 + 0.5 × 0 = 0.5 dollars. That is, the
expected value of flipping a coin is 0.5 dollars, even though any single flip
yields either 1 dollar or nothing.
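The weighted-average formula above can be sketched in a few lines of Python; the function name `expected_value` is just a label for this sketch, and the payoffs and probabilities are the fair-coin example from the text.

```python
# Minimal sketch of the discrete expected-value formula E(X) = sum(p_i * x_i).
def expected_value(outcomes, probs):
    """Weighted average of outcomes, with probabilities as weights."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(p * x for p, x in zip(probs, outcomes))

coin_payoffs = [1.0, 0.0]   # heads pays 1 dollar, tails pays nothing
coin_probs = [0.5, 0.5]     # fair coin
print(expected_value(coin_payoffs, coin_probs))  # prints 0.5
```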
Facts about expected values
1. For any constant a, E(a) = a. A constant does not vary!
Example: E(3) = 3
2. For any constants a and b, E(a + bX) = a + bE(X)
Example: In the previous GPA and study time example, the estimated
regression line is Y = 1 + 0.1X, where Y = GPA and X = study time per week
measured in hours.
Suppose the population mean of study time is 15 hours per week, that
is, E(X) = 15. Then GPA is expected to be E(Y) = 1 + 0.1·E(X) = 1 + 0.1 × 15 = 2.5
3. For any constants {a_1, a_2, …, a_k} and r.v.'s {X_1, X_2, …, X_k},
E(∑_{i=1}^{k} a_i X_i) = ∑_{i=1}^{k} a_i E(X_i).
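Fact 3 can also be verified on a small example. The joint distribution of (X_1, X_2) below is made up for this sketch; note that linearity of expectation holds whether or not the variables are independent.

```python
# Checking E(a1*X1 + a2*X2) = a1*E(X1) + a2*E(X2) on a hypothetical
# joint distribution; X1 and X2 need not be independent.
joint = {  # (x1, x2): probability
    (0, 0): 0.2,
    (0, 1): 0.3,
    (1, 0): 0.1,
    (1, 1): 0.4,
}
a1, a2 = 2.0, 3.0

e_x1 = sum(p * x1 for (x1, x2), p in joint.items())    # E(X1) = 0.5
e_x2 = sum(p * x2 for (x1, x2), p in joint.items())    # E(X2) = 0.7
e_combo = sum(p * (a1 * x1 + a2 * x2) for (x1, x2), p in joint.items())
# Both sides come out to 3.1 (up to float rounding).
print(e_combo, a1 * e_x1 + a2 * e_x2)
```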