VERSION M
1. (8 points) You are given two r.v.s, Y1 and Y2, whose variances are V[Y1] = V[Y2] = 3 and whose covariance
is 2. You define a new r.v. U = 3Y1 − Y2. What is V[U]?
V[U] = a^T Σ a with a = (3, −1)^T and Σ = [[3, 2], [2, 3]], so V[U] = 9(3) + 1(3) − 2(3)(1)(2) = 27 + 3 − 12 = 18.
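As a quick arithmetic check, the quadratic-form computation can be reproduced in a few lines of NumPy. This is a sketch under the assumption, consistent with the worked solution, that the garbled expression is U = 3Y1 − Y2, i.e., coefficient vector a = (3, −1):

```python
import numpy as np

# Covariance matrix of (Y1, Y2): both variances are 3, the covariance is 2.
Sigma = np.array([[3.0, 2.0],
                  [2.0, 3.0]])

# U = 3*Y1 - Y2 corresponds to the coefficient vector a = (3, -1).
a = np.array([3.0, -1.0])

# V[U] = a^T Sigma a
var_U = a @ Sigma @ a
print(var_U)  # 18.0
```

This matches the hand computation: 9(3) + 1(3) − 2(3)(1)(2) = 18.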
VERVRDH 'M
1. (6 points) You repeatedly toss two fair twelve-sided dice. What is the expected number of throws you need to make until the first time you observe the same number on each die face (i.e., until you throw doubles)?
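Each throw of the two dice matches with probability 1/12, so the number of throws to the first match is geometric with p = 1/12 and expectation 1/p = 12. A small Monte Carlo sketch agrees (the function name, seed, and trial count are arbitrary choices, not from the source):

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def mean_throws_until_match(sides=12, trials=100_000):
    """Average number of throws of two fair dice until both show the same face."""
    total = 0
    for _ in range(trials):
        throws = 0
        while True:
            throws += 1
            if random.randint(1, sides) == random.randint(1, sides):
                break
        total += throws
    return total / trials

print(mean_throws_until_match())  # close to the theoretical value of 12
```

The same geometric argument gives an expectation of 8 for the eight-sided version of the problem.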
Notes 8: The Multivariate Normal
Associated Reading: Wackerly 7, Chapter 5, Section 10
In this notes set, we will introduce the multivariate normal distribution. Wackerly 7 barely touches upon it, by
VERSION B
1. (6 points) You repeatedly toss two fair eight-sided dice. What is the expected number of throws you need to make until the first time you observe the same number on each die face (i.e., until you throw doubles)?
Notes 12: Sampling Distributions and the Central Limit Theorem
Associated Reading: Wackerly 7, Chapter 7, Sections 1-4
This chapter will conclude the discussion of functions of random variables that
Notes 7: Multivariate Distributions: Expected Value
Associated Reading: Wackerly 7, Chapter 5, Sections 5-8 and 11
In Chapters 3 and 4, you were introduced to the expected value operator, which takes
Notes 9: Distributions of Functions of Random Variables
Associated Reading: Wackerly 7, Chapter 6, Sections 1-4
Let's start by establishing the point of this chapter. You've conducted
Notes 13: Some Basic Concepts from Information Theory
Associated Reading: e.g., Pattern Recognition and Machine Learning by Bishop, pp. 48-58
Information theory, which grew out of Claud
Notes 5: Commonly Used Continuous Distributions
Associated Reading: Wackerly 7, Chapter 4, Sections 4-8
In these notes, we shift from talking about discrete distributions to continuou
Notes 6: Multivariate Distributions
Associated Reading: Wackerly 7, Chapter 5, Sections 1-4
Up to now, we've concentrated on univariate probability distributions, i.e., distributions defined along any on
Notes 11: Order Statistics
Associated Reading: Wackerly 7, Chapter 6, Section 7
The background: you sample n iid r.v.s, denoted {Y1, Y2, . . . , Yn}, from some distribution. There
Notes 15: A Very Short Introduction to Markov Processes
Associated Reading: e.g., Introduction to Probability by Bertsekas & Tsitsiklis, pp. 313-321
Let's make a series of observations of the weather:
Notes 14: Some Basic Concepts from Probabilistic Graphical Modeling
Associated Reading: e.g., Pattern Recognition and Machine Learning by Bishop, pp. 359-383
(Note, however, that we will not be follow
VERSION Q
1. (8 points) You are given two r.v.s, Y1 and Y2, whose variances are V[Y1] = V[Y2] = 3 and whose covariance
is 2. You define a new r.v. U = 2Y1 − Y2. What is V[U]?
V[U] = a^T Σ a with a = (2, −1)^T and Σ = [[3, 2], [2, 3]], so V[U] = 4(3) + 1(3) − 2(2)(1)(2) = 12 + 3 − 8 = 7.