Lecture Notes 2: Random Variables, Random Processes, Noise
EECS 455 (Univ. of Michigan), Fall 2012, September 7, 2012

Probability, Random Variables

Probability is a mapping from a set of outcomes (events) of an
experiment to numbers in the interval [0, 1].
Random variables are mappings from outcomes to real numbers.

Probability, Random Variables: Example

Experiment: Flip an unbiased coin 10 times.
Probability: P(HTHTTHTHHT) = 2^(−10).
Random Variable: X = number of heads. X(HTHTTHTHHT) = 5.
With C(n, k) denoting the binomial coefficient "n choose k":

    P(X = 8) = C(10, 8) (1/2)^8 (1/2)^2.

    E[X] = Σ_{k=0}^{10} k P{X = k} = Σ_{k=0}^{10} k C(10, k) (1/2)^k (1/2)^(10−k).

    E[X²] = Σ_{k=0}^{10} k² P{X = k} = Σ_{k=0}^{10} k² C(10, k) (1/2)^k (1/2)^(10−k).
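The coin-flip sums above can be checked numerically. The following is a minimal sketch, not part of the original notes, using only the Python standard library:

```python
# Verify the 10-flip fair-coin example: P(X = 8), E[X], and E[X^2].
from math import comb

n, p = 10, 0.5

# P(X = 8) = C(10, 8) (1/2)^8 (1/2)^2
p8 = comb(n, 8) * p**8 * (1 - p)**2
print(p8)  # 45/1024 ≈ 0.0439

# E[X] = sum_k k C(10, k) (1/2)^k (1/2)^(10-k)
mean = sum(k * comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1))

# E[X^2] = sum_k k^2 C(10, k) (1/2)^k (1/2)^(10-k)
second = sum(k**2 * comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1))

print(mean)    # 5.0  (= np)
print(second)  # 27.5 (= np(1-p) + (np)^2 = 2.5 + 25)
```

As expected for a Binomial(10, 1/2) random variable, E[X] = np = 5 and E[X²] = np(1−p) + (np)² = 27.5.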
Gaussian Random Variables
A Gaussian random variable X with mean µ and variance σ² has density

    f_X(x) = (1 / (√(2π) σ)) e^(−(x − µ)² / (2σ²))

    P{X ≤ x} = Φ((x − µ) / σ)

    P{X > x} = Q((x − µ) / σ)
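Φ and Q have no closed form, but both can be evaluated from the complementary error function. A minimal sketch (not part of the notes; function names are my own):

```python
# Evaluate the standard normal CDF Φ and tail function Q via math.erfc.
from math import erfc, sqrt

def phi(x):
    """Standard normal CDF: P{X <= x} for X ~ N(0, 1)."""
    return 0.5 * erfc(-x / sqrt(2))

def q(x):
    """Gaussian tail function: Q(x) = 1 - Φ(x)."""
    return 0.5 * erfc(x / sqrt(2))

# For X ~ N(mu, sigma^2): P{X <= x} = Φ((x - mu)/sigma), P{X > x} = Q((x - mu)/sigma)
mu, sigma = 1.0, 2.0
print(phi((3.0 - mu) / sigma))  # Φ(1) ≈ 0.8413
print(q((3.0 - mu) / sigma))    # Q(1) ≈ 0.1587
```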
[Figures: plots of the density f_X(x), the CDF Φ(x), and the tail probability Q(x), each for x from −5 to 5.]

Two independent Gaussian random variables

The (weighted) sum of two independent Gaussian random
variables is a Gaussian random variable.
Two jointly Gaussian (real) random variables are independent if they
are uncorrelated.
The joint density of two independent, identically distributed
Gaussian random variables is circularly symmetric.

Two dimensional Gaussian
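The circular symmetry is easy to see numerically: the joint density of two i.i.d. zero-mean Gaussians depends on (x1, x2) only through the radius √(x1² + x2²). A small check, not part of the original notes:

```python
# The joint density of independent X1, X2 ~ N(0, sigma^2) is constant on circles.
from math import exp, pi, cos, sin

sigma = 1.0

def f_joint(x1, x2):
    """Joint density of independent X1, X2 ~ N(0, sigma^2)."""
    return (1.0 / (2 * pi * sigma**2)) * exp(-(x1**2 + x2**2) / (2 * sigma**2))

# Sample the density at several points on the same circle of radius 1.5:
r = 1.5
values = [f_joint(r * cos(t), r * sin(t)) for t in (0.0, 0.7, 2.1, 4.4)]
print(values)  # essentially equal: the density is circularly symmetric
```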
[Figure: surface plot of the joint density f_{X1,X2}(x1, x2) of two i.i.d. zero-mean Gaussians, for x1 and x2 from −4 to 4.]

Rotation of a random vector
Consider a pair of random variables nc and ns that are independent,
zero mean and jointly Gaussian, each with variance σ². Because they
are independent and zero mean, E[nc ns] = 0. Also E[nc²] = E[ns²] = σ².
Now consider a transformation of those random variables that is just a
rotation. That is, if n = nc + j ns and w = n e^(jφ), then

    w = wc + j ws = (nc + j ns)(cos(φ) + j sin(φ))
      = nc cos(φ) − ns sin(φ) + j (ns cos(φ) + nc sin(φ))

so that

    wc = nc cos(φ) − ns sin(φ)
    ws = ns cos(φ) + nc sin(φ)

[Figure: the vector n = (nc, ns) rotated by angle φ into w = (wc, ws).]

Statistics of wc and ws
    E[wc] = E[nc cos(φ) − ns sin(φ)] = E[nc] cos(φ) − E[ns] sin(φ) = 0

    E[ws] = E[ns cos(φ) + nc sin(φ)] = E[ns] cos(φ) + E[nc] sin(φ) = 0

    Var[wc] = Var[nc cos(φ) − ns sin(φ)]
            = E[(nc cos(φ) − ns sin(φ))²]
            = E[nc²] cos²(φ) − 2 E[nc ns] cos(φ) sin(φ) + E[ns²] sin²(φ)
            = σ² cos²(φ) + σ² sin²(φ) = σ²

    Var[ws] = E[ns²] cos²(φ) + E[nc²] sin²(φ)
            = σ² cos²(φ) + σ² sin²(φ) = σ²

    E[wc ws] = E[(nc cos(φ) − ns sin(φ))(ns cos(φ) + nc sin(φ))]
             = E[nc ns] cos²(φ) + E[nc²] cos(φ) sin(φ)
               − E[ns²] sin(φ) cos(φ) − E[nc ns] sin²(φ)
             = 0 · cos²(φ) + σ² cos(φ) sin(φ) − σ² sin(φ) cos(φ) − 0 · sin²(φ)
             = 0

Representation of a random vector
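The rotation derivation can be confirmed by simulation. A Monte Carlo sketch (not from the notes; sample size and seed are arbitrary choices):

```python
# After rotating independent zero-mean Gaussians (nc, ns) by angle phi,
# wc and ws should still have mean 0, variance sigma^2, and be uncorrelated.
import random
from math import cos, sin, pi

random.seed(0)
sigma, phi_angle = 2.0, pi / 5
N = 200_000

nc = [random.gauss(0, sigma) for _ in range(N)]
ns = [random.gauss(0, sigma) for _ in range(N)]

wc = [c * cos(phi_angle) - s * sin(phi_angle) for c, s in zip(nc, ns)]
ws = [s * cos(phi_angle) + c * sin(phi_angle) for c, s in zip(nc, ns)]

mean_wc = sum(wc) / N                              # estimates E[wc] = 0
var_wc = sum(x * x for x in wc) / N                # estimates Var[wc] = sigma^2
corr = sum(a * b for a, b in zip(wc, ws)) / N      # estimates E[wc ws] = 0

print(mean_wc)  # ≈ 0
print(var_wc)   # ≈ sigma^2 = 4
print(corr)     # ≈ 0
```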
Now consider a representation of a random vector using a different
coordinate system. Suppose the coordinate system rotates by an
angle θ . Let u0 and u1 be the representation of the random vector in
the new coordinate system.
u0 = nc cos(θ ) + ns sin(θ )
u1 = ns cos(θ ) − nc sin(θ )
We can (in a similar manner to the previous case) calculate the mean
and the variance of u0 and u1 . Assume that nc and ns are
independent, identically distributed Gaussian random variables with
mean 0 and variance σ 2 . Then u0 and u1 are Gaussian with
E [u0 ] = E [u1 ] = 0
Var[u0 ] = Var[u1 ] = σ 2 .
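A change of coordinates is just a rotation of the representation, so it preserves the vector's length as well as the statistics. A one-point numerical check (not part of the notes; the sample values are hypothetical):

```python
# The rotated-coordinate representation (u0, u1) of (nc, ns) preserves length.
from math import cos, sin, pi, isclose, hypot

theta = pi / 7
nc, ns = 1.3, -0.4   # hypothetical sample values of the random vector

u0 = nc * cos(theta) + ns * sin(theta)
u1 = ns * cos(theta) - nc * sin(theta)

# Length is preserved under rotation:
print(isclose(hypot(u0, u1), hypot(nc, ns)))  # True
```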
[Figure: the vector n = (nc, ns) expressed in the rotated coordinate system (u0, u1), with the new axes at angle θ.]

Random Processes

Random processes are indexed (by time) random variables.
X(t) is a random process if for each time t, X(t) is a random variable.

Example: X(t) is Gaussian distributed for each t with mean 0 and
variance σ², and X(t) and X(s) are independent for t ≠ s.

We can characterize the statistical properties of these random variables:

    E[X(t)] = 0, Var[X(t)] = σ², E[X²(t)] = σ², E[X(t)X(s)] = 0 for t ≠ s.
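A discrete-time stand-in for this example process is a sequence of i.i.d. N(0, σ²) samples. A simulation sketch, not part of the original notes (sample size and seed are arbitrary):

```python
# Estimate E[X(t)], E[X^2(t)], and E[X(t)X(s)] for an i.i.d. Gaussian process.
import random

random.seed(1)
sigma = 1.0
N = 100_000

x = [random.gauss(0, sigma) for _ in range(N)]   # samples of X(t)
y = [random.gauss(0, sigma) for _ in range(N)]   # independent samples of X(s), s != t

mean = sum(x) / N                                 # estimates E[X(t)] = 0
second_moment = sum(v * v for v in x) / N         # estimates E[X^2(t)] = sigma^2
cross = sum(a * b for a, b in zip(x, y)) / N      # estimates E[X(t)X(s)] = 0

print(mean)           # ≈ 0
print(second_moment)  # ≈ 1
print(cross)          # ≈ 0
```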
This note was uploaded on 02/12/2014 for the course EECS 455 taught by Professor Stark during the Fall '08 term at University of Michigan.