Stochastic Process, NCTUEE, 12/8/2006

Lecture 10: Random Process

Summary

In this lecture, I will discuss:
- Definition of a random process
- Poisson counting process
- Random processes in LTI systems
- Power spectral density
- Markov processes

Notation

We will use the following notation rules, unless otherwise noted, to represent symbols during this course:
- Boldface upper case letters represent matrices.
- Boldface lower case letters represent vectors.
- Superscripts $(\cdot)^T$ and $(\cdot)^H$ denote transpose and Hermitian (conjugate transpose), respectively.
- Upper case italic letters represent random variables.
Figure 1: Illustration of a random process.

1 Random Process

(1) Definition (Random Process)
Let $\varepsilon \in \Omega$ be an outcome of the sample space $\Omega$. Let $X(t,\varepsilon)$ be a mapping of the sample space $\Omega$ into a space of continuous-time functions. This mapping is called a random process if, at each fixed time, the mapping is a random variable.

A simple example: Consider $X(t,\varepsilon) = A \cos(\omega_0 t + \theta(\varepsilon))$, where $\theta(\varepsilon)$ is a random variable. Then $X(t,\varepsilon)$ is a random process; a simulation sketch follows the remarks below.

Remarks:
- A random process is a two-variable function that evolves in time.
- For a fixed outcome $\varepsilon$, say $\varepsilon_1$, $X(t,\varepsilon_1)$ is called a sample function and is a non-random (deterministic) function. That is, once we know what the outcome $\varepsilon$ is, the sample function associated with that $\varepsilon$ is also determined.
- For fixed $t$, say $t_1$, $X(t_1)$ is a random variable.
- If we sample the random process at $N$ times $t_1, \ldots, t_N$, we form a random vector $\left[X(t_1), X(t_2), \ldots, X(t_N)\right]^T$.
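To make the two-variable view concrete, here is a minimal simulation sketch of the random-phase sinusoid above. The amplitude, frequency, and the uniform distribution of $\theta$ on $[0, 2\pi)$ are illustrative assumptions, not values specified in these notes. Each draw of $\theta$ gives one sample function, and freezing a time index gives a random variable across outcomes.

```python
# Minimal sketch: sample functions of X(t, eps) = A*cos(w0*t + theta(eps)).
# Assumptions (not stated in the notes): A = 1, w0 = 2*pi, theta uniform on [0, 2*pi).
import numpy as np

rng = np.random.default_rng(0)

A, w0 = 1.0, 2.0 * np.pi            # assumed amplitude and angular frequency
t = np.linspace(0.0, 2.0, 501)       # time grid

# Each outcome eps gives one realization of theta, hence one sample function X(t, eps).
thetas = rng.uniform(0.0, 2.0 * np.pi, size=5)
sample_functions = np.array([A * np.cos(w0 * t + th) for th in thetas])

# Fixing t = t1 and varying the outcome gives the random variable X(t1).
t1_index = 100
X_t1 = sample_functions[:, t1_index]
print("X(t1) across outcomes:", X_t1)
```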
(2) A random process $X(t)$ is statistically specified by its complete set of $n$th-order probability distribution (or density) functions
$$F_X(x_1, x_2, \ldots, x_n;\; t_1, t_2, \ldots, t_n)$$
for all $x_1, x_2, \ldots, x_n$ and for all times $t_1 < t_2 < \cdots < t_n$.

(3) The mean function, autocorrelation function, and autocovariance function are defined as follows (a sample-average estimation sketch is given after item (4)).

Mean function:
$$\mu_X(t) \triangleq E\!\left[X(t)\right] = \int_{-\infty}^{\infty} x\, f_X(x; t)\, dx.$$

Autocorrelation function: for all $t_1$ and $t_2$,
$$R_X(t_1, t_2) \triangleq E\!\left[X(t_1) X^*(t_2)\right] = \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} x_1 x_2^*\, f_X(x_1, x_2; t_1, t_2)\, dx_1\, dx_2.$$

Autocovariance function: for all $t_1$ and $t_2$,
$$K_X(t_1, t_2) \triangleq E\!\left[\big(X(t_1) - \mu_X(t_1)\big)\big(X(t_2) - \mu_X(t_2)\big)^*\right] = R_X(t_1, t_2) - \mu_X(t_1)\, \mu_X^*(t_2).$$

(4) Definition (Independent Increments)
A random process is said to have independent increments if the $n$ random variables
$$X(t_1),\; X(t_2) - X(t_1),\; \ldots,\; X(t_n) - X(t_{n-1})$$
are jointly independent for $t_1 < t_2 < \cdots < t_n$ and for all $n > 1$.
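The following sketch estimates $\mu_X(t)$ and $R_X(t_1, t_2)$ by averaging over many independently drawn sample functions. It reuses the random-phase sinusoid of Section 1 with the same assumed parameters ($A = 1$, $\omega_0 = 2\pi$, $\theta$ uniform on $[0, 2\pi)$); for that choice the theoretical autocorrelation is $(A^2/2)\cos(\omega_0 (t_1 - t_2))$, which the sample average should approach.

```python
# Sketch: estimate the mean and autocorrelation functions by averaging over outcomes.
# The process and its parameters are the assumed random-phase sinusoid from Section 1.
import numpy as np

rng = np.random.default_rng(1)
A, w0 = 1.0, 2.0 * np.pi
t = np.linspace(0.0, 2.0, 201)
n_outcomes = 20000

thetas = rng.uniform(0.0, 2.0 * np.pi, size=(n_outcomes, 1))
X = A * np.cos(w0 * t + thetas)            # shape (n_outcomes, len(t)); each row is a sample function

mu_hat = X.mean(axis=0)                    # estimates mu_X(t); theory gives 0 for this process
R_hat = (X.T @ X) / n_outcomes             # R_hat[i, j] estimates E[X(t_i) X(t_j)] (real-valued process)
K_hat = R_hat - np.outer(mu_hat, mu_hat)   # estimates the autocovariance K_X(t_i, t_j)

# Compare against the theoretical R_X(t1, t2) = (A**2 / 2) * cos(w0 * (t1 - t2)).
R_theory = 0.5 * A**2 * np.cos(w0 * (t[:, None] - t[None, :]))
print("max |R_hat - R_theory| =", np.max(np.abs(R_hat - R_theory)))
```

With 20,000 outcomes the maximum deviation from the theoretical autocorrelation is on the order of a few times $10^{-2}$, shrinking as the number of outcomes grows.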
2 Examples

(1) (Asynchronous Binary Signaling) In the absence of noise and interference, we can model the continuous-time received binary signal $X(t)$ of a communication link by the ABS process
$$X(t) = \sum_{n} X_n \cdot w\!\left(\frac{t - D - nT}{T}\right),$$
where the $X_n \in \{-1, +1\}$ are equally likely, $D$ is the unknown delay typi-
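The preview cuts off before the pulse shape $w(\cdot)$ and the distribution of $D$ are specified, so the following is a purely illustrative sketch. It assumes $w$ is a unit rectangular pulse on $[0, 1)$, $D$ is uniform on $[0, T)$, and a finite number of symbols, and it generates one sample function of the ABS waveform.

```python
# Sketch: one sample function of the ABS process X(t) = sum_n X_n * w((t - D - n*T) / T).
# Assumptions (not specified in the visible text): rectangular pulse w on [0, 1),
# delay D uniform on [0, T), finite number of symbols.
import numpy as np

rng = np.random.default_rng(2)

T = 1.0                                              # symbol period
n_symbols = 8
symbols = rng.choice([-1.0, 1.0], size=n_symbols)    # X_n in {-1, +1}, equally likely
D = rng.uniform(0.0, T)                              # assumed model for the unknown delay

def w(u):
    """Assumed pulse shape: 1 on [0, 1), 0 elsewhere."""
    return ((u >= 0.0) & (u < 1.0)).astype(float)

t = np.linspace(0.0, (n_symbols + 1) * T, 1000)
X_t = sum(x_n * w((t - D - n * T) / T) for n, x_n in enumerate(symbols))

print("delay D =", D)
print("first few samples of X(t):", X_t[:5])
```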