Chapter 8: Introduction to Random Processes

8.1 Introduction
8.2 Classification of Random Processes
8.3 Characterizing a Random Process
8.4 Crosscorrelation and Crosscovariance Functions
8.5 Stationary Random Processes
8.6 Ergodic Random Processes
8.7 Power Spectral Density
8.8 Discrete-Time Random Processes
8.9 Chapter Summary
8.10 Problems

8.1 Introduction

Chapters 1 to 7 were devoted to the study of probability theory. In those chapters we were concerned with the outcomes of random experiments and the random variables used to represent them. This chapter deals with the dynamics of probability theory: the concept of a random process enlarges the random-variable concept to include time. Thus, instead of thinking of a random variable X that maps an event s ∈ S, where S is the sample space, to some number X(s), we think of how the random variable maps the event to different numbers at different times. That is, instead of the number X(s) we deal with X(t, s), where t ∈ T and T, called the parameter set of the process, is usually a set of times.

Random processes are widely encountered in such fields as communications, control, management science, and time series analysis. Examples of random processes include population growth, the failure of a piece of equipment, the price of a given stock over time, and the number of calls that arrive at a switchboard.

If we fix the sample point s, then X(t) is some real function of time; for each s we obtain one such function. Thus X(t, s) can be viewed as a collection of time functions, one for each sample point s, as shown in Figure 8.1. On the other hand, if we fix t, we have a function X(s) that depends only on s and is therefore a random variable. A random process thus becomes a random variable when time is fixed at some particular value.
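The two views above (fix s to get a time function, fix t to get a random variable) can be sketched in a short simulation. This is not from the text: the choice of a Gaussian random walk as the process, and all function and variable names, are illustrative assumptions. Here the sample point s is modeled as an RNG seed.

```python
import random

def sample_path(s, num_steps):
    """One realization X(t, s): fixing the sample point s (the seed)
    yields an ordinary deterministic function of time."""
    rng = random.Random(s)
    x, path = 0.0, []
    for _ in range(num_steps):
        x += rng.gauss(0.0, 1.0)  # accumulate Gaussian increments (a random walk)
        path.append(x)
    return path

# Ensemble view: a collection of time functions, one per sample point s.
ensemble = {s: sample_path(s, 10) for s in range(5)}

# Fixing t instead (say t = 9) gives a random variable: its value
# varies across the sample points s.
values_at_t9 = [ensemble[s][9] for s in range(5)]
```

Because each path is fully determined by its seed, re-running `sample_path(s, 10)` for the same s reproduces the same time function, which mirrors the idea that X(t, s) is a fixed function of t once s is chosen.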
With many values of t we obtain a collection of random variables. Thus, we can define a random process as a family of random variables {X(t, s) | t ∈ T, s ∈ S} defined over a given probability space and indexed by the time parameter t. A random process is also called a stochastic process.

Consider a communication system example. Assume we have a set of possible messages that can be transmitted over a channel; the set of possible messages then constitutes our sample space. For each message M generated by our source, we transmit an associated waveform X(t, s) over the channel. The channel is not perfect: it selectively adds a noise waveform N(t, s) to the original waveform, so that what is seen at the receiver is a random signal R(t, s) that is the sum of the transmitted waveform and the noise waveform. That is,

R(t, s) = X(t, s) + N(t, s)

[Figure 8.1: A Sample Random Process]

Because the noise waveform is probabilistically selected by the channel, different noise waveforms can be associated not only with the same transmitted waveform...
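The channel relation R(t, s) = X(t, s) + N(t, s) can be sketched sample by sample. This is a minimal illustration, not the book's model: the bipolar pulse shaping, the noise level `sigma`, and the function name `transmit` are all assumptions.

```python
import random

def transmit(message_bits, sigma=0.5, seed=0):
    """Hypothetical channel sketch: a +/-1 pulse per message bit is the
    transmitted waveform X(t, s); the channel adds Gaussian noise N(t, s)."""
    rng = random.Random(seed)
    x = [1.0 if b else -1.0 for b in message_bits]  # transmitted waveform X(t, s)
    n = [rng.gauss(0.0, sigma) for _ in x]          # noise waveform N(t, s)
    r = [xi + ni for xi, ni in zip(x, n)]           # received signal R = X + N
    return x, n, r

x, n, r = transmit([1, 0, 1, 1], sigma=0.5, seed=42)
```

Re-running with a different seed leaves X unchanged but produces a different N, matching the point that the channel probabilistically selects the noise waveform for each transmission.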