EE378 Statistical Signal Processing    Lecture 1    04/04/2007

Introductory Lecture

Lecturer: Tsachy Weissman    Scribe: Tsachy Weissman

Whether the phenomena encountered in the world around us are truly random, or randomness is just a way of expressing our limited understanding of the deterministic mechanisms that govern them, is a question we are not going to get into (and can leave to philosophers of science and quantum physicists). What is indisputable is that modeling many of these phenomena as random processes has been instrumental, in ways that cannot be overstated, in the previous and current centuries, in all fields of engineering as well as the physical, biological, and social sciences.

This course is dedicated to the study of various aspects of random processes, insofar as they are instrumental in providing guidelines, algorithms, and performance guarantees for some prototypical signal processing problems arising in electrical engineering and related fields.

1 Outline

Let's start with a rough outline of the kinds of questions we will consider.

1.1 Characterization

What is a (discrete-time) random signal, or process, and what does it take to specify it? For example, the Kolmogorov extension theorem asserts that it suffices to specify the finite-dimensional distributions.

1.2 Properties of Random Signals

Examples of properties:

- Stationarity (strict and weak).
- Ergodicity: to what extent is the observed process, or signal, reliable in the sense that we can infer information about the various parameters governing it from our observations?
- Gaussianity: Gaussian processes are fundamental for several reasons. For one thing, they arise as natural models of many phenomena due to CLT considerations. Also, in the absence of more than second-order statistics, they often correspond to a worst-case, conservative model. No less importantly, the Gaussianity assumption often allows us to obtain optimal schemes in closed form.
- Markovity.
- Martingale property: the conditional expected value of an observation at time t, given all the observations up to some earlier time s, is equal to the observation at that earlier time s. Though traditionally given less attention in engineering classes, this property is no less fundamental than the Markov property: many processes arising naturally in many fields are martingales, and, no less importantly, martingales often arise in analyzing the performance of various signal processing schemes.

We will study such properties and some of their implications.

1.3 Representation of Random Processes

Examples:

- Spectral representation: suppose X(n) is wide-sense stationary (WSS) with autocovariance r(k). The spectral density is

    S_{XX}(e^{j\omega}) = \mathcal{F}(\{r(k)\}) = \sum_k r(k) e^{-j\omega k},

  which is a spectral representation of the autocovariance. We also have the inverse relationship

    r(k) = \mathcal{F}^{-1}[S_{XX}(e^{j\omega})](k) = \frac{1}{2\pi} \int_{-\pi}^{\pi} S_{XX}(e^{j\omega}) e^{j\omega k} \, d\omega.
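As a concrete illustration of the spectral representation, here is a minimal numerical sketch (not from the lecture). It assumes, purely as an example, a unit-variance AR(1)-style autocovariance r(k) = a^{|k|}, whose spectral density has the standard closed form (1 - a^2) / (1 - 2a cos(omega) + a^2); the truncated sum over |k| <= K should match it closely.

```python
import numpy as np

def spectral_density_truncated(r, omegas, K):
    """Approximate S_XX(e^{jw}) = sum_k r(k) e^{-jwk}, truncating at |k| <= K.

    The sum is real-valued for a symmetric autocovariance r(k) = r(-k)."""
    ks = np.arange(-K, K + 1)
    terms = r(np.abs(ks))[None, :] * np.exp(-1j * omegas[:, None] * ks[None, :])
    return np.real(terms.sum(axis=1))

a = 0.6                              # example AR(1) parameter (arbitrary choice)
r = lambda k: a ** k                 # autocovariance at lag k: r(k) = a^{|k|}
omegas = np.linspace(-np.pi, np.pi, 201)

S_trunc = spectral_density_truncated(r, omegas, K=200)
S_exact = (1 - a**2) / (1 - 2 * a * np.cos(omegas) + a**2)  # known closed form

print(np.max(np.abs(S_trunc - S_exact)))  # truncation error; tiny since a^K -> 0
```

The truncation level K only needs to be large enough that a^K is negligible, since the autocovariance decays geometrically.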
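Returning to the martingale property from Section 1.2, it can be checked by Monte Carlo for the textbook example of a simple symmetric random walk (this example, the times s and t, and the sample size are illustrative choices, not from the lecture): the conditional mean of X(t) given X(s) = x should be x itself.

```python
import numpy as np

rng = np.random.default_rng(0)
n_paths, s, t = 100_000, 10, 30

# Simple symmetric random walk: X(n) is a sum of i.i.d. +/-1 steps (a martingale).
steps = rng.choice([-1, 1], size=(n_paths, t))
X = np.cumsum(steps, axis=1)
Xs, Xt = X[:, s - 1], X[:, t - 1]

# Martingale check: E[X(t) | X(s) = x] should equal x for each observed level x.
# (X(10) is even with certainty, so we condition on a few even levels.)
for x in [-4, -2, 0, 2, 4]:
    cond_mean = Xt[Xs == x].mean()
    print(x, cond_mean)   # conditional mean lands close to x
```

Note that the walk is also Markov; the martingale check here only probes the conditional mean, which is exactly what the martingale property constrains.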
This note was uploaded on 03/03/2011 for the course EE 378, taught by Professor Weissman, during the Spring '07 term at Stanford.