lecture-05

Course: STAT 36-754, Spring 2006
School: Carnegie Mellon
Chapter 5: Stationary One-Parameter Processes

Section 5.1 describes the three main kinds of stationarity: strong, weak, and conditional. Section 5.2 relates stationary processes to the shift operators introduced in the last chapter, and to measure-preserving transformations more generally.

5.1 Kinds of Stationarity

Stationary processes are those which are, in some sense, the same at different times; slightly more formally, they are invariant under translation in time. There are three particularly important forms of stationarity: strong or strict, weak, and conditional.

Definition 49 (Strong Stationarity) A one-parameter process is strongly stationary or strictly stationary when all its finite-dimensional distributions are invariant under translation of the indices. That is, for all $\tau \in T$ and all $J \in \mathrm{Fin}(T)$,

    $\mathcal{L}(X_J) = \mathcal{L}(X_{J+\tau})$    (5.1)

Notice that when the parameter is discrete, we can get away with just checking the distributions of blocks of consecutive indices.

Definition 50 (Weak Stationarity) A one-parameter process is weakly stationary or second-order stationary when, for all $t \in T$,

    $\mathbf{E}[X_t] = \mathbf{E}[X_0]$    (5.2)

and for all $t, \tau \in T$,

    $\mathbf{E}[X_\tau X_{\tau+t}] = \mathbf{E}[X_0 X_t]$    (5.3)

At this point, you should check that a weakly stationary process has time-invariant correlations. (We will say much more about this later.) You should also check that strong stationarity implies weak stationarity. It will turn out that weak and strong stationarity coincide for Gaussian processes, but not in general.

Definition 51 (Conditional (Strong) Stationarity) A one-parameter process is conditionally stationary if its conditional distributions are invariant under time-translation: for all $n \in \mathbb{N}$, for every set of $n+1$ indices $t_1, \ldots, t_{n+1} \in T$ with $t_i < t_{i+1}$, and every shift $\tau$,

    $\mathcal{L}(X_{t_{n+1}} \mid X_{t_1}, X_{t_2}, \ldots, X_{t_n}) = \mathcal{L}(X_{t_{n+1}+\tau} \mid X_{t_1+\tau}, X_{t_2+\tau}, \ldots, X_{t_n+\tau})$    (5.4)

(a.s.). Strict stationarity implies conditional stationarity, but the converse is not true, in general.
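The gap between stationarity and conditional stationarity is easy to see numerically. The following sketch (my own illustration, not an example from the notes; the two-state transition matrix is an arbitrary choice) simulates a homogeneous Markov chain started away from its invariant distribution: the marginal law of $X_t$ changes with $t$, yet the one-step conditional law, as in Eq. 5.4, does not.

```python
# Sketch (not from the notes): a homogeneous two-state Markov chain,
# with an arbitrary illustrative transition matrix P, started in state 0
# rather than in its invariant distribution.
import random

random.seed(1)
# P[i][j] = probability of moving from state i to state j
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(i):
    return 0 if random.random() < P[i][0] else 1

# Simulate many independent copies of the chain.
paths = []
for _ in range(20000):
    x, path = 0, [0]
    for _ in range(5):
        x = step(x)
        path.append(x)
    paths.append(path)

# The marginal distribution of X_t changes with t, so the chain is not
# (strongly) stationary:
for t in (0, 1, 5):
    frac = sum(p[t] == 0 for p in paths) / len(paths)
    print("P(X_%d = 0) ~ %.3f" % (t, frac))

# But the conditional next-step distribution is the same at every t,
# which is conditional stationarity in the sense of Eq. 5.4:
for t in (0, 4):
    sel = [p for p in paths if p[t] == 0]
    frac = sum(p[t + 1] == 0 for p in sel) / len(sel)
    print("P(X_%d = 0 | X_%d = 0) ~ %.3f" % (t + 1, t, frac))
```

With this matrix the marginal $P(X_t = 0)$ should drift from 1 toward the invariant value $5/6$, while both estimated conditional frequencies should stay near 0.9.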
(Homogeneous Markov processes, for instance, are all conditionally stationary, but most are not stationary.) Many methods which are normally presented using strong stationarity can be adapted to processes which are merely conditionally stationary. (For more on conditional stationarity, see Caires and Ferreira (2005).)

Strong stationarity will play an important role in what follows, because it is the natural generalization of the IID assumption to situations with dependent variables: we allow for dependence, but the probabilistic set-up remains, in a sense, unchanging. This will turn out to be enough to let us learn a great deal about the process from observation, just as in the IID case.

5.2 Strictly Stationary Processes and Measure-Preserving Transformations

The shift-operator representation of Section 4.2 is particularly useful for strongly stationary processes.

Theorem 52 A process $X$ with distribution $\mu$ is strongly stationary if and only if $\mu$ is shift-invariant, i.e., $\mu = \mu\Sigma_\tau^{-1}$ for all $\Sigma_\tau$ in the time-evolution semi-group.

Proof: "If" (invariant distributions imply stationarity): For any finite collection of indices $J$, $\mathcal{L}(X_J) = \mu\pi_J^{-1}$ (Lemma 25), and similarly $\mathcal{L}(X_{J+\tau}) = \mu\pi_{J+\tau}^{-1}$. Then

    $\mathcal{L}(X_{J+\tau}) = \mu\pi_{J+\tau}^{-1}$    (5.5)
    $= \mu(\pi_J \Sigma_\tau)^{-1}$    (5.6)
    $= \mu\Sigma_\tau^{-1}\pi_J^{-1}$    (5.7)
    $= \mu\pi_J^{-1}$    (5.8)
    $= \mathcal{L}(X_J)$    (5.9)

"Only if": The statement that $\mu = \mu\Sigma_\tau^{-1}$ really means that, for any set $A \in \mathcal{X}^T$, $\mu(A) = \mu(\Sigma_\tau^{-1}A)$. Suppose $A$ is a finite-dimensional cylinder set. Then the equality holds, because all the finite-dimensional distributions agree (by hypothesis). But this means that $X$ and $\Sigma_\tau X$ are two processes with the same finite-dimensional distributions, and so their infinite-dimensional distributions agree (Theorem 23), and the equality holds on all measurable sets $A$.

This can be generalized somewhat.

Definition 53 (Measure-Preserving Transformation) A measurable mapping $F$ from a measurable space $(\Xi, \mathcal{X})$ into itself preserves measure $\mu$ iff, for all $A \in \mathcal{X}$, $\mu(A) = \mu(F^{-1}A)$, i.e., iff $\mu = \mu F^{-1}$.
This is true just when $F(X) \stackrel{d}{=} X$, when $X$ is a $\Xi$-valued random variable with distribution $\mu$. We will often say that $F$ is measure-preserving, without qualification, when the context makes it clear which measure is meant.

Remark on the definition. It is natural to wonder why we write the defining property as $\mu = \mu F^{-1}$, rather than $\mu = \mu F$. There is actually a subtle difference, and the former is stronger than the latter. To see this, unpack the statements, yielding respectively

    $\forall A \in \mathcal{X}, \; \mu(A) = \mu(F^{-1}(A))$    (5.10)
    $\forall A \in \mathcal{X}, \; \mu(A) = \mu(F(A))$    (5.11)

To see that Eq. 5.10 implies Eq. 5.11, pick any measurable set $B$, and then apply 5.10 to $F(B)$ (which is in $\mathcal{X}$, because $F$ is measurable). To go the other way, from 5.11 to 5.10, it would have to be the case that for every $A \in \mathcal{X}$ there exists $B \in \mathcal{X}$ such that $A = F(B)$, i.e., every measurable set would have to be the image, under $F$, of another measurable set. This is not necessarily the case; it would require, for starters, that $F$ be onto (surjective).

Theorem 52 says that every stationary process can be represented by a measure-preserving transformation, namely the shift. Since measure-preserving transformations arise in many other ways, however, it is useful to know about the processes they generate.

Corollary 54 If $F$ is a measure-preserving transformation on $\Xi$ and $X$ is a $\Xi$-valued random variable, then the sequence $F^n(X)$, $n \in \mathbb{N}$, is strongly stationary.

Proof: Consider shifting the sequence $F^n(X)$ by one: the $n$th term in the shifted sequence is $F^{n+1}(X) = F^n(F(X))$. But since $\mathcal{L}(F(X)) = \mathcal{L}(X)$, by hypothesis, $\mathcal{L}(F^{n+1}(X)) = \mathcal{L}(F^n(X))$, and the measure is shift-invariant. So, by Theorem 52, the process $F^n(X)$ is stationary.

Exercise 5.1 (Functions of Stationary Processes) Use Corollary 54 to show that if $g$ is any measurable function on $\Xi$, then the sequence $g(F^n(X))$ is also stationary.
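Corollary 54 is cheap to probe numerically. Here is a sketch using a measure-preserving transformation of my own choosing (not one from the notes): the circle rotation $F(x) = (x + a) \bmod 1$ preserves the uniform (Lebesgue) measure on $[0,1)$, so starting from a uniform $X$, every iterate $F^n(X)$ should again be uniform, making the sequence $F^n(X)$ strongly stationary.

```python
# Sketch: the rotation F(x) = (x + a) mod 1 preserves uniform measure on
# [0, 1); by Corollary 54, the sequence F^n(X) is strongly stationary
# when X is uniform.  The angle a is an arbitrary illustrative choice.
import random

random.seed(2)
a = 0.137

def F(x):
    return (x + a) % 1.0

xs = [random.random() for _ in range(50000)]
for n in range(4):
    mean = sum(xs) / len(xs)
    var = sum((x - mean) ** 2 for x in xs) / len(xs)
    # The uniform law on [0, 1) has mean 1/2 and variance 1/12 ~ 0.0833;
    # both should be (approximately) unchanged by each application of F.
    print("n=%d  mean=%.3f  var=%.4f" % (n, mean, var))
    xs = [F(x) for x in xs]
```

Replacing the summary statistics of `xs` with those of `g(x) for x in xs`, for any measurable `g`, gives an empirical illustration of Exercise 5.1.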
Exercise 5.2 (Continuous Measure-Preserving Families of Transformations) Let $F_t$, $t \in \mathbb{R}^+$, be a semi-group of measure-preserving transformations, with $F_0$ being the identity. Prove the analog of Corollary 54, i.e., that $F_t(X)$, $t \in \mathbb{R}^+$, is a stationary process.

Exercise 5.3 (The Logistic Map as an M.P.T.) The logistic map with $a = 4$ is a measure-preserving transformation, and the measure it preserves has the density $1/\pi\sqrt{x(1-x)}$ (on the unit interval).

1. Verify that this density is invariant under the action of the logistic map.

2. Simulate the logistic map with uniformly distributed $X_0$. What happens to the density of $X_t$ as $t \to \infty$?
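Part 2 of Exercise 5.3 can be tried directly by simulation. The sketch below uses a summary statistic of my own choosing rather than a full density estimate: the fraction of sample points within 0.1 of an endpoint. Under the uniform law this fraction is 0.2; under the arcsine density $1/\pi\sqrt{x(1-x)}$ it is $(4/\pi)\arcsin\sqrt{0.1} \approx 0.41$.

```python
# Sketch for Exercise 5.3, part 2: push a large uniform sample through
# the logistic map x -> 4 x (1 - x) and watch mass pile up near the
# endpoints, as the arcsine density 1/(pi sqrt(x(1-x))) predicts.
import random

random.seed(3)

def logistic(x):
    return 4.0 * x * (1.0 - x)

def tail_fraction(sample):
    # fraction of points within 0.1 of an endpoint of [0, 1]
    return sum(x < 0.1 or x > 0.9 for x in sample) / len(sample)

xs = [random.random() for _ in range(100000)]
print("t = 0 :", round(tail_fraction(xs), 3))   # ~0.2 for uniform X_0
for _ in range(10):
    xs = [logistic(x) for x in xs]
print("t = 10:", round(tail_fraction(xs), 3))
```

If the density of $X_t$ approaches the arcsine density, the second printed number should already be near 0.41 after only ten iterations.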