
4 Pages

### lecture-11

Course: STAT 36-754, Spring 2006
School: Michigan

Word Count: 967




#### Chapter 11: Markov Examples

Section 11.1 finds the transition kernels for the Wiener process, as an example of how to manipulate such things. Section 11.2 looks at the evolution of densities under the action of the logistic map; this shows how deterministic dynamical systems can be brought under the sway of the theory we've developed for Markov processes.

##### 11.1 Transition Kernels for the Wiener Process

We have previously defined the Wiener process (Examples 38 and 78) as the real-valued process on $\mathbb{R}_+$ with the following properties:

1. $W(0) = 0$;
2. For any three times $t_1 \leq t_2 \leq t_3$, $W(t_3) - W(t_2) \perp W(t_2) - W(t_1)$ (independent increments);
3. For any two times $t_1 \leq t_2$, $W(t_2) - W(t_1) \sim N(0, t_2 - t_1)$ (Gaussian increments);
4. Continuous sample paths (in the sense of Definition 72).

Here we will use the Gaussian-increment property to construct a transition kernel, and then use the independent-increment property to show that these kernels satisfy the Chapman-Kolmogorov equation, and hence that there exist Markov processes with the desired finite-dimensional distributions.

First, notice that the Gaussian-increments property gives us the transition probabilities:

$$
\begin{aligned}
P(W(t_2) \in B \mid W(t_1) = w_1) &= P(W(t_2) - W(t_1) \in B - w_1) &(11.1)\\
&= \int_{B - w_1} \frac{1}{\sqrt{2\pi (t_2 - t_1)}}\, e^{-\frac{u^2}{2(t_2 - t_1)}} \, du &(11.2)\\
&= \int_{B} \frac{1}{\sqrt{2\pi (t_2 - t_1)}}\, e^{-\frac{(w_2 - w_1)^2}{2(t_2 - t_1)}} \, dw_2 &(11.3)\\
&\equiv \mu_{t_1,t_2}(w_1, B) &(11.4)
\end{aligned}
$$

To show that $W(t)$ is a Markov process, we must show that, for any three times $t_1 \leq t_2 \leq t_3$, $\mu_{t_1,t_2}\,\mu_{t_2,t_3} = \mu_{t_1,t_3}$. Notice that $W(t_3) - W(t_1) = (W(t_3) - W(t_2)) + (W(t_2) - W(t_1))$. Because increments are independent, $W(t_3) - W(t_1)$ is the sum of two independent random variables, $W(t_3) - W(t_2)$ and $W(t_2) - W(t_1)$. The distribution of $W(t_3) - W(t_1)$ is then the convolution of the distributions of $W(t_3) - W(t_2)$ and $W(t_2) - W(t_1)$, which are $N(0, t_3 - t_2)$ and $N(0, t_2 - t_1)$ respectively.
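The convolution step above is easy to sanity-check by simulation. The following is a minimal sketch, not part of the notes (the times and sample size are arbitrary choices): draw independent Gaussian increments over $[t_1, t_2]$ and $[t_2, t_3]$ and confirm their sum has the mean and variance that the Gaussian-increments property demands for $W(t_3) - W(t_1)$.

```python
import numpy as np

# Sanity check: the sum of independent Gaussian increments over
# [t1, t2] and [t2, t3] should be N(0, t3 - t1), i.e. the
# convolution of N(0, t2 - t1) and N(0, t3 - t2).
rng = np.random.default_rng(42)
t1, t2, t3 = 0.5, 1.2, 3.0   # arbitrary ordered times
n = 200_000

inc12 = rng.normal(0.0, np.sqrt(t2 - t1), n)  # W(t2) - W(t1)
inc23 = rng.normal(0.0, np.sqrt(t3 - t2), n)  # W(t3) - W(t2)
inc13 = inc12 + inc23                          # W(t3) - W(t1)

print(np.mean(inc13), np.var(inc13))  # close to 0 and t3 - t1
```

The same Monte Carlo approach extends to checking the composed kernel $\mu_{t_1,t_2}\,\mu_{t_2,t_3}$ against $\mu_{t_1,t_3}$ on any set $B$.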
The convolution of two Gaussian distributions is a third Gaussian, whose mean and variance are the sums of the corresponding parameters, so by this argument we must have $W(t_3) - W(t_1) \sim N(0, t_3 - t_1)$. But this is precisely what we should have, by the Gaussian-increments property. Since the trick we used above to get the transition kernel from the increment distribution can be applied again, we conclude that $\mu_{t_1,t_2}\,\mu_{t_2,t_3} = \mu_{t_1,t_3}$, so the Chapman-Kolmogorov property is satisfied; therefore (Theorem 103), $W(t)$ is a Markov process (with respect to its natural filtration). To see that $W(t)$ has, or can be made to have, continuous sample paths, invoke Theorem 94.

##### 11.2 Probability Densities in the Logistic Map

Let's revisit the first part of Exercise 5.3, from the point of view of what we now know about Markov processes. The exercise asks us to show that the density $\frac{1}{\pi\sqrt{x(1-x)}}$ is invariant under the action of the logistic map with $a = 4$.

Let's write the mapping as $F(x) = 4x(1-x)$. Solving a simple quadratic equation gives us the fact that $F^{-1}(x)$ is the set $\left\{\tfrac{1}{2}\left(1 - \sqrt{1-x}\right),\ \tfrac{1}{2}\left(1 + \sqrt{1-x}\right)\right\}$. Notice, for later use, that the two solutions add up to 1. Notice also that

$$F^{-1}([0, x]) = \left[0,\ \tfrac{1}{2}\left(1 - \sqrt{1-x}\right)\right] \cup \left[\tfrac{1}{2}\left(1 + \sqrt{1-x}\right),\ 1\right].$$

Now we consider $P(X_{n+1} \leq x)$, the cumulative distribution function of $X_{n+1}$:

$$
\begin{aligned}
P(X_{n+1} \leq x) &= P(X_{n+1} \in [0, x]) &(11.5)\\
&= P\left(X_n \in F^{-1}([0, x])\right) &(11.6)\\
&= P\left(X_n \in \left[0, \tfrac{1}{2}\left(1 - \sqrt{1-x}\right)\right] \cup \left[\tfrac{1}{2}\left(1 + \sqrt{1-x}\right), 1\right]\right) &(11.7)\\
&= \int_0^{\frac{1}{2}\left(1 - \sqrt{1-x}\right)} \rho_n(y)\,dy + \int_{\frac{1}{2}\left(1 + \sqrt{1-x}\right)}^{1} \rho_n(y)\,dy &(11.8)
\end{aligned}
$$

where $\rho_n$ is the density of $X_n$. So we have an integral equation for the evolution of the density,

$$\int_0^x \rho_{n+1}(y)\,dy = \int_0^{\frac{1}{2}\left(1 - \sqrt{1-x}\right)} \rho_n(y)\,dy + \int_{\frac{1}{2}\left(1 + \sqrt{1-x}\right)}^{1} \rho_n(y)\,dy \quad (11.9)$$

This sort of integral equation is complicated to solve directly. Instead, take the derivative of both sides with respect to $x$; we can do this through the fundamental theorem of calculus. On the left-hand side, this will just give $\rho_{n+1}(x)$, the density we want.
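The preimage formula for $F$ can be checked numerically. This sketch (my own, not part of the notes) confirms that both branches of $F^{-1}$ invert $F$ on $[0, 1]$ and that the two preimages sum to 1, as noted above:

```python
import numpy as np

def F(x):
    """Logistic map with a = 4."""
    return 4.0 * x * (1.0 - x)

def F_inv(x):
    """The two preimage branches, from solving 4y(1 - y) = x."""
    r = np.sqrt(1.0 - x)
    return 0.5 * (1.0 - r), 0.5 * (1.0 + r)

x = np.linspace(0.0, 1.0, 101)
lo, hi = F_inv(x)
assert np.allclose(F(lo), x)       # lower branch inverts F
assert np.allclose(F(hi), x)       # upper branch inverts F
assert np.allclose(lo + hi, 1.0)   # the two preimages sum to 1
```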
$$
\begin{aligned}
\rho_{n+1}(x) &= \frac{d}{dx} \int_0^{\frac{1}{2}\left(1 - \sqrt{1-x}\right)} \rho_n(y)\,dy + \frac{d}{dx} \int_{\frac{1}{2}\left(1 + \sqrt{1-x}\right)}^{1} \rho_n(y)\,dy &(11.10)\\
&= \rho_n\left(\tfrac{1}{2}\left(1 - \sqrt{1-x}\right)\right) \frac{d}{dx}\,\tfrac{1}{2}\left(1 - \sqrt{1-x}\right) - \rho_n\left(\tfrac{1}{2}\left(1 + \sqrt{1-x}\right)\right) \frac{d}{dx}\,\tfrac{1}{2}\left(1 + \sqrt{1-x}\right) &(11.11)\\
&= \frac{1}{4\sqrt{1-x}} \left[ \rho_n\left(\tfrac{1}{2}\left(1 - \sqrt{1-x}\right)\right) + \rho_n\left(\tfrac{1}{2}\left(1 + \sqrt{1-x}\right)\right) \right] &(11.12)
\end{aligned}
$$

Notice that this defines a linear operator taking densities to densities. (You should verify the linearity.) In fact, this is a Markov operator, by the terms of Definition 113. Markov operators of this sort, derived from deterministic maps, are called Perron-Frobenius or Frobenius-Perron operators, and accordingly denoted by $P$. Thus an invariant density is a $\rho$ such that $\rho = P\rho$. All the problem asks us to do is to verify that $\frac{1}{\pi\sqrt{x(1-x)}}$ is such a solution. Evaluating this density at the lower preimage,

$$
\begin{aligned}
\rho\left(\tfrac{1}{2}\left(1 - \sqrt{1-x}\right)\right) &= \frac{1}{\pi\sqrt{\tfrac{1}{2}\left(1 - \sqrt{1-x}\right)\left(1 - \tfrac{1}{2}\left(1 - \sqrt{1-x}\right)\right)}} &(11.13)\\
&= \frac{1}{\pi\sqrt{\tfrac{1}{2}\left(1 - \sqrt{1-x}\right)\,\tfrac{1}{2}\left(1 + \sqrt{1-x}\right)}} &(11.14)\\
&= \frac{2}{\pi\sqrt{x}} &(11.15)
\end{aligned}
$$

Since $\rho(x) = \rho(1-x)$, the upper preimage gives the same value, and it follows that

$$
\begin{aligned}
P\rho(x) &= \frac{1}{4\sqrt{1-x}} \cdot 2 \cdot \frac{2}{\pi\sqrt{x}} &(11.16)\\
&= \frac{1}{\pi\sqrt{x(1-x)}} &(11.17)\\
&= \rho(x) &(11.18)
\end{aligned}
$$

as desired.

By Lemma 117, for any distribution $\nu$, $\left\|\nu P^n - \rho P^n\right\|$ is a non-increasing function of $n$. However, $\rho P^n = \rho$, so the iterates of any distribution, under the map, approach the invariant distribution monotonically. It would be very handy if we could show that any initial distribution eventually converged on $\rho$, i.e. that $\left\|\nu P^n - \rho\right\| \to 0$. When we come to ergodic theory, we will see conditions under which such distributional convergence holds, as it does for the logistic map, and learn how such convergence in distribution is connected both to pathwise convergence properties and to the decay of correlations.

##### 11.3 Exercises

**Exercise 11.1 (Brownian Motion with Constant Drift)** Consider a process $X(t)$ which, like the Wiener process, has $X(0) = 0$ and independent increments, but where $X(t_2) - X(t_1) \sim N(a(t_2 - t_1), \sigma^2(t_2 - t_1))$. Here $a$ is called the drift rate and $\sigma^2$ the diffusion constant. Show that $X(t)$ is a Markov process, following the argument for the standard Wiener process ($a = 0$, $\sigma^2 = 1$) above. Do such processes have continuous modifications for all (finite) choices of $a$ and $\sigma^2$? If so, prove it; if not, give at least one counter-example.
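The fixed-point computation in equations (11.13)-(11.18) can also be verified numerically. A minimal sketch, with function names of my own choosing and an evaluation grid picked to avoid the endpoint singularities of the density:

```python
import numpy as np

def perron_frobenius(rho, x):
    """Apply the Perron-Frobenius operator of equation (11.12):
    (P rho)(x) = [rho(y-) + rho(y+)] / (4 sqrt(1 - x)),
    where y-, y+ = (1 -/+ sqrt(1 - x)) / 2 are the preimages under F."""
    r = np.sqrt(1.0 - x)
    return (rho(0.5 * (1.0 - r)) + rho(0.5 * (1.0 + r))) / (4.0 * r)

def rho_inv(x):
    """Candidate invariant (arcsine) density 1 / (pi sqrt(x(1-x)))."""
    return 1.0 / (np.pi * np.sqrt(x * (1.0 - x)))

# Grid avoids x = 0 and x = 1, where the density blows up.
x = np.linspace(0.01, 0.99, 99)
assert np.allclose(perron_frobenius(rho_inv, x), rho_inv(x))
```

Applying `perron_frobenius` repeatedly to some other starting density gives a direct look at the monotone approach to $\rho$ discussed above.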
**Exercise 11.2 (Perron-Frobenius Operators)** Verify that the operator $P$ defined in the section on the logistic map above is a Markov operator.
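For Exercise 11.1, one natural candidate construction is $X(t) = at + \sigma W(t)$. The sketch below is not a solution, just a numerical sanity check with arbitrary parameter values: it builds Wiener paths from independent Gaussian increments on a grid, applies the drift-and-scale transform, and confirms that the increments of $X$ have mean $a\,\Delta t$ and variance $\sigma^2\,\Delta t$ as the exercise specifies.

```python
import numpy as np

rng = np.random.default_rng(0)
a, sigma = 1.5, 0.7            # arbitrary drift rate and sqrt of diffusion constant
dt, n_steps, n_paths = 0.01, 100, 50_000

# Standard Wiener paths on a grid, built from independent N(0, dt) increments.
dW = rng.normal(0.0, np.sqrt(dt), (n_paths, n_steps))
W = np.cumsum(dW, axis=1)
t = dt * np.arange(1, n_steps + 1)

# Candidate construction for the exercise: X(t) = a t + sigma W(t).
X = a * t + sigma * W

inc = X[:, -1] - X[:, 49]      # increment over (0.5, 1.0]
span = t[-1] - t[49]
print(np.mean(inc), np.var(inc))  # close to a * span and sigma**2 * span
```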
