1.4 Measures of Dependence

Example 1.12 Mean Function of a Random Walk with Drift
Consider the random walk with drift model given in (1.4),

$$ x_t = \delta t + \sum_{j=1}^{t} w_j, \qquad t = 1, 2, \ldots. $$

Because $E(w_t) = 0$ for all $t$, and $\delta$ is a constant, we have

$$ \mu_{xt} = E(x_t) = \delta t + \sum_{j=1}^{t} E(w_j) = \delta t, $$

which is a straight line with slope $\delta$. A realization of a random walk with drift can be compared to its mean function in Figure 1.9.

Example 1.13 Mean Function of Signal Plus Noise
A great many practical applications depend on assuming the observed data have been generated by a fixed signal waveform superimposed on a zero-mean noise process, leading to an additive signal model of the form (1.5). It is clear, because the signal in (1.5) is a fixed function of time, that we will have

$$ \mu_{xt} = E\left[ 2\cos\left(2\pi \tfrac{t+15}{50}\right) + w_t \right] = 2\cos\left(2\pi \tfrac{t+15}{50}\right) + E(w_t) = 2\cos\left(2\pi \tfrac{t+15}{50}\right), $$

and the mean function is just the cosine wave.

The mean function describes only the marginal behavior of a time series. The lack of independence between two adjacent values $x_s$ and $x_t$ can be assessed numerically, as in classical statistics, using the notions of covariance and correlation. Assuming the variance of $x_t$ is finite, we have the following definition.

Definition 1.2 The autocovariance function is defined as the second moment product

$$ \gamma_x(s,t) = \operatorname{cov}(x_s, x_t) = E[(x_s - \mu_s)(x_t - \mu_t)], \tag{1.8} $$

for all $s$ and $t$. When no possible confusion exists about which time series we are referring to, we will drop the subscript and write $\gamma_x(s,t)$ as $\gamma(s,t)$.

Note that $\gamma_x(s,t) = \gamma_x(t,s)$ for all time points $s$ and $t$. The autocovariance measures the linear dependence between two points on the same series observed at different times. Recall from classical statistics that if $\gamma_x(s,t) = 0$, then $x_s$ and $x_t$ are not linearly related, but there still may be some dependence structure between them. If, however, $x_s$ and $x_t$ are bivariate normal, $\gamma_x(s,t) = 0$ ensures their independence. It is clear that, for $s = t$, the autocovariance reduces to the (assumed finite) variance, because

$$ \gamma_x(t,t) = E[(x_t - \mu_t)^2] = \operatorname{var}(x_t). \tag{1.9} $$
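The mean function $\mu_{xt} = \delta t$ of Example 1.12 can be checked by Monte Carlo: averaging many independent realizations of the random walk with drift at each time $t$ should reproduce the straight line of slope $\delta$. The sketch below is illustrative only; the parameter choices ($\delta = 0.2$, unit-variance Gaussian noise, sample sizes) are my own assumptions, not values from the text.

```python
import numpy as np

# Monte Carlo check of mu_xt = delta * t for the random walk with drift,
# x_t = delta*t + sum_{j=1}^t w_j.  Parameters are illustrative choices.
rng = np.random.default_rng(0)
delta, n, reps = 0.2, 100, 5000

t = np.arange(1, n + 1)
w = rng.normal(0.0, 1.0, size=(reps, n))   # white noise w_t, one row per realization
x = delta * t + np.cumsum(w, axis=1)       # x_t = delta*t + cumulative sum of noise

mu_hat = x.mean(axis=0)                    # sample mean across realizations
mu_true = delta * t                        # the straight line with slope delta
print(np.max(np.abs(mu_hat - mu_true)))    # small; shrinks as reps grows
```

Averaging over realizations (down the columns) estimates $E(x_t)$ separately for each $t$; the noise terms average out and only the drift line remains.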
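Definition 1.2 can likewise be illustrated numerically: estimating $E[(x_s - \mu_s)(x_t - \mu_t)]$ across many independent realizations gives a matrix of sample autocovariances, which should exhibit the symmetry $\gamma_x(s,t) = \gamma_x(t,s)$ and satisfy $\gamma_x(t,t) = \operatorname{var}(x_t)$ as in (1.9). This is a minimal sketch assuming the same random walk with drift as above, with illustrative parameters not taken from the text.

```python
import numpy as np

# Sample autocovariance gamma_x(s, t) per Definition 1.2, estimated by
# averaging over independent realizations of the random walk with drift.
rng = np.random.default_rng(1)
delta, n, reps = 0.2, 50, 20000

t = np.arange(1, n + 1)
w = rng.normal(0.0, 1.0, size=(reps, n))
x = delta * t + np.cumsum(w, axis=1)

dev = x - x.mean(axis=0)          # (x_t - mu_t) for each realization
gamma = dev.T @ dev / reps        # gamma[s, t] estimates E[(x_s-mu_s)(x_t-mu_t)]

print(np.allclose(gamma, gamma.T))                      # symmetry gamma(s,t) = gamma(t,s)
print(np.allclose(np.diag(gamma), x.var(axis=0)))       # diagonal equals var(x_t), as in (1.9)
```

The symmetry holds by construction here, and the diagonal of the estimated matrix coincides with the per-time sample variances, mirroring the reduction of the autocovariance to the variance at $s = t$.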