Lecture 6: Multiple linear regression
© Christopher S. Bretherton
Winter 2014
A natural extension is to regress a predictand y on multiple predictor variables x_m. Assuming all variables have been de-meaned, we fit the linear
model:
y = a_1 x_1 + a_2 x_2 + . . .
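The lecture's scripts are in Matlab; as an illustrative sketch (with invented data values, not the lecture's example), the model above can be fit by least squares in NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic de-meaned data: y depends on two predictors plus noise
# (illustrative values, not from the lecture).
N = 200
x1 = rng.standard_normal(N)
x2 = rng.standard_normal(N)
y = 2.0 * x1 - 1.5 * x2 + 0.1 * rng.standard_normal(N)

# Stack the predictors into an N x 2 matrix X; least squares solves
# min ||y - X a||^2 for the coefficient vector a = (a_1, a_2).
X = np.column_stack([x1, x2])
a, *_ = np.linalg.lstsq(X, y, rcond=None)
```

With weak noise the recovered coefficients land close to the (2.0, -1.5) used to generate the data.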
Lecture 15: Spectral Filtering
© Christopher S. Bretherton
Winter 2014
Refs: Matlab Signal Processing Toolbox help; Hartmann notes, Chapter 7.
15.1
Introduction
Filtering a time series means removal of the spectral power at some chosen
frequencies while r
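As a minimal sketch of this idea (in NumPy rather than the Matlab toolbox the notes reference, with an arbitrary cutoff harmonic), a low-pass filter can be built by zeroing DFT coefficients above the cutoff and inverting:

```python
import numpy as np

# Low-pass filtering by zeroing DFT coefficients above a cutoff frequency.
N = 256
t = np.arange(N)
u = np.sin(2 * np.pi * 3 * t / N) + 0.5 * np.sin(2 * np.pi * 40 * t / N)

U = np.fft.fft(u)
freqs = np.fft.fftfreq(N)            # cycles per sample, positive and negative
cutoff = 10 / N                      # keep harmonics 0..10, remove the rest
U_filt = np.where(np.abs(freqs) <= cutoff, U, 0.0)
u_filt = np.fft.ifft(U_filt).real    # filtered series; imaginary part ~ 0
```

The harmonic-3 component passes unchanged while the harmonic-40 component is removed entirely, since each sinusoid occupies a single pair of DFT bins.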
Lecture 13: Windowed Spectral Analysis on Nino
SSTA
© Christopher S. Bretherton
Winter 2014
13.1
Implementing the pieces in Matlab
Script nino3.m applies a set of Hann windows of length Nw = 240 (20 years)
to the SSTA dataset and calculates the DFT power
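nino3.m and the SSTA dataset are not reproduced here, but the windowing-and-power pipeline can be sketched on synthetic monthly data (a 4-year-period oscillation plus noise standing in for ENSO-like variability; all values illustrative):

```python
import numpy as np

# Hann-windowed DFT power on a synthetic monthly series.
rng = np.random.default_rng(1)
Nw = 240                               # window length: 240 months = 20 years
t = np.arange(720)                     # 60 years of monthly data
ssta = np.sin(2 * np.pi * t / 48) + 0.3 * rng.standard_normal(t.size)

hann = 0.5 * (1 - np.cos(2 * np.pi * np.arange(Nw) / Nw))  # Hann taper
powers = []
for start in range(0, ssta.size - Nw + 1, Nw):   # non-overlapping windows
    seg = ssta[start:start + Nw] * hann
    powers.append(np.abs(np.fft.rfft(seg)) ** 2)
power = np.mean(powers, axis=0)        # averaged windowed power spectrum

peak_harmonic = int(np.argmax(power[1:])) + 1    # skip the mean (k = 0)
```

The 48-month oscillation completes 5 cycles per 240-month window, so the averaged spectrum peaks at harmonic 5.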
Lecture 12: Windowing and Tapering for Robust
Spectral Estimation
© Christopher S. Bretherton
Winter 2014
References:
Hartmann Atm S 552 notes, Chapter 6.1-2.
Percival, D. B., and A. T. Walden: Spectral Analysis for Physical Applications, Cambridge Univ. Press.
Lecture 10: Lagged autocovariance and
correlation
© Christopher S. Bretherton
Winter 2014
Reference: Hartmann Atm S 552 notes, Chapter 6.1-2.
10.1
Lagged autocovariance and autocorrelation
The lag-p autocovariance is defined

a_p = (1/N) Σ_{j=1}^{N-p} u_j u_{j+p},    (10.1.1)
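Definition (10.1.1) translates directly into NumPy (assuming the biased divide-by-N convention); a sanity check on de-meaned white noise:

```python
import numpy as np

def lag_autocovariance(u, p):
    """Lag-p autocovariance a_p = sum_j u_j u_{j+p} / N of a de-meaned
    series u, using the biased (divide-by-N) convention of (10.1.1)."""
    u = np.asarray(u, dtype=float)
    N = u.size
    return np.sum(u[:N - p] * u[p:]) / N

rng = np.random.default_rng(2)
u = rng.standard_normal(10_000)
u -= u.mean()                  # de-mean, as the lecture assumes
a0 = lag_autocovariance(u, 0)  # lag 0 recovers the sample variance
a5 = lag_autocovariance(u, 5)  # white noise: near zero at nonzero lag
```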
Lecture 8: Properties of the DFT
© Christopher S. Bretherton
Winter 2014
8.1
Aliasing
Because the DFT is based on sampling a continuous function at a finite set
of equally-spaced points jΔt, many different L-periodic functions can have the
same DFT. In fact,
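The aliasing is easy to check numerically: harmonics k and k + N take identical values at every grid point, so their DFTs agree exactly (k = 3 and N = 16 here are arbitrary choices):

```python
import numpy as np

# Aliasing: harmonics k and k + N agree at every sample point, so the two
# distinct continuous functions have the same DFT.
N = 16
j = np.arange(N)
k = 3
u_low = np.cos(2 * np.pi * k * j / N)          # harmonic k = 3
u_high = np.cos(2 * np.pi * (k + N) * j / N)   # harmonic k + N = 19
```

The extra N cycles per period contribute a phase of exactly 2πj at each sample index j, which is invisible.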
Lecture 9: DFT data analysis, power spectrum
© Christopher S. Bretherton
Winter 2014
9.1
Application of DFT to a Gaussian function
Before diving into use of the DFT for data analysis, we consider the DFT of a
familiar function, the Gaussian
u(x) = exp(-x^2
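Since the continuous Fourier transform of exp(-x^2) is sqrt(π) exp(-k^2/4), the DFT of a well-resolved sampled Gaussian should match that shape up to a factor of 1/Δx; a NumPy check (domain length and resolution are illustrative choices):

```python
import numpy as np

# The Fourier transform of a Gaussian is again a Gaussian; the DFT of a
# well-sampled Gaussian inherits this shape.
N = 256
L = 20.0
x = (np.arange(N) - N // 2) * (L / N)      # grid centered on 0
u = np.exp(-x**2)

U = np.abs(np.fft.fft(u))                  # magnitude spectrum
# Continuous FT of exp(-x^2) is sqrt(pi) exp(-k^2/4); the DFT approximates
# it at wavenumbers k_m = 2*pi*m/L, scaled by 1/dx.
dx = L / N
m = np.arange(1, 6)
k = 2 * np.pi * m / L
predicted = np.sqrt(np.pi) * np.exp(-k**2 / 4) / dx
```

Because the Gaussian decays rapidly in both x and k, the discrete sum approximates the continuous integral to near machine precision here.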
Lecture 11: White and red noise
© Christopher S. Bretherton
Winter 2014
Reference: Hartmann Atm S 552 notes, Chapter 6.1-2.
11.1
White noise
A common way to statistically assess the significance of a broad spectral peak
as in the Nino3.4 example is to compa
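A minimal numerical illustration of the white-noise benchmark (synthetic data, not the Nino3.4 series): the expected spectrum is flat, but individual harmonic powers scatter widely about the mean, which is what any significance test must account for.

```python
import numpy as np

# White noise has a flat expected power spectrum with large scatter per bin.
rng = np.random.default_rng(3)
N = 4096
u = rng.standard_normal(N)

power = np.abs(np.fft.rfft(u)[1:N // 2]) ** 2  # drop the mean and Nyquist bins
mean_power = power.mean()                      # ~ N for unit-variance noise
# Each harmonic's power is ~ chi-squared with 2 degrees of freedom, so the
# scatter across harmonics is comparable to the mean itself.
ratio_spread = power.std() / mean_power
```

An individual harmonic well above the flat background is therefore unremarkable; a broad peak spanning many harmonics is what carries statistical weight.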
Lecture 7: The Complex Fourier Transform and
the Discrete Fourier Transform (DFT)
© Christopher S. Bretherton
Winter 2014
7.1
Fourier analysis and filtering
Many data analysis problems involve characterizing data sampled on a regular
grid of points, e. g. a
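As a concrete example of the DFT on a regular grid: a single complex Fourier mode transforms to a single nonzero harmonic (an orthogonality check, sketched in NumPy):

```python
import numpy as np

# The DFT of the complex mode exp(2*pi*i*k*n/N) sampled on a regular
# N-point grid is nonzero only at harmonic k, with value N.
N = 32
n = np.arange(N)
k = 5
u = np.exp(2j * np.pi * k * n / N)   # single complex Fourier mode
U = np.fft.fft(u)                    # N at index k, zero elsewhere
```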
Lecture 3: Statistical sampling uncertainty
© Christopher S. Bretherton
Winter 2014
3.1
Central limit theorem (CLT)
Let X_1, . . . , X_N be a sequence of N independent identically-distributed (IID)
random variables each with mean μ and variance σ^2. Then
X_1 + X_2 + . . .
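The CLT is easy to check by simulation (uniform variables are chosen purely for illustration; any IID distribution with finite variance behaves the same way):

```python
import numpy as np

# CLT sketch: sums of N IID uniform variables, standardized by the mean and
# standard deviation of the sum, approach the standard normal distribution.
rng = np.random.default_rng(4)
N, trials = 50, 100_000
X = rng.uniform(0.0, 1.0, size=(trials, N))   # mean 1/2, variance 1/12

S = X.sum(axis=1)
Z = (S - N * 0.5) / np.sqrt(N / 12.0)          # standardized sums

# Fraction within one standard deviation should approach the standard
# normal value of about 0.6827.
frac_within_1sd = np.mean(np.abs(Z) < 1.0)
```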
Lecture 5: Linear regression with one predictor
© Christopher S. Bretherton
Winter 2014
Ref: Hartmann Ch. 3
Suppose we have N measurements of one predictor variable x_j (e.g. car
weight) and corresponding measurements of a predictand variable y_j (e.g. mi
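For de-meaned data the least-squares slope is a = Σ x_j y_j / Σ x_j², i.e. cov(x, y)/var(x); a NumPy sketch with invented car-weight and mileage numbers (not data from the lecture):

```python
import numpy as np

# One-predictor least squares on de-meaned data: slope a = sum(x*y)/sum(x*x).
# The weight/mileage numbers below are invented for illustration.
weight = np.array([2.0, 2.5, 3.0, 3.5, 4.0, 4.5])   # tons (hypothetical)
mpg = np.array([34.0, 30.0, 27.0, 24.0, 22.0, 19.0])

x = weight - weight.mean()
y = mpg - mpg.mean()
a = np.sum(x * y) / np.sum(x * x)     # slope: mpg lost per added ton
y_fit = a * x                         # fitted (de-meaned) predictand
```

The slope comes out negative, as expected: heavier cars get fewer miles per gallon.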
Lecture 2: Probability and Statistics (continued)
© Christopher S. Bretherton
Winter 2014
2.1
Expectation and moments
Expectation of a function g(X) of a RV X is

E[g(X)] = Σ_{x: p(x)>0} g(x) p(x)    (discrete RV X)

E[g(X)] = ∫ g(x) f(x) dx    (continuous RV X)
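For a discrete RV the sum can be evaluated directly; as an illustration (a fair six-sided die, not an example from the lecture):

```python
import numpy as np

# Expectation of g(X) for a discrete RV: sum g(x) p(x) over outcomes with
# p(x) > 0.  Example: a fair six-sided die.
x = np.arange(1, 7)
p = np.full(6, 1.0 / 6.0)

E_X = np.sum(x * p)          # first moment (mean): 3.5
E_X2 = np.sum(x**2 * p)      # second moment: 91/6
var_X = E_X2 - E_X**2        # variance: 35/12
```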
Lecture 4: Statistical significance
© Christopher S. Bretherton
Winter 2014
For many purposes, it suffices to estimate uncertainty in a data analysis.
However, if some signal is marginally detected, when should we decide that it
is significant? For instance, su
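One minimal version of such a decision (a CLT-based z-test on invented data, not necessarily the lecture's example): compare a sample mean to its sampling uncertainty and convert the result to a p-value.

```python
import numpy as np
from math import erf, sqrt

def two_sided_p(z):
    """Two-sided p-value for a z statistic under the standard normal null."""
    return 2.0 * (1.0 - 0.5 * (1.0 + erf(abs(z) / sqrt(2.0))))

# Is this sample's mean distinguishable from zero at the 5% level?
rng = np.random.default_rng(5)
x = 1.0 + rng.standard_normal(100)     # true mean 1, unit variance (invented)
z = x.mean() / (x.std(ddof=1) / np.sqrt(x.size))
significant = two_sided_p(z) < 0.05    # reject the null of zero mean?
```

With a true mean of one standard deviation and N = 100, the z statistic is near 10 and the null is rejected decisively; the marginal cases the lecture has in mind sit near z = 2.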
Amath 482/582 Lecture 1
Course Introduction and Review of Probability
© Christopher S. Bretherton
Winter 2014
1.1
Introduction
Course goal: Students will learn a computational toolbox to analyze and explore
large datasets coming from diverse applications,
Lecture 14: Frequency-Time Analysis of Sounds
using a Windowed Fourier Method
© Christopher S. Bretherton
Winter 2014
14.1
Character of music and speech
Speech and music involve producing a certain sound (set of frequencies in a
given proportion) for some
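This is what a windowed Fourier (spectrogram) analysis captures; a sketch on a synthetic two-note "melody" (all parameters are illustrative choices, not the lecture's sound files):

```python
import numpy as np

# Windowed-Fourier sketch: a tone that changes pitch halfway through.
# Short-window FFTs recover a different dominant frequency in each half.
fs = 8000                       # samples per second
t = np.arange(fs) / fs          # 1 second of "sound"
tone = np.where(t < 0.5,
                np.sin(2 * np.pi * 440 * t),    # A4 for the first half
                np.sin(2 * np.pi * 660 * t))    # E5 for the second half

Nw = 1024                       # window length: 128 ms

def dominant_freq(segment):
    """Frequency (Hz) of the largest rfft power in a Hann-windowed segment."""
    n = np.arange(segment.size)
    hann = 0.5 * (1 - np.cos(2 * np.pi * n / segment.size))
    power = np.abs(np.fft.rfft(segment * hann)) ** 2
    return np.fft.rfftfreq(segment.size, d=1 / fs)[np.argmax(power)]

f_first = dominant_freq(tone[:Nw])      # window inside the first half
f_second = dominant_freq(tone[-Nw:])    # window inside the second half
```

Each window localizes the analysis in time at the cost of frequency resolution fs/Nw (here about 8 Hz), the basic trade-off of frequency-time analysis.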