STAT 600  University Of Michigan Study Resources

Fitcomplexity
School: University Of Michigan
[Figure: scatter plot of Y against X; no further caption recovered.]

Figcov_int_slope
[Figure: plot of Y against X with regions labeled "Underestimate slope, overestimate intercept" and "Overestimate slope, underestimate intercept".]

Diagnostics3
[Figure: scatter plot of studentized residuals against X2.]

Ps01
Statistics 600 Problem Set 1, due in class on September 25. 1. Prove that the "horizontal residuals" sum to zero based on a least squares fit of Y (as the dependent variable) on X (as the independent variable). The horizontal residuals are the lengths
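The claim in this problem can be checked numerically. The sketch below is my own illustration (the data are simulated, not from the course): it fits Y on X by least squares and verifies that the horizontal residuals X_i - (Y_i - a)/b sum to zero, since each one equals -(vertical residual)/b and vertical residuals sum to zero when an intercept is included.

```python
import numpy as np

# Simulate data with a known linear relationship.
rng = np.random.default_rng(0)
n = 50
x = rng.normal(size=n)
y = 1.5 + 2.0 * x + rng.normal(size=n)

# Least squares fit of Y on X: polyfit returns (slope, intercept).
b, a = np.polyfit(x, y, 1)

# Horizontal residual: distance from X_i to the point on the fitted
# line having height Y_i, i.e. X_i - (Y_i - a)/b.
horizontal = x - (y - a) / b

print(abs(horizontal.sum()))  # numerically zero
```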

Exam1
Statistics 600 Exam 1, October 14, 2008. 1. (a) Suppose we fit a simple linear regression between Y and X. Our goal is to estimate the coefficient \beta in the relationship E(Y|X) = \alpha + \beta X with the least possible variance, where cov(Y|X) = \sigma^2 I can be assumed to

Prediction
Prediction. Ridge regression uses the minimizer of a penalized squared error loss function to estimate the regression coefficients: \hat{\beta} = \mathrm{argmin}_\beta \|Y - X\beta\|^2 + \lambda\,\beta' D \beta. Typically D is a diagonal matrix with 0 in the 1,1 position and ones on the rest of the di
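A minimal sketch of the estimator this snippet describes, assuming the penalty takes the usual form \lambda \beta' D \beta with D as stated (the data, coefficients, and \lambda below are invented for illustration). The minimizer has the closed form (X'X + \lambda D)^{-1} X'Y:

```python
import numpy as np

# Simulated design with an intercept column and three covariates.
rng = np.random.default_rng(1)
n, p = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])
Y = X @ np.array([1.0, 2.0, -1.0, 0.5]) + rng.normal(size=n)

lam = 5.0
D = np.eye(p + 1)
D[0, 0] = 0.0  # zero in the (1,1) position: the intercept is not penalized

# Closed-form ridge solution and, for comparison, ordinary least squares.
beta_ridge = np.linalg.solve(X.T @ X + lam * D, X.T @ Y)
beta_ols = np.linalg.solve(X.T @ X, X.T @ Y)

print(beta_ridge)
print(beta_ols)
```

By construction, beta_ridge attains a penalized loss no larger than any other coefficient vector, including the OLS solution.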

Decomposingvariance
Decomposing variance. Pearson correlation. The population Pearson correlation coefficient of two jointly distributed random variables X and Y is \rho_{XY} = \mathrm{cov}(X, Y)/(\sigma_X \sigma_Y). It is estimated by \hat{\rho}_{XY} = \widehat{\mathrm{cov}}(X, Y)/(\hat\sigma_X \hat\sigma_Y) = \sum_i (X_i - \bar X)(Y_i - \bar Y) \big/ \sqrt{\sum_i (X_i - \bar X)^2 \sum_i (Y_i - \bar Y)^2}.
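The sample coefficient can be sketched directly from its definition (illustration only; the data below are simulated, and the function name is my own):

```python
import numpy as np

def pearson_r(x, y):
    """Sample Pearson correlation: centered cross products over the
    product of root sums of squares."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xc, yc = x - x.mean(), y - y.mean()
    return (xc * yc).sum() / np.sqrt((xc ** 2).sum() * (yc ** 2).sum())

rng = np.random.default_rng(2)
x = rng.normal(size=200)
y = 0.6 * x + rng.normal(size=200)
print(pearson_r(x, y))  # agrees with np.corrcoef(x, y)[0, 1]
```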

Ps03
Statistics 600 Problem Set 3, due in class on Tuesday, October 23rd. 1. Suppose we have a bivariate regression model Y = \alpha + \beta X + \gamma Z + \epsilon, where the usual assumptions E(\epsilon|X, Z) = 0 and cov(\epsilon|X, Z) = \sigma^2 I hold. In addition, for simplicity, assume that EX = EZ = 0

Ps07
Statistics 600 Problem Set 7 Due in class on November 14th 1. What is the relationship between measurement error and confounding? In the notation of the confounding course notes, let Z be the true covariate level, and let X be the covariate as observ

Ps02
Statistics 600 Problem Set 2, due in class on September 26. 1. In this problem you will use simulation to better understand the sampling variation of \hat{\beta} when two covariates are strongly dependent. Suppose we can partition the design matrix as X = [1 U V

Syllabus
Statistics 600: Applied statistics and data analysis I. Instructor: Kerby Shedden. Office: 461 West Hall. Email: kshedden@umich.edu. Office hours: Monday 3-4, Wednesday 11-12. Course web page: www.stat.lsa.umich.edu/kshedden/Courses/Stat600. Course description:

Leastsquares
Least Squares Fitting and Inference. Kerby Shedden, August 2008. Definitions and Motivation. Independent variables (predictors, regressors, covariates): X = (X_1, . . . , X_p). Dependent variable (response, outcome): Y. The goal is to learn about an unkno

Ps02
Statistics 600 Problem set 2, due in class on October 2nd. 1. Suppose A and B are symmetric matrices, and A, B, and A + B are all idempotent. Show that AB = 0. Hint: use the Schur decomposition for a symmetric matrix S, S = V D V', where D is diagonal a
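One way to see the conclusion, sketched as a direct algebraic argument (it happens not to use the Schur-decomposition hint, so treat it as an alternative route rather than the intended solution):

```latex
% Expanding idempotence of A + B, using A^2 = A and B^2 = B:
(A+B)^2 = A + B
\;\Longrightarrow\; A + AB + BA + B = A + B
\;\Longrightarrow\; AB + BA = 0.

% Multiply AB + BA = 0 on the left by A:  AB + ABA = 0.
% Multiply AB + BA = 0 on the right by A: ABA + BA = 0.
% Subtracting the two gives AB - BA = 0, so AB = BA.
% Combined with AB + BA = 0, this yields 2AB = 0, hence AB = 0.
```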