• 1 Page fit-complexity
    Fit-complexity

    School: University Of Michigan

    [Figure: scatter plot of Y against X, with X ranging roughly from -2 to 2.]

  • 1 Page fig-cov_int_slope
    Fig-cov_int_slope

    School: University Of Michigan

    [Figure: Y plotted against X with fitted lines illustrating the negative covariance of the intercept and slope estimates; labels: "Underestimate slope, overestimate intercept" and "Overestimate slope, underestimated intercept".]

  • 1 Page fig-convexity
    Fig-convexity

    School: University Of Michigan

    [Figure: convexity of Q; between v (λ=1) and w (λ=0), the chord value λQ(v) + (1-λ)Q(w) lies above the function value Q(λv + (1-λ)w).]

  • 1 Page diagnostics-3
    Diagnostics-3

    School: University Of Michigan

    [Figure: scatter plot of X2 (roughly -1.5 to 1.5) against studentized residuals (roughly -2 to 2).]

  • 4 Pages ps01
    Ps01

    School: University Of Michigan

    Statistics 600 Problem Set 1 Due in class on September 25 1. Prove that the "horizontal residuals" sum to zero based on a least squares fit of Y (as the dependent variable) on X (as the independent variable). The horizontal residuals are the lengths

  • 5 Pages exam1
    Exam1

    School: University Of Michigan

    Statistics 600 Exam 1 October 14, 2008 1. (a) Suppose we fit a simple linear regression between Y and X. Our goal is to estimate the coefficient β in the relationship E(Y|X) = α + βX with the least possible variance, where cov(Y|X) = σ²I can be assumed to

  • 13 Pages prediction
    Prediction

    School: University Of Michigan

    Prediction Ridge regression uses the minimizer of a penalized squared error loss function to estimate the regression coefficients: β̂ = argmin_β ‖Y − Xβ‖² + λβ′Dβ. Typically D is a diagonal matrix with 0 in the 1,1 position and ones on the rest of the di
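The penalized criterion in this preview has the closed-form solution β̂ = (X′X + λD)⁻¹X′Y. A minimal sketch (not from the notes; the data and function name `ridge_fit` are hypothetical), with D chosen as described so the intercept is not penalized:

```python
import numpy as np

def ridge_fit(X, Y, lam):
    """Minimize ||Y - Xb||^2 + lam * b'Db, where D is diagonal
    with 0 in the (1,1) position (intercept unpenalized) and
    ones elsewhere. Closed form: b = (X'X + lam*D)^{-1} X'Y."""
    p = X.shape[1]
    D = np.eye(p)
    D[0, 0] = 0.0  # do not shrink the intercept
    return np.linalg.solve(X.T @ X + lam * D, X.T @ Y)

# Hypothetical simulated data: intercept plus two covariates
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.normal(size=(50, 2))])
Y = X @ np.array([1.0, 2.0, -1.0]) + rng.normal(size=50)
b_ridge = ridge_fit(X, Y, lam=5.0)
```

At λ = 0 the penalty vanishes and the estimate reduces to ordinary least squares; as λ grows, the slope coefficients shrink toward zero while the intercept is left free.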

  • 39 Pages decomposing-variance
    Decomposing-variance

    School: University Of Michigan

    Decomposing variance Pearson correlation The population Pearson correlation coefficient of two jointly distributed random variables X and Y is ρ_XY = cov(X, Y) / (σ_X σ_Y). It is estimated by ρ̂_XY = ĉov(X, Y) / (σ̂_X σ̂_Y) = Σᵢ(Xᵢ − X̄)(Yᵢ − Ȳ) / √(Σᵢ(Xᵢ − X̄)² · Σᵢ(Yᵢ − Ȳ)²).
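The sample estimator above can be sketched directly from its formula (a minimal illustration, not from the notes; the function name and data are hypothetical):

```python
import numpy as np

def pearson_r(x, y):
    """Sample Pearson correlation:
    sum (x_i - xbar)(y_i - ybar) /
    sqrt( sum (x_i - xbar)^2 * sum (y_i - ybar)^2 )."""
    xc = x - x.mean()   # centered x
    yc = y - y.mean()   # centered y
    return (xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc))

# Hypothetical data
x = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
y = np.array([1.1, 1.9, 3.2, 3.8, 5.1, 5.9])
r = pearson_r(x, y)
```

Note the sample-size factors in the numerator and denominator cancel, so the estimator is the same whether one divides the sums by n or n − 1.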

  • 7 Pages ps03
    Ps03

    School: University Of Michigan

    Statistics 600 Problem Set 3 Due in class on Tuesday, October 23rd 1. Suppose we have a bivariate regression model Y = α + βX + γZ + ε where the usual assumptions E(ε|X, Z) = 0 and cov(ε|X, Z) = σ²I hold. In addition, for simplicity, assume that EX = EZ = 0

  • 10 Pages ps07
    Ps07

    School: University Of Michigan

    Statistics 600 Problem Set 7 Due in class on November 14th 1. What is the relationship between measurement error and confounding? In the notation of the confounding course notes, let Z be the true covariate level, and let X be the covariate as observ

  • 9 Pages ps02
    Ps02

    School: University Of Michigan

    Statistics 600 Problem Set 2 Due in class on September 26 1. In this problem you will use simulation to better understand the sampling variation of β̂ when two covariates are strongly dependent. Suppose we can partition the design matrix as X = [1 U V

  • 1 Page syllabus
    Syllabus

    School: University Of Michigan

    Statistics 600 Applied statistics and data analysis I Instructor: Kerby Shedden Office: 461 West Hall E-mail: kshedden@umich.edu Office hours: Monday 3-4, Wednesday 11-12 Course web page: www.stat.lsa.umich.edu/kshedden/Courses/Stat600 Course description:

  • 132 Pages least-squares
    Least-squares

    School: University Of Michigan

    Least Squares Fitting and Inference Kerby Shedden August, 2008 Definitions and Motivation Independent variables (predictors, regressors, covariates): X = (X1, . . . , Xp) Dependent variable (response, outcome): Y The goal is to learn about an unkno
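The setup in this preview, p predictors and a response Y, leads to the ordinary least squares fit. A minimal sketch (not from the notes; the simulated data are hypothetical) showing the fit and the defining property that the residuals are orthogonal to the columns of the design matrix:

```python
import numpy as np

# Hypothetical simulated data: intercept plus two predictors
rng = np.random.default_rng(1)
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
beta = np.array([0.5, 1.5, -2.0])
Y = X @ beta + rng.normal(size=n)

# OLS fit: minimize ||Y - Xb||^2
beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
resid = Y - X @ beta_hat

# The normal equations X'(Y - X beta_hat) = 0 imply the residuals
# are orthogonal to every column of X (so they sum to zero when
# an intercept column is present).
```

This orthogonality is the algebraic core of least squares: the fitted values X β̂ are the projection of Y onto the column space of X.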

  • 11 Pages ps02
    Ps02

    School: University Of Michigan

    Statistics 600 Problem set 2 Due in class on October 2nd. 1. Suppose A and B are symmetric matrices, and A, B, and A + B are all idempotent. Show that AB = 0. Hint: use the Schur decomposition for a symmetric matrix S, S = V DV′, where D is diagonal a
