Chapter 2 Notes



[Figure: histograms of residuals for Model 1 and Model 2 (x-axes r1 and r2, y-axes Frequency).]

General principles (sec 2.3)

Response variable Y and predictors X1, X2, ..., Xp.

Model building:
- Specify a (parametric) probability distribution for Y (Normal, Poisson, etc.).
- "Link" E(Y) to the predictors X1, X2, ..., Xp:

      g(E(Y)) = β0 + β1 X1 + ... + βp Xp,

  i.e. a function of the mean of Y equals a "linear component" in the predictors.

Parameter estimation: MLE, least squares, Bayes.

Model checking: examine the model residuals.

In linear regression we use standardized residuals

      r_i = (Y_i − Ŷ_i) / σ̂,

where Ŷ_i is a fitted value and σ̂ estimates the error SD.

For counts Y_i ∼ Poisson(θ), i = 1, 2, ..., n, the analogous residual is

      r_i = (Y_i − θ̂) / √θ̂,

the square root of one term's contribution to the Pearson goodness-of-fit statistic

      Σ_i (O_i − e_i)² / e_i,

where O_i is an observed value and e_i an expected value.

Exponential family of distributions.
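To make the model-building recipe above concrete, here is a minimal sketch (not part of the original notes) of fitting a Poisson GLM with the default log link in Python using statsmodels; the simulated data and coefficient values are placeholders chosen for illustration.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical data: n observations on p predictors (illustrative only).
rng = np.random.default_rng(0)
n, p = 100, 2
X = rng.normal(size=(n, p))                    # predictors X1, X2
eta = 0.5 + 0.3 * X[:, 0] - 0.2 * X[:, 1]      # linear component beta0 + beta1*X1 + beta2*X2
y = rng.poisson(np.exp(eta))                   # Poisson response; log link, so E(Y) = exp(eta)

# Step 1: specify the distribution of Y (here Poisson; log is its default link).
# Step 2: "link" E(Y) to the predictors through the linear component.
model = sm.GLM(y, sm.add_constant(X), family=sm.families.Poisson())

# Step 3: parameter estimation by maximum likelihood.
result = model.fit()
print(result.params)                           # estimates of beta0, beta1, beta2
```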
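The standardized residuals from the linear-regression formula above can be computed directly. This is a small sketch under my own naming conventions (the notes do not give code); σ̂ is estimated from the residual sum of squares with the residual degrees of freedom.

```python
import numpy as np

def standardized_residuals(y, y_hat, n_params):
    """r_i = (Y_i - Yhat_i) / sigma_hat, where sigma_hat estimates the error SD."""
    resid = y - y_hat
    sigma_hat = np.sqrt(np.sum(resid ** 2) / (len(y) - n_params))
    return resid / sigma_hat

# Toy usage with made-up observations and fitted values.
y     = np.array([2.0, 3.1, 4.9, 6.2])
y_hat = np.array([2.2, 3.0, 5.1, 6.0])
print(standardized_residuals(y, y_hat, n_params=2))
```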

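Similarly, a sketch of the Poisson residuals and the Pearson goodness-of-fit statistic from the last two formulas; taking θ̂ to be the sample mean (the MLE under a common Poisson mean) is an assumption made here for illustration.

```python
import numpy as np

y = np.array([3, 1, 4, 2, 5, 0, 2, 3])       # hypothetical Poisson counts

theta_hat = y.mean()                          # MLE of theta under Y_i ~ Poisson(theta)

# Poisson (Pearson) residuals: r_i = (Y_i - theta_hat) / sqrt(theta_hat)
r = (y - theta_hat) / np.sqrt(theta_hat)

# Pearson goodness-of-fit statistic: sum_i (O_i - e_i)^2 / e_i,
# with observed O_i = Y_i and expected e_i = theta_hat.
pearson_chi2 = np.sum((y - theta_hat) ** 2 / theta_hat)

print(r)
print(pearson_chi2)                           # equals the sum of squared Pearson residuals
```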