Linear Regression Models: Review

The response (dependent) variable Y is continuous. The explanatory variables X1, ..., Xp may be continuous, discrete, or categorical. The error terms are Normally distributed.

Simple Linear Regression Model:

Y = β0 + β1 X + ε

where β0 is the intercept and β1 is the slope.

– ε is the random error term, assumed to be i.i.d. Normal with mean 0 and variance σ².
– "Linear" means that the equation is linear in the parameters β0 and β1 (not that Y and X are necessarily linearly related).
– Y is a random variable.
– X is assumed to be fixed and known (no measurement error and not random).

Example of Linear in the Parameters

Let Y = number of eggs laid and X = body weight of a croaker. Hypothesis: larger animals produce more eggs, but the relationship is logarithmic, i.e.

ln(Y) = α + β ln(X) + ε, equivalently Y = e^α X^β e^ε = X^β e^(α + ε)

Note: a log-log model implies that Y is a power function of X and that the error has a multiplicative rather than additive effect.

Parameter Estimation

The parameters β0, β1, and σ² are estimated using least squares to obtain the estimators b0, b1, and MSE, respectively:

\[
b_1 = \frac{\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n}(x_i - \bar{x})^2},
\qquad
b_0 = \bar{y} - b_1 \bar{x},
\]
\[
\mathrm{MSE} = \frac{\mathrm{SSE}}{n - 2} = \frac{\sum_{i=1}^{n}(y_i - \hat{y}_i)^2}{n - 2},
\qquad
\hat{y}_i = b_0 + b_1 x_i .
\]

Assume the ε are normally distributed and X is fixed. Then:

– Y is Normal(β0 + β1 X, σ²)
– The estimators b0 and b1 are also normally distributed.
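The least-squares formulas for b0, b1, and MSE can be sketched directly in plain Python. The data values below are made up purely for illustration; only the formulas come from the notes.

```python
# A minimal sketch of the least-squares estimators from the notes.
# The x, y values are hypothetical illustration data, not real measurements.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(x)
xbar = sum(x) / n
ybar = sum(y) / n

# b1 = sum((xi - xbar)(yi - ybar)) / sum((xi - xbar)^2)
b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
     sum((xi - xbar) ** 2 for xi in x)

# b0 = ybar - b1 * xbar
b0 = ybar - b1 * xbar

# Fitted values yhat_i = b0 + b1 * x_i, then MSE = SSE / (n - 2)
yhat = [b0 + b1 * xi for xi in x]
sse = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))
mse = sse / (n - 2)
```

The divisor n - 2 reflects the two estimated regression parameters, which makes MSE an unbiased estimator of σ².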
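The log-log croaker example can be illustrated the same way: fit ordinary least squares after transforming both variables to the log scale, so the model is linear in α and β even though Y is a power function of X. All data values here are invented for illustration.

```python
import math

# Hypothetical croaker data (invented for illustration):
# weights = body weight X, eggs = number of eggs laid Y.
weights = [200.0, 350.0, 500.0, 650.0, 800.0]
eggs = [11000.0, 21000.0, 32000.0, 41000.0, 52000.0]

# Transform to the log-log scale: ln(Y) = alpha + beta * ln(X) + eps.
lx = [math.log(w) for w in weights]
ly = [math.log(e) for e in eggs]

# Ordinary least squares on the transformed data.
n = len(lx)
mx = sum(lx) / n
my = sum(ly) / n
beta = sum((xi - mx) * (yi - my) for xi, yi in zip(lx, ly)) / \
       sum((xi - mx) ** 2 for xi in lx)
alpha = my - beta * mx

# Back on the original scale the fit is Y = e^alpha * X^beta,
# with the error acting multiplicatively.
def predict(x):
    return math.exp(alpha) * x ** beta
```

Because the error ε is additive on the log scale, exponentiating makes it multiplicative on the original scale, which is exactly the point the notes make about log-log models.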
This note was uploaded on 07/22/2011 for the course STA 4702 taught by Professor Staff during the Spring '08 term at University of Florida.