1 Introduction To Empirical Models

Based on the scatter diagram, it is probably reasonable to assume that the mean of the random variable Y is related to x by the following straight-line relationship:

    E(Y | x) = μ_{Y|x} = β0 + β1 x

where the slope β1 and the intercept β0 of the line are called regression coefficients. The simple linear regression model is given by

    Y = β0 + β1 x + ε

where ε is the random error term. We think of the regression model as an empirical model. Suppose that the mean and variance of ε are 0 and σ², respectively. Then the mean of Y given x is

    E(Y | x) = β0 + β1 x

and the variance of Y given x is

    V(Y | x) = σ²

• The true regression model is a line of mean values, μ_{Y|x} = β0 + β1 x, where β1 can be interpreted as the change in the mean of Y for a unit change in x.
• The variability of Y at a particular value of x is determined by the error variance, σ².
• This implies there is a distribution of Y-values at each x and that the variance of this distribution is the same at each x.

A multiple regression model relates Y to several regressors, for example

    Y = β0 + β1 x1 + β2 x2 + ε

2 Simple Linear Regression
2.1 Least Squares Estimation

• The case of simple linear regression considers a single regressor or predictor x and a dependent or response variable Y.
• Y at each level of x is a random variable, with expected value E(Y | x) = β0 + β1 x.
• We assume that each observation Y can be described by the model Y = β0 + β1 x + ε.
• Suppose that we have n pairs of observations (x1, y1), (x2, y2), …, (xn, yn).
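The key property of the straight-line model, that the mean of Y shifts with x while the spread stays at σ², can be illustrated numerically. The sketch below simulates Y = β0 + β1 x + ε with invented parameter values (β0 = 3, β1 = 2, σ = 1.5 are assumptions for illustration, not values from the notes):

```python
import random

# Hypothetical parameters, chosen only for illustration:
beta0, beta1, sigma = 3.0, 2.0, 1.5

random.seed(42)

def draw_y(x, n=10000):
    """Draw n observations of Y = beta0 + beta1*x + eps, with eps ~ N(0, sigma^2)."""
    return [beta0 + beta1 * x + random.gauss(0.0, sigma) for _ in range(n)]

# At each x, the Y-values are centered at beta0 + beta1*x,
# but the spread (variance) is the same sigma^2 at every x.
for x in (1.0, 4.0):
    ys = draw_y(x)
    mean = sum(ys) / len(ys)
    var = sum((y - mean) ** 2 for y in ys) / (len(ys) - 1)
    print(f"x={x}: sample mean ~= {mean:.2f} (true {beta0 + beta1 * x}), "
          f"sample var ~= {var:.2f} (true {sigma ** 2})")
```

Running this shows the sample mean moving by roughly β1 per unit of x while the sample variance stays near σ² at both x-values, which is exactly the constant-variance assumption stated above.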
• The method of least squares is used to estimate the parameters β0 and β1 by minimizing the sum of the squares of the vertical deviations of the observed points from the fitted line (Figure 6-6).
• Using Equation 6-8, the n observations in the sample can be expressed as

    yi = β0 + β1 xi + εi,    i = 1, 2, …, n

• The sum of the squares of the deviations of the observations from the true regression line is

    L = Σ εi² = Σ (yi − β0 − β1 xi)²    (sums over i = 1, …, n)
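Minimizing L with respect to β0 and β1 gives the standard closed-form least-squares estimates, β̂1 = Sxy / Sxx and β̂0 = ȳ − β̂1 x̄, where Sxx = Σ(xi − x̄)² and Sxy = Σ(xi − x̄)(yi − ȳ). A minimal sketch of these formulas (the function name and the example data are invented for illustration):

```python
def least_squares(xs, ys):
    """Closed-form simple linear regression estimates.

    Returns (beta0_hat, beta1_hat) minimizing
    L = sum_i (y_i - beta0 - beta1 * x_i)^2, via
    beta1_hat = S_xy / S_xx and beta0_hat = ybar - beta1_hat * xbar.
    """
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    sxx = sum((x - xbar) ** 2 for x in xs)          # S_xx
    sxy = sum((x - xbar) * (y - ybar)               # S_xy
              for x, y in zip(xs, ys))
    b1 = sxy / sxx
    b0 = ybar - b1 * xbar
    return b0, b1

# Points lying exactly on y = 1 + 2x are recovered exactly:
b0, b1 = least_squares([0, 1, 2, 3], [1, 3, 5, 7])
print(b0, b1)  # -> 1.0 2.0
```

Because these points fall exactly on a line, the vertical deviations are all zero at the minimizing values, so L = 0 and the fit reproduces the slope and intercept exactly.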
This note was uploaded on 12/25/2010 for the course ALL 0204 taught by Professor 79979 during the Spring '10 term at National Chiao Tung University.