# Chapter 3: The Multiple Regression Model


## Data used in the chapter

We will use a sample of data from the following business for this session. A hamburger chain store, Big Andy's Burger Barn, wants to set its pricing policy for different products and decide how much to spend on advertising. To assess the effect of the price structures and of different levels of advertising expenditure, the firm sets different prices and spends varying amounts on advertising in different cities. Management is interested in two questions: 1) how does sales revenue change as the level of advertising changes? and 2) will reducing prices lead to a decrease in revenue?

Table 1: Observations on Monthly Sales, Price, and Advertising in Big Andy's Burger Barn.

## 3.1 Introduction

### 3.1.1 The General Model

In a general multiple regression model, a dependent variable $y$ is related to a number of explanatory variables $x_2, x_3, \dots, x_K$ through a linear equation that can be written as

$$y = \beta_1 + \beta_2 x_2 + \beta_3 x_3 + \dots + \beta_K x_K + e \qquad (3.1)$$

A single parameter, call it $\beta_k$, measures the effect of a change in the variable $x_k$ upon the expected value of $y$, all other variables held constant:

$$\beta_k = \left.\frac{\Delta E(y)}{\Delta x_k}\right|_{\text{other } x\text{s held constant}} = \frac{\partial E(y)}{\partial x_k}$$

The parameter $\beta_1$ is the intercept term, which we can think of as being attached to a variable $x_1$ that is always equal to 1, that is, $x_1 = 1$. In this chapter, we focus on $K = 3$:

$$y = \beta_1 + \beta_2 x_2 + \beta_3 x_3 + e \qquad (3.2)$$

**The Assumptions of the Model**

- MR1. $y_i = \beta_1 + \beta_2 x_{i2} + \dots + \beta_K x_{iK} + e_i$, for $i = 1, \dots, N$
- MR2. $E(y_i) = \beta_1 + \beta_2 x_{i2} + \dots + \beta_K x_{iK} \iff E(e_i) = 0$
- MR3. $\mathrm{var}(y_i) = \mathrm{var}(e_i) = \sigma^2$
- MR4. $\mathrm{cov}(y_i, y_j) = \mathrm{cov}(e_i, e_j) = 0$ for $i \ne j$
- MR5. The values of each $x_{ik}$ are not random and are not exact linear functions of the other explanatory variables.
- MR6. $y_i \sim N(\beta_1 + \beta_2 x_{i2} + \dots + \beta_K x_{iK},\ \sigma^2) \iff e_i \sim N(0, \sigma^2)$

The statistical properties of $y$ follow from those of $e$. We also make two assumptions about the explanatory variables:

1. The explanatory variables are not random variables. We are assuming that the values of the explanatory variables are known to us prior to our observing the values of the dependent variable.
2. Any one of the explanatory variables is not an exact linear function of the others. This assumption is equivalent to assuming that no variable is redundant. If this assumption is violated (a condition called exact collinearity), the least squares procedure fails. When we have $(K-1)$ explanatory variables, we assume it is impossible to write, for any one variable $x_{ik}$,

$$x_{ik} = c_1 + \sum_{j=2,\ j \ne k}^{K} c_j x_{ij} \quad \text{for some values of } c_1, c_2, \dots, c_K.$$

For example, using the Hamburger Chain Data, we can set up the following model:

$$SALES = E(SALES) + e = \beta_1 + \beta_2\, PRICE + \beta_3\, ADVERT + e \qquad (3.3)$$

Figure 1: The multiple regression plane.

## 3.2 Estimating the Parameters of the Multiple Regression Model

We will discuss estimation in the context of the model in (3.3), which we repeat here for …
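The model and its assumptions can be sketched numerically. The following is a minimal simulation of the $K = 3$ model in (3.2), with least squares computed from the normal equations $(X'X)b = X'y$; the parameter values, regressor ranges, and sample size are illustrative assumptions, not Big Andy's actual data.

```python
import numpy as np

# Simulation sketch of the K = 3 model in (3.2):
#   y_i = beta1 + beta2*x_i2 + beta3*x_i3 + e_i,  e_i ~ N(0, sigma^2).
# All numbers below are hypothetical, not Big Andy's data.
rng = np.random.default_rng(42)
N = 1000
beta = np.array([100.0, -8.0, 2.0])   # hypothetical (beta1, beta2, beta3)
sigma = 5.0

x2 = rng.uniform(4.0, 7.0, N)         # PRICE-like regressor (MR5: treated as fixed)
x3 = rng.uniform(0.5, 3.0, N)         # ADVERT-like regressor
e = rng.normal(0.0, sigma, N)         # MR2, MR3, MR6: mean 0, constant variance, normal

X = np.column_stack([np.ones(N), x2, x3])   # x_1 = 1 carries the intercept beta1
y = X @ beta + e                            # MR1

# Least squares: solve the normal equations (X'X) b = X'y
b = np.linalg.solve(X.T @ X, X.T @ y)
print(np.round(b, 2))                       # close to (100, -8, 2) for large N
```

With $N = 1000$ observations the estimates land close to the true coefficients, which is the sense in which the assumptions MR1 to MR6 make $\beta_k$ recoverable from data.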


Fall '19