l02 - Review of Multiple Regression

Review of Multiple Regression

Assumptions about prior knowledge. This handout attempts to summarize and synthesize the basics of Multiple Regression that should have been learned in an earlier statistics course. It is therefore assumed that most of this material is indeed "review" for the reader. (Don't worry too much if some items aren't review; I know that different instructors cover different things, and many of these topics will be covered again as we go through the semester.) Those wanting more detail and worked examples should look at my course notes for Grad Stats I. Basic concepts such as means, standard deviations, correlations, expectations, probability, and probability distributions are not reviewed.

In general, I present formulas either because I think they are useful to know or because I think they help illustrate key substantive points. For many people, formulas can help to make the underlying concepts clearer; if you aren't one of them, you will probably still be OK.

Linear regression model

    Y_j = α + β_1*X_1j + β_2*X_2j + ... + β_k*X_kj + ε_j
        = α + Σ_{i=1}^{k} β_i*X_ij + ε_j
        = E(Y_j | X) + ε_j

β_i = the partial slope coefficient (also called partial regression coefficient, metric coefficient). It represents the change in E(Y) associated with a one-unit increase in X_i when all other IVs are held constant.

α = the intercept. Geometrically, it represents the value of E(Y) where the regression surface (or plane) crosses the Y axis. Substantively, it is the expected value of Y when all the IVs equal 0.

ε_j = the deviation of the value Y_j from the mean value of the distribution given X. This error term may be conceived as representing (1) the effects on Y of variables not explicitly included in the equation, and (2) a residual random element in the dependent variable.

Parameter estimation (Metric Coefficients): In most situations, we are not in a position to determine the population parameters directly. Instead, we must estimate their values from a finite sample from the population. The sample regression model is written as

    Y_j = a + b_1*X_1j + b_2*X_2j + ... + b_k*X_kj + e_j
        = a + Σ_{i=1}^{k} b_i*X_ij + e_j
        = Ŷ_j + e_j

where a is the sample estimate of α and b_k is the sample estimate of β_k.
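The population/sample distinction above can be made concrete with a small simulation. The sketch below (not part of the original handout; the variable names and parameter values are illustrative assumptions) draws data from a known population model and shows that the sample estimates a and b_k recover α and β_k approximately:

```python
import numpy as np

# Illustrative sketch: simulate data from a known population model
# Y = alpha + beta1*X1 + beta2*X2 + e, then estimate a and b_k from the sample.
rng = np.random.default_rng(0)
n = 1000
alpha, beta1, beta2 = 2.0, 1.5, -0.5            # population parameters (assumed)
X1 = rng.normal(size=n)
X2 = rng.normal(size=n)
e = rng.normal(size=n)                          # error term epsilon_j
Y = alpha + beta1 * X1 + beta2 * X2 + e

# Sample regression model: Y_j = a + b1*X1j + b2*X2j + e_j
X = np.column_stack([np.ones(n), X1, X2])       # design matrix with intercept column
coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
a, b1, b2 = coef                                # sample estimates of alpha, beta1, beta2
```

With n = 1000 the estimates land close to the population values; in a real analysis, of course, the population parameters are unknown and only the sample estimates are observed.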

Computation of b_k

    Case        Formula                  Comments
    All cases   β = (X'X)^(-1) X'Y       This is the general formula, but it requires knowledge of matrix algebra to understand, which I won't assume you have.
    1 IV case   b = s_xy / s_x^2         Sample covariance of X and Y divided by the sample variance of X.

Computation of a (all cases)

    a = ȳ − Σ_{k=1}^{K} b_k * x̄_k

Compute the betas first. Then multiply each beta times the mean of the corresponding X variable and sum the results. Subtract that sum from the mean of Y.
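The formulas in the table can be checked numerically. The sketch below (hypothetical data, not from the handout) computes the 1-IV slope as s_xy / s_x^2, the intercept as ȳ − b*x̄, and confirms both against the general matrix formula (X'X)^(-1) X'Y:

```python
import numpy as np

# Hypothetical data for checking the formulas above.
rng = np.random.default_rng(1)
x = rng.normal(size=200)
y = 3.0 + 2.0 * x + rng.normal(size=200)

# 1 IV case: b = sample covariance of X and Y / sample variance of X
b = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)

# Intercept (all cases): a = ybar - sum of b_k * xbar_k
a = y.mean() - b * x.mean()

# Cross-check against the general matrix formula (X'X)^{-1} X'Y
X = np.column_stack([np.ones_like(x), x])
beta = np.linalg.inv(X.T @ X) @ X.T @ y   # [intercept, slope]
assert np.allclose([a, b], beta)
```

Both routes give identical answers, which is the point of the "All cases" row: the matrix formula subsumes the simpler special-case formulas.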

This note was uploaded on 02/29/2012 for the course SOC 63993, taught by Professor Richard Williams during the Spring '11 term at Notre Dame.


