# Week 8 - Multiple Regression and Reliability (Engr 9397)
## Multiple Regression

In multiple linear regression, the dependent variable $y$ continues to assume random values, but we are now dealing with $k$ fixed (non-random) independent variables, the regressors. In simple regression we used the least squares method to find the line of best fit to the data; in multiple regression, where more than two variables are involved, we apply the method of least squares to fit a hyperplane to the set of data points, so that $y$ can be predicted for a given set of values of $x_1, x_2, \dots, x_k$ with minimum error.
## Multiple Regression (continued)

The fitted surface has the equation

$$\hat{y} = b_0 + b_1 x_1 + b_2 x_2 + \dots + b_k x_k$$

which predicts the value of $y$ when the independent $x$ variables take on a given set of values. The coefficients $b_0, b_1, \dots, b_k$ can be found by the least squares method (complicated without the help of software).
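Fitting the hyperplane by least squares is a one-call job in most software. Here is a minimal sketch in Python using NumPy, with made-up data (the variable values below are illustrative, not from the lecture example):

```python
import numpy as np

# Hypothetical data: 6 runs, k = 2 regressors (illustrative numbers only)
X = np.array([[1.0, 2.0],
              [2.0, 1.5],
              [3.0, 3.0],
              [4.0, 2.5],
              [5.0, 4.0],
              [6.0, 3.5]])
y = np.array([2.1, 2.9, 4.2, 4.8, 6.1, 6.5])

# Prepend a column of ones so the intercept b0 is estimated too
A = np.column_stack([np.ones(len(X)), X])

# Least squares: minimizes the sum of squared residuals ||A b - y||^2
b, *_ = np.linalg.lstsq(A, y, rcond=None)

y_hat = A @ b          # fitted values on the hyperplane
residuals = y - y_hat  # with an intercept, these sum to (numerically) zero
```

Because the model includes an intercept, the residuals always sum to zero; that property is a quick sanity check on any least-squares fit.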

## Multiple Regression: Example

Let's look at a sample Minitab result and go through what it means.

Example: five variables (conveyor angle, temperature, flux concentration, conveyor speed, and preheat temperature) involved in the soldering process of a circuit-board setup are measured over 25 separate runs of 5 boards. Each board contains 460 solder joints. We are interested in the effects of these five variables on the number of defective solder joints per 100 joints inspected.
## Multiple Regression Analysis

We get the following results from a linear regression analysis:

```
The regression equation is
C6 = 1.79 + 0.214 C1 + 0.0096 C2 + 0.9 C3 + 0.122 C4 + 0.000169 C5

Predictor   Coefficient   Std Dev Coef   T-Ratio (Coef/Std Dev)
Constant    1.7885        0.9655         1.85
C1          0.21357       0.0363         5.88
C2          0.00959       0.001873       0.51
C3          0.898         1.047          0.86
C4          0.1216        0.2167         0.56
C5          0.0001695     0.0009457      0.18

s = 0.05806    R-squared = 73.6%

Row   C1    C6      Pred Y   Std Dev   Residual   St.Resid
22    6.7   0.287   0.1104   0.022     0.1766     3.29R
```
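The T-Ratio column is just each coefficient divided by its standard deviation, which is then compared against a critical t value. A quick Python check of the table above, using the standard table value t(0.025, 19) = 2.093 (25 runs minus 6 estimated coefficients leaves 19 residual degrees of freedom; note that the C2 entries as printed divide to about 5.12 rather than the 0.51 shown, so one digit was likely garbled in the scanned output):

```python
# Coefficient / standard-deviation pairs copied from the Minitab output above
table = {
    "Constant": (1.7885, 0.9655),
    "C1": (0.21357, 0.0363),
    "C2": (0.00959, 0.001873),
    "C3": (0.898, 1.047),
    "C4": (0.1216, 0.2167),
    "C5": (0.0001695, 0.0009457),
}

# Two-sided critical value t(0.025, 19) for a 5% significance level
T_CRIT = 2.093

for name, (coef, se) in table.items():
    t_ratio = coef / se
    verdict = "significant" if abs(t_ratio) > T_CRIT else "not significant"
    print(f"{name:8s} t = {t_ratio:6.2f}  ({verdict})")
```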

## Regression Analysis Result Interpretation

- Regression equation: the fitted linear least-squares equation; it gives the coefficient values.
- Table of coefficients: gives each coefficient value and the t value needed to perform a significance test on that coefficient; each coefficient's t value can be compared to $t_{crit}$ at a given level of significance and degrees of freedom.
- $s$: the standard deviation of the residuals, estimated from the mean squared deviation of the $y$ values from the least-squares equation.
## R-Squared Value

Similar to $r$ in simple regression, $R$ is the multiple correlation coefficient. $R^2$ measures the strength of the linear relationship between the $x$'s and the dependent variable $y$ by comparing the variance of the $y$ values with the variance of the residuals. $R^2$ is expressed as the percentage reduction in $y$ variance that is attributable to the relationship between $y$ and the predictor variables ($x_i$).
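The comparison R² makes can be sketched in a few lines. With made-up observed and fitted values (illustrative only, not from the lecture example):

```python
import numpy as np

# Hypothetical observed and fitted y values (illustrative only)
y     = np.array([2.0, 3.1, 4.2, 4.9, 6.1])
y_hat = np.array([2.2, 3.0, 4.0, 5.1, 6.0])

ss_tot = np.sum((y - y.mean()) ** 2)   # total variation in y
ss_res = np.sum((y - y_hat) ** 2)      # variation left in the residuals
r_squared = 1.0 - ss_res / ss_tot      # fraction of y variance explained

print(f"R^2 = {100 * r_squared:.1f}%")
```

A value like the 73.6% in the Minitab output above means the fitted equation accounts for 73.6% of the variance in the defect counts, leaving 26.4% in the residuals.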

## Correlation Among x Variables

The coefficients are subject to random error, error that can be quantified by a confidence interval (assuming that the residuals are normally distributed).
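Assuming normally distributed residuals, a 95% confidence interval for a coefficient $b_i$ is $b_i \pm t_{crit} \cdot se(b_i)$. As a sketch, using the C1 row from the Minitab output above and the standard table value t(0.025, 19) = 2.093:

```python
# C1 coefficient and its standard deviation from the Minitab output above
b1, se = 0.21357, 0.0363

# Two-sided critical value t(0.025, 19) for 19 residual degrees of freedom
t_crit = 2.093

half_width = t_crit * se
lower, upper = b1 - half_width, b1 + half_width
print(f"95% CI for b1: ({lower:.4f}, {upper:.4f})")
```

Because this interval excludes zero, it agrees with the earlier t-test: C1 (conveyor angle) has a statistically significant effect at the 5% level.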

This note was uploaded on 03/29/2011 for the course ENGR 9397 taught by Professor Susan Hunt during the Winter '11 term at Memorial University.
