1. (10 pts.) The December 2003 issue of Kiplinger's Personal Finance contained information on a number of variables for 428 model year 2004 vehicles. Here is a scatterplot of Highway Miles per Gallon (MPG) against Reciprocal Weight (1/tons) for these vehicles:

[Scatterplot: Highway MPG (vertical axis) versus Reciprocal Weight in 1/tons (horizontal axis, roughly 0.2 to 1.1)]

Furthermore, here is some summary information on these two variables:

Variable            Mean      StDev
MPG                 26.906    5.697
Reciprocal Weight   0.58343   0.12053

Correlation of MPG and Reciprocal Weight = 0.821

a. Give the least squares equation for MPG in terms of Reciprocal Weight. Make sure you write your equation in terms of MPG and Reciprocal Weight (and not, say, in terms of generic 'x' and 'y').

Solution: slope = r × (StDev of MPG / StDev of Reciprocal Weight) = 0.821 × (5.697 / 0.12053) ≈ 38.81, and intercept = 26.906 − 38.81 × 0.58343 ≈ 4.27, so

MPG-hat = 4.27 + 38.81 × (Reciprocal Weight)

b. The Honda Civic HX, one of the 428 cars in the dataset, gets 44 Highway MPG and has a Reciprocal Weight of 0.8000 1/tons. Using part a, find the predicted MPG for the Honda Civic HX.

Solution: Plugging Reciprocal Weight = 0.8000 into the equation from part a (keeping the unrounded coefficients) gives

MPG-hat ≈ 35.31 miles/gallon.

c. Find the residual for the Honda Civic HX.

Solution: residual = actual − predicted = 44 − 35.31 = 8.69 miles/gallon.

2. (35 pts.) Here is a plot of the (average) January Temperature (in degrees Fahrenheit) against Latitude for 56 U.S. cities* over the time period 1931-1960:

[Scatterplot: January Temperature versus Latitude (roughly 30 to 50 degrees), with Minneapolis labeled as a low-temperature point]

So, at first glance, there appears to be a linear relationship between January Temperature and Latitude.
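The arithmetic in problem 1 can be reproduced directly from the printed summary statistics. A minimal sketch in plain Python (the numbers are the ones given in the table above):

```python
# Least-squares line from summary statistics (problem 1):
#   slope b1 = r * (s_y / s_x),  intercept b0 = ybar - b1 * xbar

# Summary statistics from the table above
mean_mpg, sd_mpg = 26.906, 5.697      # response: Highway MPG
mean_rw, sd_rw = 0.58343, 0.12053     # predictor: Reciprocal Weight (1/tons)
r = 0.821                             # correlation

b1 = r * (sd_mpg / sd_rw)             # slope
b0 = mean_mpg - b1 * mean_rw          # intercept
print(f"MPG-hat = {b0:.2f} + {b1:.2f} * (Reciprocal Weight)")

# Part b: predicted MPG for the Honda Civic HX (Reciprocal Weight = 0.8000)
pred = b0 + b1 * 0.8000
# Part c: residual = actual - predicted (actual Highway MPG = 44)
resid = 44 - pred
print(f"predicted = {pred:.2f} MPG, residual = {resid:.2f} MPG")
```

Keeping the coefficients unrounded until the final step avoids the small rounding drift you get by plugging 4.27 and 38.81 back in by hand.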
Here is a portion of the Minitab output for fitting the least-squares line:

The regression equation is
January Temperature = 109 - 2.11 Latitude

Predictor   Coef      StDev    T       P
Constant    108.728   7.056    15.41   0.000
Latitude    -2.1096   0.1794   -11.76  0.000

S = 7.156   R-Sq = 71.9%   R-Sq(adj) = 71.4%

* Rapid City has an approximate latitude of 44.08 and, during the time period in question, had an average January temperature of 23.2 degrees Fahrenheit according to the booklet Rapid City Climate, 1986, by James Miller.

a. How large an error, typically, is there between the actual January Temperature and the predicted January Temperature?

Solution: The typical prediction error is estimated by s = 7.156 degrees Fahrenheit.

Because the associated residual plot showed a hint of a parabolic (i.e. quadratic) pattern, a data analyst decided to fit January Temperature as quadratic in Latitude (i.e. the analyst added the variable Latitude²). Here is the resulting fit by Minitab:

The regression equation is
January Temperature = 204 - 7.28 Latitude + 0.0688 Latitude Squared

Predictor          Coef      StDev     T      P
Constant           203.62    36.16     5.63   0.000
Latitude           -7.278    1.943     -3.75  0.000
Latitude Squared   0.06885   0.02578   2.67   0.010

S = 6.781   R-Sq = 75.2%   R-Sq(adj) = 74.3%

b. In this particular case the R² value increased from the previous model (from 71.9% to 75.2%). Is this always the case when we add an additional variable to a model? (Just answer 'Yes' or 'No'.)

Solution: Yes.

c. In this particular case the adjusted R² value increased (from 71.4% to 74.3%). Is this always the case when we add an additional variable to a model? (Just answer 'Yes' or 'No'.)

Solution: No.

d. On the basis of the standard error only, which model should we choose to use (circle one)?
i. January Temperature is linear in Latitude
ii. January Temperature is quadratic in Latitude (circled: the quadratic model has the smaller s, 6.781 vs. 7.156)

e. On the basis of adjusted R² only, which model should we choose to use (circle one)?
i. January Temperature is linear in Latitude
ii. January Temperature is quadratic in Latitude (circled: the quadratic model has the larger adjusted R², 74.3% vs. 71.4%)

f. The last row of the Minitab output above concerns the hypothesis test

H0: β2 = 0
HA: β2 ≠ 0

in the regression model January Temperature_i = β0 + β1(Latitude_i) + β2(Latitude_i²) + ε_i (with the usual assumptions about the errors). Sketch the t curve for this line of output showing how the values -2.67, 2.67, and 0.010 are related. Also on your sketch, indicate the "degrees of freedom" for your t curve.

Solution: [Sketch: a t curve with n − 3 = 56 − 3 = 53 degrees of freedom; the values −2.67 and 2.67 are marked on the horizontal axis, and the two shaded tail areas (below −2.67 and above 2.67) together equal the p-value, 0.010.]
g. For the hypothesis test

H0: β2 = 0
HA: β2 ≠ 0

we had a p-value of 0.010. Do we accept the null hypothesis or reject the null hypothesis for an α of 0.05?

Solution: Reject H0, since the p-value (0.010) is less than α (0.05).

3. (15 pts.) Two objects of unknown weights w1 and w2 are weighed on an error-prone pan balance in the following way:

(1) Object 1 is weighed by itself and the measurement is 10 grams,
(2) Object 2 is weighed by itself and the measurement is 5 grams,
(3) The difference of the weights (the weight of object 1 minus the weight of object 2) is measured by placing the objects in different pans, and the result is 4 grams,
(4) The sum of the weights is measured as 11 grams.

Find the least squares estimates of w1 and w2 using the matrix expression β̂ = (X'X)⁻¹X'y.
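The matrix arithmetic for β̂ = (X'X)⁻¹X'y is small enough to check numerically. A sketch in plain Python, with each of the four weighings encoded as a row of the design matrix:

```python
# Each weighing is a row of X: the coefficients of (w1, w2); y holds the measurements.
X = [(1, 0),    # (1) object 1 alone: 10 g
     (0, 1),    # (2) object 2 alone:  5 g
     (1, -1),   # (3) difference w1 - w2: 4 g
     (1, 1)]    # (4) sum w1 + w2: 11 g
y = [10, 5, 4, 11]

# X'X (2x2) and X'y (2x1)
xtx = [[sum(r[i] * r[j] for r in X) for j in range(2)] for i in range(2)]
xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(2)]

# Invert the 2x2 matrix X'X and apply it to X'y
a, b = xtx[0]
c, d = xtx[1]
det = a * d - b * c
w1 = (d * xty[0] - b * xty[1]) / det
w2 = (-c * xty[0] + a * xty[1]) / det
print(f"w1-hat = {w1:.4f} g, w2-hat = {w2:.4f} g")
```

Here X'X works out to 3I, so the inversion is trivial: each estimate is just the corresponding entry of X'y divided by 3.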
Solution: The four measurements give

X = [ 1   0 ]        y = [ 10 ]
    [ 0   1 ]            [  5 ]
    [ 1  -1 ]            [  4 ]
    [ 1   1 ]            [ 11 ]

so X'X = [ 3 0 ; 0 3 ], (X'X)⁻¹ = (1/3)I, and X'y = (25, 12)', giving

ŵ1 = 25/3 ≈ 8.33 grams,   ŵ2 = 12/3 = 4 grams.

4. (15 pts.) Suppose ... [the remainder of this problem is cut off in the preview].

5. (13 pts.) Suppose we would like to check if the 4 values 52.4, 43.4, 58.9, 46.1 could have reasonably come from some normal. Give me the 4 ordered pairs (x, y) which are plotted to produce the normal probability plot.

Solution: Order the data: 43.4, 46.1, 52.4, 58.9. Using the plotting positions i/(n + 1) = 0.2, 0.4, 0.6, 0.8, the corresponding standard normal quantiles are −0.84, −0.25, 0.25, 0.84, so the plotted pairs are

(−0.84, 43.4), (−0.25, 46.1), (0.25, 52.4), (0.84, 58.9).
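The pairs in problem 5 can be generated mechanically. A sketch in plain Python, using the i/(n + 1) plotting-position convention that matches the quantiles −0.84, −0.25, 0.25, 0.84 above (other conventions, such as (i − 0.5)/n, give slightly different x-values):

```python
from statistics import NormalDist

data = [52.4, 43.4, 58.9, 46.1]
ordered = sorted(data)   # 43.4, 46.1, 52.4, 58.9
n = len(ordered)

# Plotting positions i/(n+1) and the corresponding standard normal quantiles
pairs = [(NormalDist().inv_cdf(i / (n + 1)), x)
         for i, x in enumerate(ordered, start=1)]
for z, x in pairs:
    print(f"({z:+.2f}, {x})")
```

If the points fall roughly on a straight line, a normal model is plausible for the data.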
6. (12 pts.) Multiple Choice (circle the single best response in each case, no reasoning necessary):

a. Using a least-squares model for predicting a response variable for values of the predictor variables outside the domain used to build the model can lead to
i. extrapolation error (circled)
ii. over-fitting the model
iii. linear association

b. The correlation coefficient is a measure of
i. extrapolation error
ii. over-fitting the model
iii. linear association (circled)

c. When a least-squares model fits the data collected well but otherwise seems to have poor predictive power this may be a consequence of
i. the extrapolation phenomena
ii. over-fitting the model (circled)
iii. linear association

d. The least squares procedure
i. minimizes the sum of residuals
ii. minimizes the sum of squared residuals (circled)
iii. maximizes the sum of squared residuals
iv. minimizes the sum of absolute residuals

e. (AB)' = B'A' is
i. always true (circled)
ii. sometimes true
iii. never true

f. ... [the remainder of this question is cut off in the preview]
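For part e, the identity (AB)' = B'A' holds for any conformable matrices. A quick numerical spot-check in plain Python (a single example illustrates the identity but of course does not prove it; the matrices here are arbitrary):

```python
def matmul(A, B):
    # Naive matrix product of A (m x k) and B (k x n)
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(M):
    return [list(row) for row in zip(*M)]

A = [[1, 2, 3],
     [4, 5, 6]]    # 2x3
B = [[7, 8],
     [9, 10],
     [11, 12]]     # 3x2

lhs = transpose(matmul(A, B))             # (AB)'
rhs = matmul(transpose(B), transpose(A))  # B'A'
print(lhs == rhs)   # True
```

Note that the order of the factors must reverse: B'A' is a product of a 2x3 and a 3x2 matrix, while A'B' would not even be conformable here.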

Spring '04 · JOHNSON · Math, Statistics, Probability