# Probability and Statistics for Engineering and the Sciences (with CD-ROM and InfoTrac)

We also assume that the gross sale $Y_i$ of the $i$-th salesperson after the training follows a normal distribution, i.e. $Y_i \sim N(\mu_a, \sigma_a^2)$.

**Inference setup (10 pts).** The null hypothesis of interest is $H_0: \mu_a = \mu_b$ vs. $H_1: \mu_a > \mu_b$. The appropriate test statistic is the one-sample $t$-statistic (this is not a two-sample $t$-test, since $X_i$ and $Y_i$ are paired and correlated). Note that the sales difference $W_i = Y_i - X_i \sim N(\mu_a - \mu_b, \sigma^2)$ for some $\sigma$ (figure out the relationship between $\sigma$, $\sigma_a$, and $\sigma_b$). So the $t$-statistic is $T = \bar{W}/(S/\sqrt{6})$, where $S$ is the sample standard deviation of $W$, and $T \sim t_5$.

**Actual computation (10 pts).** You need to compute $S$ by yourself; the actual R output, either `pt` or `qt`, will be given.

3. Consider a linear model $Y = \beta_0 + \beta_1 x + \varepsilon$. There is reason to believe that $\beta_1 = 0$. The following sample data are given: $(x_1, y_1), \dots, (x_n, y_n)$. We assume $\varepsilon$ follows a zero-mean normal distribution.

(a) Estimate $\beta_0$ by minimizing the sum of squared residuals. No points are given if the minimization steps are not shown (10 pts).

(b) Determine whether the estimator in (a) is an unbiased maximum likelihood estimator. Please derive everything (20 pts).

(c) When $n = 2$, show that the estimator in (a) is the minimum variance unbiased estimator (MVUE) among estimators of the form $c_1 Y_1 + c_2 Y_2$ (10 pts).

**Solution.** This problem tests your overall understanding of the linear model and estimation. Since $\beta_1 = 0$, we are dealing with a simpler version of the linear model, $Y = \beta_0 + \varepsilon$.

(a) The sum of squared residuals is $SSE = \sum_{i=1}^{n}(y_i - \beta_0)^2$ (3 pts). Solving $\partial SSE/\partial \beta_0 = -2\sum_{i=1}^{n}(y_i - \beta_0) = 0$ gives $\hat{\beta}_0 = \bar{y}$ (7 pts).

(b) The estimator is $\hat{\beta}_0 = \bar{Y}$ (3 pts). Do not mix up the lowercase $y_i$ and the uppercase $Y_i$; it will cost you 2 points. It can then be shown that $E\hat{\beta}_0 = \beta_0$ (see the Lecture 20 derivations) (7 pts). We also need to check whether $\hat{\beta}_0$ is the MLE. Note that $Y_i \sim N(\beta_0, \sigma^2)$ for some $\sigma$
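The paired-$t$ computation for problem 2 above can be sketched in code. The course uses R (`pt`/`qt`), but here is a Python analogue; the sales figures below are made-up illustrative numbers, not data from the assignment:

```python
import math
from statistics import mean, stdev

# Illustrative paired sales figures for n = 6 salespeople (assumed data):
# x = gross sales before training, y = gross sales after training.
x = [48.0, 55.0, 57.0, 45.0, 39.0, 51.0]
y = [52.0, 60.0, 63.0, 43.0, 46.0, 56.0]

# Paired differences W_i ("after" minus "before").
w = [yi - xi for xi, yi in zip(x, y)]
n = len(w)                       # n = 6, so T has n - 1 = 5 degrees of freedom
w_bar = mean(w)                  # sample mean of the differences
s = stdev(w)                     # sample standard deviation of W (n - 1 denominator)
t_stat = w_bar / (s / math.sqrt(n))

# One-sided test at level 0.05: reject H0 when t_stat exceeds the upper
# 5% point of t_5, about 2.015 (what R's qt(0.95, 5) returns).
reject = t_stat > 2.015
print(round(t_stat, 3), reject)  # → 3.201 True
```

With these numbers the differences average $\bar{w} = 25/6 \approx 4.17$ and the statistic lands at about 3.20, well past the critical value, so $H_0$ would be rejected.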
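For part (c), the variance-minimization step can be sketched as follows (assuming $Y_1$ and $Y_2$ are independent with common variance $\sigma^2$, as in the model):

```latex
% Unbiasedness of c_1 Y_1 + c_2 Y_2 forces the weights to sum to one:
E[c_1 Y_1 + c_2 Y_2] = (c_1 + c_2)\beta_0 = \beta_0
  \quad\text{for all } \beta_0 \iff c_1 + c_2 = 1.
% With Y_1, Y_2 independent, the variance of the estimator is
\operatorname{Var}(c_1 Y_1 + c_2 Y_2) = (c_1^2 + c_2^2)\,\sigma^2
  = \bigl(c_1^2 + (1 - c_1)^2\bigr)\sigma^2.
% Setting the derivative with respect to c_1 to zero:
\frac{d}{dc_1}\bigl(c_1^2 + (1 - c_1)^2\bigr) = 4c_1 - 2 = 0
  \implies c_1 = c_2 = \tfrac{1}{2},
% so the minimum-variance unbiased choice is (Y_1 + Y_2)/2 = \bar{Y} = \hat\beta_0.
```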