4. Straight-line regression through the origin:
In this question we shall make the following assumptions:
(1) Y is related to x by the simple linear regression model Yᵢ = βxᵢ + eᵢ (i = 1, 2, ..., n), i.e., E(Y | X = xᵢ) = βxᵢ
(2) The errors e₁, e₂, ..., eₙ are independent of each other
(3) The errors e₁, e₂, ..., eₙ have a common variance σ²
(4) The errors are normally distributed with a mean of 0 and variance σ² (especially important when the sample size is small), i.e., e | X ~ N(0, σ²)
In addition, since the regression model is conditional on X, we can assume that the values of the predictor variable, x₁, x₂, ..., xₙ, are known fixed constants.
(a) Show that the least squares estimate of β is given by β̂ = Σᵢ xᵢYᵢ / Σᵢ xᵢ² (sums over i = 1, ..., n).
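As a quick numerical sanity check (not a proof), the closed-form through-origin estimate Σᵢ xᵢYᵢ / Σᵢ xᵢ² can be compared against a generic least-squares solver fitted with no intercept. The data below (true slope 2.5, unit error variance) are illustrative values chosen for the check, not part of the exercise:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x = rng.uniform(1.0, 10.0, size=n)        # fixed predictor values x_1, ..., x_n
beta = 2.5                                 # assumed true slope for the check
y = beta * x + rng.normal(0.0, 1.0, n)     # Y_i = beta * x_i + e_i

# Closed-form least squares estimate for regression through the origin
beta_hat = np.sum(x * y) / np.sum(x * x)

# Generic least-squares fit of y on x with no intercept column
beta_lstsq = np.linalg.lstsq(x.reshape(-1, 1), y, rcond=None)[0][0]

# The two estimates agree to machine precision
assert np.isclose(beta_hat, beta_lstsq)
print(beta_hat)
```

Minimizing Σᵢ (Yᵢ − βxᵢ)² over β is a one-parameter problem, so the solver and the closed form must coincide.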
(b) Under the above assumptions show that
(i) E(β̂ | X) = β,
(ii) Var(β̂ | X) = σ² / Σᵢ xᵢ².
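The two claims in (b) can be checked empirically by Monte Carlo: holding x fixed and redrawing the errors many times, the mean of the replicated β̂ values should approach β and their variance should approach σ² / Σᵢ xᵢ². The sample size, slope, and error standard deviation below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n, beta, sigma = 20, 2.5, 1.5             # illustrative values, not from the exercise
x = rng.uniform(1.0, 10.0, size=n)        # x_i treated as known fixed constants
Sxx = np.sum(x * x)

reps = 200_000
# One row per replication: Y = beta * x + e with fresh N(0, sigma^2) errors
e = rng.normal(0.0, sigma, size=(reps, n))
Y = beta * x + e
beta_hats = (Y @ x) / Sxx                 # beta_hat computed for each replication

print(beta_hats.mean())                   # close to beta          (unbiasedness)
print(beta_hats.var())                    # close to sigma**2/Sxx  (conditional variance)
```

Because β̂ = Σᵢ xᵢYᵢ / Σᵢ xᵢ² is linear in the Yᵢ, both properties follow directly from E(Yᵢ | X) = βxᵢ and Var(Yᵢ | X) = σ², which is what the simulation reflects.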