
17.5    Weighted Least Squares	 745

	17.9	Prove Equation (17.16) under Assumptions #1 and #2 of Key Concept 17.1, plus the assumption that $X_i$ and $u_i$ have eight moments.

	17.10	Let $\hat{\theta}_n$ be an estimator of the parameter $\theta$, where $\hat{\theta}_n$ might be biased. Show that if $E[(\hat{\theta}_n - \theta)^2] \to 0$ as $n \to \infty$ (that is, the mean squared error of $\hat{\theta}_n$ tends to zero), then $\hat{\theta}_n \xrightarrow{p} \theta$. [Hint: Use Equation (17.43) with $W = \hat{\theta}_n - \theta$.]
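The result in Exercise 17.10 can be probed numerically. The sketch below is an illustration, not a proof: it uses the sample mean of i.i.d. normal draws (an assumed example; `theta`, `sigma`, and `eps` are arbitrary choices), whose MSE is $\sigma^2/n \to 0$, and checks that the empirical probability of missing $\theta$ by more than $\varepsilon$ shrinks as $n$ grows.

```python
# Simulation sketch of Exercise 17.10 (illustration only, not a proof):
# the sample mean of i.i.d. N(theta, sigma^2) draws has MSE sigma^2/n -> 0,
# so P(|estimator - theta| > eps) should shrink toward 0 as n grows.
import numpy as np

rng = np.random.default_rng(0)
theta, sigma, eps, reps = 2.0, 3.0, 0.5, 2000   # assumed values for illustration

def exceed_prob(n):
    # Fraction of replications in which the estimator misses theta by more than eps.
    draws = rng.normal(theta, sigma, size=(reps, n))
    estimates = draws.mean(axis=1)
    return np.mean(np.abs(estimates - theta) > eps)

p_small, p_large = exceed_prob(10), exceed_prob(1000)
print(p_small, p_large)  # the exceedance probability falls as n grows
```

The hint's Equation (17.43) (Chebyshev's inequality) makes this precise: the exceedance probability is bounded by the MSE divided by $\varepsilon^2$.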

	17.11	Suppose that $X$ and $Y$ are distributed bivariate normal with density given in Equation (17.38).

	 a.	Show that the density of $Y$ given $X = x$ can be written as

		$$f_{Y \mid X = x}(y) = \frac{1}{\sigma_{Y \mid X}\sqrt{2\pi}} \exp\left[-\frac{1}{2}\left(\frac{y - \mu_{Y \mid X}}{\sigma_{Y \mid X}}\right)^2\right],$$

		where $\sigma_{Y \mid X} = \sqrt{\sigma_Y^2(1 - \rho_{XY}^2)}$ and $\mu_{Y \mid X} = \mu_Y + (\sigma_{XY}/\sigma_X^2)(x - \mu_X)$. [Hint: Use the definition of the conditional probability density $f_{Y \mid X = x}(y) = g_{X,Y}(x, y)/f_X(x)$, where $g_{X,Y}$ is the joint density of $X$ and $Y$, and $f_X$ is the marginal density of $X$.]

	 b.	Use the result in (a) to show that $Y \mid X = x \sim N(\mu_{Y \mid X}, \sigma_{Y \mid X}^2)$.

	 c.	Use the result in (b) to show that $E(Y \mid X = x) = a + bx$ for suitably chosen constants $a$ and $b$.
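The conditional-mean formula in Exercise 17.11 can be checked by simulation. The sketch below uses assumed parameter values (the means, standard deviations, and correlation are arbitrary choices for illustration): it draws bivariate normal pairs, averages $Y$ over draws with $X$ near a chosen $x_0$, and compares the result with $\mu_Y + (\sigma_{XY}/\sigma_X^2)(x_0 - \mu_X)$.

```python
# Numerical sketch of Exercise 17.11 (assumed parameters, not a proof):
# for bivariate normal (X, Y), E(Y | X = x) should equal
# mu_Y + (sigma_XY / sigma_X^2) * (x - mu_X), a linear function a + b*x.
import numpy as np

rng = np.random.default_rng(1)
mu_X, mu_Y = 1.0, 2.0                 # assumed values for illustration
sigma_X, sigma_Y, rho = 1.5, 2.0, 0.6
sigma_XY = rho * sigma_X * sigma_Y

cov = [[sigma_X**2, sigma_XY], [sigma_XY, sigma_Y**2]]
X, Y = rng.multivariate_normal([mu_X, mu_Y], cov, size=500_000).T

x0 = 2.0                              # condition on X falling near x0
band = np.abs(X - x0) < 0.05
empirical = Y[band].mean()
theoretical = mu_Y + (sigma_XY / sigma_X**2) * (x0 - mu_X)
print(empirical, theoretical)
```

Repeating this for several values of $x_0$ traces out the linear function $a + bx$ asked for in part (c), with $b = \sigma_{XY}/\sigma_X^2$ and $a = \mu_Y - b\,\mu_X$.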

	17.12	 a.	Suppose that $u \sim N(0, \sigma_u^2)$. Show that $E(e^u) = e^{\frac{1}{2}\sigma_u^2}$.

	 b.	Suppose that the conditional distribution of $u$ given $X = x$ is $N(0, a + bx^2)$, where $a$ and $b$ are positive constants. Show that $E(e^u \mid X = x) = e^{\frac{1}{2}(a + bx^2)}$.
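The lognormal-mean identity in Exercise 17.12(a) is easy to verify by Monte Carlo. The sketch below uses an assumed value of $\sigma_u$ (any positive value works) and compares the simulated mean of $e^u$ with the closed form $e^{\sigma_u^2/2}$.

```python
# Monte Carlo sketch of Exercise 17.12(a) (assumed sigma_u, not a proof):
# if u ~ N(0, sigma_u^2), then E(e^u) = exp(sigma_u^2 / 2), the lognormal mean.
import numpy as np

rng = np.random.default_rng(2)
sigma_u = 0.7                                   # assumed value for illustration
u = rng.normal(0.0, sigma_u, size=2_000_000)
mc_mean = np.exp(u).mean()
exact = np.exp(sigma_u**2 / 2)
print(mc_mean, exact)
```

Part (b) follows by applying the same identity conditionally, with $\sigma_u^2$ replaced by the conditional variance $a + bx^2$.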

	17.13	Consider the heterogeneous regression model $Y_i = \beta_{0i} + \beta_{1i}X_i + u_i$, where $\beta_{0i}$ and $\beta_{1i}$ are random variables that differ from one observation to the next. Suppose that $E(u_i \mid X_i) = 0$ and $(\beta_{0i}, \beta_{1i})$ are distributed independently of $X_i$.

	 a.	Let $\hat{\beta}_1^{OLS}$ denote the OLS estimator of $\beta_1$ given in Equation (17.2). Show that $\hat{\beta}_1^{OLS} \xrightarrow{p} E(\beta_1)$, where $E(\beta_1)$ is the average value of $\beta_{1i}$ in the population. [Hint: See Equation (13.10).]

	 b.	Suppose that $\mathrm{var}(u_i \mid X_i) = \theta_0 + \theta_1 X_i^2$, where $\theta_0$ and $\theta_1$ are known positive constants. Let $\hat{\beta}_1^{WLS}$ denote the weighted least squares estimator. Does $\hat{\beta}_1^{WLS} \xrightarrow{p} E(\beta_1)$? Explain.
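Part (b) of Exercise 17.13 can be probed by simulation before attempting the derivation. In the sketch below, the distributions of $X_i$, the random coefficients, and the values of $\theta_0$ and $\theta_1$ are all assumptions chosen for illustration; the WLS estimator divides the regression through by $\sqrt{\theta_0 + \theta_1 X_i^2}$ and is computed by least squares on the transformed data. The simulation suggests an answer but is no substitute for the proof.

```python
# Simulation sketch for Exercise 17.13(b) (assumed distributions; this probes,
# but does not prove, whether the WLS estimator converges to E(beta_1)).
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
theta0, theta1 = 1.0, 0.5            # known positive constants in var(u|X)
E_b0, E_b1 = 1.0, 2.0                # assumed population means of the coefficients

X = rng.normal(0.0, 1.0, n)
b0 = E_b0 + rng.normal(0.0, 0.3, n)  # (b0i, b1i) drawn independently of X
b1 = E_b1 + rng.normal(0.0, 0.3, n)
u = rng.normal(0.0, np.sqrt(theta0 + theta1 * X**2))
Y = b0 + b1 * X + u

w = 1.0 / np.sqrt(theta0 + theta1 * X**2)   # WLS: divide through by sd(u|X)
Xmat = np.column_stack([w, w * X])          # weighted regressors (constant, X)
coef, *_ = np.linalg.lstsq(Xmat, w * Y, rcond=None)
wls_b1 = coef[1]
print(wls_b1)   # compare with E_b1
```

Note that with heterogeneous coefficients the effective error is $v_i = (\beta_{0i} - E(\beta_0)) + (\beta_{1i} - E(\beta_1))X_i + u_i$, so $\mathrm{var}(v_i \mid X_i)$ differs from $\theta_0 + \theta_1 X_i^2$; the exercise asks what this implies for the WLS estimator.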

	17.14	Suppose that $Y_i$, $i = 1, 2, \dots, n$, are i.i.d. with $E(Y_i) = \mu$, $\mathrm{var}(Y_i) = \sigma^2$, and finite fourth moment. Show the following: