
744	 Chapter 17  The Theory of Linear Regression with One Regressor

	 b.	 Use the central limit theorem, the law of large numbers, and Slutsky's theorem to show that the final term in the equation converges in probability to zero.

	 c.	 Use the Cauchy–Schwarz inequality and the third least squares assumption in Key Concept 17.1 to prove that var(vᵢ) < ∞. Does the term (1/√n) Σᵢ₌₁ⁿ vᵢ / σᵥ satisfy the central limit theorem?

	 d.	 Apply the central limit theorem and Slutsky's theorem to obtain the result in Equation (17.12).
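The standardized sum in part (c) can be checked numerically. The sketch below (not part of the exercise; the distribution of vᵢ is an arbitrary choice for illustration) draws i.i.d. mean-zero vᵢ and verifies that (1/√n) Σ vᵢ / σᵥ has mean near 0 and variance near 1, as the central limit theorem predicts.

```python
# Monte Carlo sketch: the standardized term (1/sqrt(n)) * sum(v_i) / sigma_v
# should be approximately N(0, 1) when the v_i are i.i.d. with mean 0 and
# finite variance.  Here v_i = Z^2 - 1 with Z ~ N(0, 1), so var(v_i) = 2
# (an arbitrary, skewed choice to make the CLT do some work).
import math
import random

random.seed(0)
n, reps = 500, 2000
sigma_v = math.sqrt(2.0)  # var(Z^2 - 1) = 2 for Z ~ N(0, 1)

stats = []
for _ in range(reps):
    vs = [random.gauss(0, 1) ** 2 - 1 for _ in range(n)]  # i.i.d., mean zero
    stats.append(sum(vs) / (math.sqrt(n) * sigma_v))

mean = sum(stats) / reps
var = sum((s - mean) ** 2 for s in stats) / reps
print(round(mean, 2), round(var, 2))  # should be close to 0 and 1
```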

	 17.4	 Show the following results:

	 a.	 Show that √n(β̂₁ − β₁) →d N(0, a²), where a² is a constant, implies that β̂₁ is consistent. (Hint: Use Slutsky's theorem.)
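One possible route for part (a), sketched in outline: write the estimation error as a vanishing factor times the √n-scaled error, and apply Slutsky's theorem; since 1/√n → 0, the product converges in distribution to the constant 0, and convergence in distribution to a constant implies convergence in probability.

```latex
% Sketch: Slutsky's theorem applied to a product with a vanishing factor.
\[
  \hat\beta_1 - \beta_1
  = \frac{1}{\sqrt{n}} \cdot \Bigl[ \sqrt{n}\,(\hat\beta_1 - \beta_1) \Bigr]
  \;\xrightarrow{d}\; 0 \cdot Z = 0,
  \qquad Z \sim N(0, a^2),
\]
so \(\hat\beta_1 \xrightarrow{p} \beta_1\).
```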

	 b.	 Show that sû² / σᵤ² →p 1 implies that sû / σᵤ →p 1.

	 17.5	 Suppose that W is a random variable with E(W⁴) < ∞. Show that E(W²) < ∞.
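One possible argument, sketched here, uses the Cauchy–Schwarz inequality E|UV| ≤ √(E(U²)E(V²)) with U = W² and V = 1:

```latex
% Sketch via Cauchy-Schwarz with U = W^2 and V = 1.
\[
  E(W^2) = E(W^2 \cdot 1)
  \le \sqrt{E(W^4)\,E(1^2)}
  = \sqrt{E(W^4)} < \infty .
\]
```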

	 17.6	 Show that if β̂₁ is conditionally unbiased, then it is unbiased; that is, show that if E(β̂₁ | X₁, …, Xₙ) = β₁, then E(β̂₁) = β₁.

	 17.7	 Suppose that X and u are continuous random variables and (Xᵢ, uᵢ), i = 1, …, n, are i.i.d.

	 a.	 Show that the joint probability density function (p.d.f.) of (uᵢ, uⱼ, Xᵢ, Xⱼ) can be written as f(uᵢ, Xᵢ)f(uⱼ, Xⱼ) for i ≠ j, where f(uᵢ, Xᵢ) is the joint p.d.f. of uᵢ and Xᵢ.

	 b.	 Show that E(uᵢuⱼ | Xᵢ, Xⱼ) = E(uᵢ | Xᵢ)E(uⱼ | Xⱼ) for i ≠ j.
	 c.	 Show that E(uᵢ | X₁, …, Xₙ) = E(uᵢ | Xᵢ).
	 d.	 Show that E(uᵢuⱼ | X₁, X₂, …, Xₙ) = E(uᵢ | Xᵢ)E(uⱼ | Xⱼ) for i ≠ j.

	 17.8	 Consider the regression model in Key Concept 17.1, and suppose that Assumptions #1, #2, #3, and #5 hold. Suppose that Assumption #4 is replaced by the assumption that var(uᵢ | Xᵢ) = θ₀ + θ₁|Xᵢ|, where |Xᵢ| is the absolute value of Xᵢ, θ₀ > 0, and θ₁ ≥ 0.

	 a.	 Is the OLS estimator of β₁ BLUE?
	 b.	 Suppose that θ₀ and θ₁ are known. What is the BLUE estimator of β₁?
	 c.	 Derive the exact sampling distribution of the OLS estimator, β̂₁, conditional on X₁, …, Xₙ.
	 d.	 Derive the exact sampling distribution of the WLS estimator (treating θ₀ and θ₁ as known) of β₁, conditional on X₁, …, Xₙ.
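The weighting in Exercise 17.8 can be tried out numerically. The sketch below (all parameter values are hypothetical, chosen only for illustration) simulates data with var(uᵢ | Xᵢ) = θ₀ + θ₁|Xᵢ|, then computes the OLS slope and the WLS slope obtained by dividing each observation by sᵢ = √(θ₀ + θ₁|Xᵢ|) and running least squares on the transformed data; both estimates should land near the true slope.

```python
# Sketch (hypothetical theta and beta values): WLS with known
# var(u_i | X_i) = theta0 + theta1*|X_i| versus plain OLS.
import math
import random

random.seed(1)
theta0, theta1 = 1.0, 4.0   # treated as known, as in part (b)
beta0, beta1 = 2.0, 3.0     # true coefficients (illustrative)
n = 200

x = [random.gauss(0, 1) for _ in range(n)]
u = [random.gauss(0, math.sqrt(theta0 + theta1 * abs(xi))) for xi in x]
y = [beta0 + beta1 * xi + ui for xi, ui in zip(x, u)]

# OLS slope
xbar, ybar = sum(x) / n, sum(y) / n
num = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
den = sum((xi - xbar) ** 2 for xi in x)
b1_ols = num / den

# WLS: regress y_i/s_i on (1/s_i, x_i/s_i) with s_i = sqrt(theta0 + theta1*|x_i|);
# solve the 2x2 normal equations directly.
s = [math.sqrt(theta0 + theta1 * abs(xi)) for xi in x]
z0 = [1 / si for si in s]                 # transformed intercept regressor
z1 = [xi / si for xi, si in zip(x, s)]    # transformed slope regressor
yt = [yi / si for yi, si in zip(y, s)]
a = sum(v * v for v in z0)
b = sum(v0 * v1 for v0, v1 in zip(z0, z1))
c = sum(v * v for v in z1)
d0 = sum(v0 * yi for v0, yi in zip(z0, yt))
d1 = sum(v1 * yi for v1, yi in zip(z1, yt))
b1_wls = (a * d1 - b * d0) / (a * c - b * b)

print(round(b1_ols, 2), round(b1_wls, 2))  # both should be near beta1 = 3
```

Across repeated samples the WLS estimate is the more precise of the two, which is the point of part (b): with the conditional variance known, reweighting restores homoskedasticity and hence the BLUE property.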