
Chapter 6  Linear Regression with Multiple Regressors

	 6.10	 $(Y_i, X_{1i}, X_{2i})$ satisfy the assumptions in Key Concept 6.4; in addition, $\mathrm{var}(u_i \mid X_{1i}, X_{2i}) = 4$ and $\mathrm{var}(X_{1i}) = 6$. A random sample of size $n = 400$ is drawn from the population.

	 a.	 Assume that $X_1$ and $X_2$ are uncorrelated. Compute the variance of $\hat{\beta}_1$. [Hint: Look at Equation (6.17) in Appendix 6.2.]

	 b.	 Assume that $\mathrm{corr}(X_1, X_2) = 0.5$. Compute the variance of $\hat{\beta}_1$.

	 c.	 Comment on the following statements: "When $X_1$ and $X_2$ are correlated, the variance of $\hat{\beta}_1$ is larger than it would be if $X_1$ and $X_2$ were uncorrelated. Thus, if you are interested in $\beta_1$, it is best to leave $X_2$ out of the regression if it is correlated with $X_1$."
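Parts (a) and (b) can be checked with a short computation. The sketch below assumes that Equation (6.17) takes the standard homoskedastic large-sample form $\mathrm{var}(\hat{\beta}_1) = \frac{1}{n}\left(\frac{1}{1-\rho^2_{X_1,X_2}}\right)\frac{\sigma^2_u}{\sigma^2_{X_1}}$; the function name `var_beta1` is introduced here for illustration only.

```python
# Large-sample variance of the OLS slope on X1 under homoskedasticity,
# assuming Equation (6.17) has the form
#   var(b1_hat) = (1/n) * (1 / (1 - rho^2)) * sigma_u^2 / sigma_X1^2
# The numbers plugged in come from Exercise 6.10: n = 400, var(u) = 4, var(X1) = 6.

def var_beta1(n, sigma2_u, sigma2_x1, rho):
    """Variance of the OLS slope on X1 when corr(X1, X2) = rho."""
    return (1.0 / n) * (1.0 / (1.0 - rho**2)) * sigma2_u / sigma2_x1

# Part (a): X1 and X2 uncorrelated (rho = 0)
va = var_beta1(400, 4, 6, 0.0)   # = 1/600, about 0.00167

# Part (b): corr(X1, X2) = 0.5
vb = var_beta1(400, 4, 6, 0.5)   # = 1/450, about 0.00222

print(va, vb)
```

Note that the correlation enters only through the factor $1/(1-\rho^2)$, so the variance in part (b) is $4/3$ times the variance in part (a), consistent with the claim discussed in part (c).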

	 6.11	 (Requires calculus) Consider the regression model

		$Y_i = \beta_1 X_{1i} + \beta_2 X_{2i} + u_i$

		for $i = 1, \dots, n$. (Notice that there is no constant term in the regression.) Following analysis like that used in Appendix 4.2:

	 a.	 Specify the least squares function that is minimized by OLS.

	 b.	 Compute the partial derivatives of the objective function with respect to $b_1$ and $b_2$.

	 c.	 Suppose that $\sum_{i=1}^{n} X_{1i}X_{2i} = 0$. Show that $\hat{\beta}_1 = \sum_{i=1}^{n} X_{1i}Y_i \big/ \sum_{i=1}^{n} X_{1i}^2$.

	 d.	 Suppose that $\sum_{i=1}^{n} X_{1i}X_{2i} \neq 0$. Derive an expression for $\hat{\beta}_1$ as a function of the data $(Y_i, X_{1i}, X_{2i})$, $i = 1, \dots, n$.

	 e.	 Suppose that the model includes an intercept: $Y_i = \beta_0 + \beta_1 X_{1i} + \beta_2 X_{2i} + u_i$. Show that the least squares estimators satisfy $\hat{\beta}_0 = \bar{Y} - \hat{\beta}_1 \bar{X}_1 - \hat{\beta}_2 \bar{X}_2$.

	 f.	 As in (e), suppose that the model contains an intercept. Also suppose that $\sum_{i=1}^{n} (X_{1i} - \bar{X}_1)(X_{2i} - \bar{X}_2) = 0$. Show that $\hat{\beta}_1 = \sum_{i=1}^{n} (X_{1i} - \bar{X}_1)(Y_i - \bar{Y}) \big/ \sum_{i=1}^{n} (X_{1i} - \bar{X}_1)^2$. How does this compare to the OLS estimator of $\beta_1$ from the regression that omits $X_2$?
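The algebra in part (c) can be verified numerically. The sketch below uses NumPy with made-up illustrative data: it constructs an $X_2$ that is exactly orthogonal to $X_1$ (so $\sum_i X_{1i}X_{2i} = 0$), fits the no-intercept regression by least squares, and checks that the slope on $X_1$ collapses to the single-regressor formula $\sum_i X_{1i}Y_i / \sum_i X_{1i}^2$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Illustrative data: make x2 the residual from projecting a draw z on x1,
# so that x1 @ x2 = 0, matching the assumption in part (c).
x1 = rng.normal(size=n)
z = rng.normal(size=n)
x2 = z - (x1 @ z / (x1 @ x1)) * x1
y = 2.0 * x1 - 1.0 * x2 + rng.normal(size=n)

# OLS with no constant term: minimize sum_i (y_i - b1*x1_i - b2*x2_i)^2.
X = np.column_stack([x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

# With x1 orthogonal to x2, the multiple-regression slope on x1 equals
# the single-regressor formula sum(x1*y) / sum(x1^2).
b1_simple = (x1 @ y) / (x1 @ x1)
print(b[0], b1_simple)  # the two values should agree
```

The same experiment, run with demeaned regressors, illustrates part (f): when the demeaned cross-product is zero, including or omitting $X_2$ leaves the OLS slope on $X_1$ unchanged.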

   Empirical Exercises

                         (Only two empirical exercises for this chapter are given in the text, but you can
                         find more on the text website, www.pearsonglobaleditions.com/Stock_Watson.)

	 E6.1	 Use the Birthweight_Smoking data set introduced in Empirical Exercise E5.3 to answer the following questions.