
108	 Chapter 2  Review of Probability

	 2.24	Suppose $Y_i$ is distributed i.i.d. $N(0, \sigma^2)$ for $i = 1, 2, \dots, n$.

	 a.	 Show that $E(Y_i^2/\sigma^2) = 1$.

	 b.	 Show that $W = (1/\sigma^2)\sum_{i=1}^{n} Y_i^2$ is distributed $\chi_n^2$.

	 c.	 Show that $E(W) = n$. [Hint: Use your answer to (a).]

	 d.	 Show that $V = Y_1 \Big/ \sqrt{\sum_{i=2}^{n} Y_i^2/(n-1)}$ is distributed $t_{n-1}$.
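A quick Monte Carlo sanity check can build intuition for parts (b) and (c) before attempting the proof. The sketch below (not part of the exercise; the values of $n$, $\sigma$, and the number of replications are arbitrary choices) draws many samples of $(Y_1, \dots, Y_n)$ and confirms that $W$ behaves like a $\chi_n^2$ random variable, whose mean is $n$ and whose variance is $2n$.

```python
# Monte Carlo sanity check for Exercise 2.24 (illustrative sketch only):
# with Y_i i.i.d. N(0, sigma^2), W = (1/sigma^2) * sum_i Y_i^2 should be
# chi-squared with n degrees of freedom, so E(W) = n and var(W) = 2n.
import numpy as np

rng = np.random.default_rng(0)
n, sigma, reps = 5, 2.0, 200_000      # arbitrary illustrative values

Y = rng.normal(0.0, sigma, size=(reps, n))   # reps draws of (Y_1, ..., Y_n)
W = (Y**2).sum(axis=1) / sigma**2            # one chi-squared(n) draw per row

print(W.mean())   # should be close to n
print(W.var())    # should be close to 2n
```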

	 2.25	 (Review of summation notation) Let $x_1, \dots, x_n$ denote a sequence of numbers, $y_1, \dots, y_n$ denote another sequence of numbers, and $a$, $b$, and $c$ denote three constants. Show that

	 a.	 $\sum_{i=1}^{n} a x_i = a \sum_{i=1}^{n} x_i$

	 b.	 $\sum_{i=1}^{n} (x_i + y_i) = \sum_{i=1}^{n} x_i + \sum_{i=1}^{n} y_i$

	 c.	 $\sum_{i=1}^{n} a = na$

	 d.	 $\sum_{i=1}^{n} (a + b x_i + c y_i)^2 = na^2 + b^2 \sum_{i=1}^{n} x_i^2 + c^2 \sum_{i=1}^{n} y_i^2 + 2ab \sum_{i=1}^{n} x_i + 2ac \sum_{i=1}^{n} y_i + 2bc \sum_{i=1}^{n} x_i y_i$
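Identity (d) follows from expanding the square and applying (a)–(c) term by term. A numeric spot-check (an illustrative sketch; the particular values of $x$, $y$, $a$, $b$, $c$ are arbitrary) can confirm the two sides agree before working through the algebra:

```python
# Numeric spot-check of the summation identity in Exercise 2.25(d).
# The data and constants below are arbitrary illustrative choices.
import numpy as np

x = np.array([1.0, -2.0, 3.5, 0.25])
y = np.array([4.0, 1.5, -1.0, 2.0])
a, b, c = 0.5, -1.5, 2.0
n = len(x)

lhs = ((a + b * x + c * y) ** 2).sum()
rhs = (n * a**2 + b**2 * (x**2).sum() + c**2 * (y**2).sum()
       + 2 * a * b * x.sum() + 2 * a * c * y.sum()
       + 2 * b * c * (x * y).sum())

print(lhs, rhs)   # the two sides agree up to floating-point rounding
```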

	 2.26	 Suppose that $Y_1, Y_2, \dots, Y_n$ are random variables with a common mean $\mu_Y$, a common variance $\sigma_Y^2$, and the same correlation $\rho$ (so that the correlation between $Y_i$ and $Y_j$ is equal to $\rho$ for all pairs $i$ and $j$, where $i \ne j$).

	 a.	 Show that $\mathrm{cov}(Y_i, Y_j) = \rho \sigma_Y^2$ for $i \ne j$.
	 b.	 Suppose that $n = 2$. Show that $E(\bar{Y}) = \mu_Y$ and $\mathrm{var}(\bar{Y}) = \frac{1}{2}\sigma_Y^2 + \frac{1}{2}\rho \sigma_Y^2$.
	 c.	 For $n \ge 2$, show that $E(\bar{Y}) = \mu_Y$ and $\mathrm{var}(\bar{Y}) = \sigma_Y^2/n + [(n-1)/n]\rho \sigma_Y^2$.
	 d.	 When $n$ is very large, show that $\mathrm{var}(\bar{Y}) \approx \rho \sigma_Y^2$.
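The variance formula in part (c) can be verified exactly, without simulation, because $\mathrm{var}(\bar{Y}) = w' \Sigma w$ where $\Sigma$ is the covariance matrix of $(Y_1, \dots, Y_n)$ and $w = (1/n, \dots, 1/n)$. The sketch below (illustrative only; the values of $n$, $\sigma_Y^2$, and $\rho$ are arbitrary) builds the equicorrelated covariance matrix from part (a) and compares the quadratic form against the formula:

```python
# Exact check of the variance formula in Exercise 2.26(c).
# n, sigma2, rho are arbitrary illustrative choices.
import numpy as np

n, sigma2, rho = 6, 2.5, 0.3

# Covariance matrix with common variance sigma2 on the diagonal and
# common covariance rho*sigma2 off the diagonal (part (a)).
Sigma = sigma2 * (rho * np.ones((n, n)) + (1 - rho) * np.eye(n))
w = np.full(n, 1.0 / n)                    # weights defining Ybar

var_ybar = w @ Sigma @ w                   # var(Ybar) = w' Sigma w
formula = sigma2 / n + ((n - 1) / n) * rho * sigma2

print(var_ybar, formula)   # identical up to floating-point rounding
```

Letting $n$ grow in this check also illustrates part (d): the $\sigma_Y^2/n$ term vanishes while the $\rho \sigma_Y^2$ term does not.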

	 2.27	 $X$ and $Z$ are two jointly distributed random variables. Suppose you know the value of $Z$, but not the value of $X$. Let $\hat{X} = E(X \mid Z)$ denote a guess of the value of $X$ using the information on $Z$, and let $W = X - \hat{X}$ denote the error associated with this guess.

	 a.	 Show that $E(W) = 0$. (Hint: Use the law of iterated expectations.)
	 b.	 Show that $E(WZ) = 0$.
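Both properties can be seen in a simulation. The sketch below assumes, purely for illustration, that $X$ and $Z$ are jointly normal (the exercise itself requires no such assumption), in which case $E(X \mid Z)$ takes the known linear form $\mu_X + [\mathrm{cov}(X,Z)/\mathrm{var}(Z)](Z - \mu_Z)$; the means and covariance matrix are arbitrary choices.

```python
# Simulation sketch for Exercise 2.27, under the illustrative assumption
# that (X, Z) is bivariate normal, so E(X | Z) is linear in Z.
import numpy as np

rng = np.random.default_rng(1)
mu = np.array([1.0, -0.5])                 # (mu_X, mu_Z), arbitrary
Sigma = np.array([[2.0, 0.8],
                  [0.8, 1.5]])             # cov matrix of (X, Z), arbitrary

X, Z = rng.multivariate_normal(mu, Sigma, size=500_000).T
Xhat = mu[0] + (Sigma[0, 1] / Sigma[1, 1]) * (Z - mu[1])   # E(X | Z)
W = X - Xhat                               # error of the guess

print(W.mean())        # part (a): sample analog of E(W), close to 0
print((W * Z).mean())  # part (b): sample analog of E(WZ), close to 0
```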