8.1 A General Strategy for Modeling Nonlinear Regression Functions
Figure 8.3 Scatterplot of Test Score vs. District Income with Linear and Quadratic Regression Functions. The quadratic OLS regression function fits the data better than the linear OLS regression function. [Figure: test score (600 to 740) plotted against district income in thousands of dollars (0 to 60), with both the linear and quadratic fitted regression lines shown.]
hypothesis. This t-statistic is t = (β̂2 − 0)/SE(β̂2), which from Equation (8.2) is t = −0.0423/0.0048 = −8.81. In absolute value, this exceeds the 5% critical value of this test (which is 1.96). Indeed, the p-value for the t-statistic is less than 0.01%, so we can reject the hypothesis that β2 = 0 at all conventional significance levels. Thus this formal hypothesis test supports our informal inspection of Figures 8.2 and 8.3: The quadratic model fits the data better than the linear model.
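The arithmetic of this test can be sketched in a few lines of Python, using the estimate and standard error reported above (the 1.96 critical value is the usual two-sided 5% threshold from the standard normal distribution):

```python
# t-test for the quadratic coefficient, using the estimates quoted in the text
beta2_hat = -0.0423   # estimated coefficient on the squared income term
se_beta2 = 0.0048     # its standard error

# t-statistic for H0: beta2 = 0
t_stat = (beta2_hat - 0) / se_beta2
print(round(t_stat, 2))           # -8.81

# |t| exceeds the 5% two-sided critical value of 1.96, so H0 is rejected
print(abs(t_stat) > 1.96)         # True
```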
The Effect on Y of a Change in X
in Nonlinear Specifications
Put aside the test score example for a moment and consider a general problem. You want to know how the dependent variable Y is expected to change when the independent variable X1 changes by the amount ∆X1, holding constant other independent variables X2, …, Xk. When the population regression function is linear, this effect is easy to calculate: As shown in Equation (6.4), the expected change in Y is ∆Y = β1∆X1, where β1 is the population regression coefficient multiplying X1. When the regression function is nonlinear, however, the expected change in Y is more complicated to calculate because it can depend on the values of the independent variables.
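A small sketch makes this dependence concrete. Using the quadratic coefficient reported above (−0.0423) together with illustrative values for the intercept and linear term (the full estimates are not reproduced in this passage), the same $1,000 change in income produces different predicted changes in test score at different starting incomes:

```python
# Predicted change in Y from a one-unit change in X in a quadratic model.
# Only the quadratic coefficient (-0.0423) is taken from the text; the
# intercept and linear coefficient below are illustrative placeholders.
def predicted_score(income, b0=607.3, b1=3.85, b2=-0.0423):
    """Quadratic regression function: b0 + b1*Income + b2*Income^2."""
    return b0 + b1 * income + b2 * income ** 2

# The same Delta X (= 1, i.e., $1,000) yields different Delta Y values
# depending on the starting level of income:
for income in (10, 40):  # district income, thousands of dollars
    delta_y = predicted_score(income + 1) - predicted_score(income)
    print(f"income {income} -> {income + 1}: change in Y = {delta_y:.2f}")
```

Analytically, the change is b1 + b2*(2*income + 1), so it shrinks as income rises, which is exactly why the effect in a nonlinear specification cannot be summarized by a single coefficient.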

