and decreases as income rises, and has an upper bound (that is, an asymptote as income
increases to infinity). The negative exponential growth regression model is
$Y_i = \beta_0\bigl[1 - e^{-\beta_1(X_i - \beta_2)}\bigr] + u_i$.  (8.39)
The negative exponential growth function is graphed in Figure 8.12b. The slope is steep for
low values of X, but as X increases, it approaches its asymptote of $\beta_0$.
General functions that are nonlinear in the parameters. The logistic and negative exponen-
tial growth regression models are special cases of the general nonlinear regression model
$Y_i = f(X_{1i}, \ldots, X_{ki}; \beta_0, \ldots, \beta_m) + u_i$,  (8.40)
in which there are k independent variables and m + 1 parameters, $\beta_0, \ldots, \beta_m$. In the mod-
els of Sections 8.2 and 8.3, the X’s entered this function nonlinearly, but the parameters
entered linearly. In the examples of this appendix, the parameters enter nonlinearly as well.
If the parameters are known, then predicted effects can be computed using the method
described in Section 8.1. In applications, however, the parameters are unknown and must
be estimated from the data. Parameters that enter nonlinearly cannot be estimated by
OLS, but they can be estimated by nonlinear least squares.
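For concreteness, the short Python sketch below evaluates the negative exponential growth function in Equation (8.39) at hypothetical parameter values and computes a predicted effect by the method of Section 8.1, that is, as the difference between two predicted values. The parameter values and the change in X are invented for illustration only.

```python
import numpy as np

# Hypothetical parameter values for the negative exponential growth model
# Y = beta0 * [1 - exp(-beta1 * (X - beta2))]; chosen for illustration only.
beta0, beta1, beta2 = 700.0, 0.05, -20.0

def neg_exp_growth(x, b0, b1, b2):
    """Negative exponential growth regression function, Equation (8.39)."""
    return b0 * (1.0 - np.exp(-b1 * (x - b2)))

# Predicted effect of increasing X from 10 to 11 (Section 8.1 method:
# the difference between the two predicted values).
effect = neg_exp_growth(11.0, beta0, beta1, beta2) - neg_exp_growth(10.0, beta0, beta1, beta2)
print("Predicted effect of X: 10 -> 11:", effect)

# As X grows large, the predicted value approaches the asymptote beta0.
print("Predicted value at X = 1000:", neg_exp_growth(1000.0, beta0, beta1, beta2))
```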
Nonlinear Least Squares Estimation
Nonlinear least squares is a general method for estimating the unknown parameters of
a regression function when those parameters enter the population regression function
nonlinearly.
Recall the discussion in Section 5.3 of the OLS estimator of the coefficients of the
linear multiple regression model. The OLS estimator minimizes the sum of squared predic-
tion mistakes in Equation (5.8), $\sum_{i=1}^{n}\bigl[Y_i - (b_0 + b_1 X_{1i} + \cdots + b_k X_{ki})\bigr]^2$. In principle, the
OLS estimator can be computed by checking many trial values of $b_0, \ldots, b_k$ and settling
on the values that minimize the sum of squared mistakes.
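A minimal sketch of this idea for a single-regressor model is shown below; the data are simulated, and in practice the OLS estimates are of course obtained from the closed-form formulas rather than by searching over a grid of trial values.

```python
import numpy as np

# Simulated data from a linear model (coefficients chosen for illustration).
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 10.0, size=200)
y = 2.0 + 0.5 * x + rng.normal(scale=1.0, size=200)

# "Check many trial values": evaluate the sum of squared prediction mistakes
# over a grid of candidate (b0, b1) pairs and keep the minimizing pair.
best_ssr, best_b0, best_b1 = np.inf, None, None
for b0 in np.linspace(0.0, 4.0, 201):
    for b1 in np.linspace(-1.0, 2.0, 301):
        ssr = np.sum((y - (b0 + b1 * x)) ** 2)
        if ssr < best_ssr:
            best_ssr, best_b0, best_b1 = ssr, b0, b1

print("Grid-search estimates (b0, b1):", best_b0, best_b1)
```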
This same approach can be used to estimate the parameters of the general nonlinear
regression model in Equation (8.40). Because the regression function is nonlinear in the
coefficients, this method is called nonlinear least squares. For a set of trial parameter
values $b_0, b_1, \ldots, b_m$, construct the sum of squared prediction mistakes:
$\sum_{i=1}^{n}\bigl[Y_i - f(X_{1i}, \ldots, X_{ki}; b_0, b_1, \ldots, b_m)\bigr]^2$.  (8.41)
The nonlinear least squares estimators of $\beta_0, \beta_1, \ldots, \beta_m$ are the values of $b_0, b_1, \ldots, b_m$ that
minimize the sum of squared prediction mistakes in Equation (8.41).
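As a sketch of how this minimization is carried out in practice, the Python code below fits the negative exponential growth model of Equation (8.39) to simulated data by nonlinear least squares, using scipy.optimize.curve_fit to search numerically for the minimizing parameter values. The data-generating parameters and the starting values are assumptions made only for this example; unlike OLS, nonlinear least squares requires such starting values and proceeds iteratively.

```python
import numpy as np
from scipy.optimize import curve_fit

def neg_exp_growth(x, b0, b1, b2):
    """Negative exponential growth regression function, Equation (8.39)."""
    return b0 * (1.0 - np.exp(-b1 * (x - b2)))

# Simulated data; the parameter values are chosen only for illustration.
rng = np.random.default_rng(42)
true_b0, true_b1, true_b2 = 700.0, 0.05, -20.0
x = rng.uniform(5.0, 60.0, size=300)
y = neg_exp_growth(x, true_b0, true_b1, true_b2) + rng.normal(scale=25.0, size=300)

# Nonlinear least squares: starting from the initial guesses in p0,
# curve_fit iteratively adjusts (b0, b1, b2) to minimize the sum of
# squared prediction mistakes in Equation (8.41).
start = (600.0, 0.1, 0.0)
estimates, _ = curve_fit(neg_exp_growth, x, y, p0=start)
print("Nonlinear least squares estimates (beta0, beta1, beta2):", estimates)
```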

