All of the models we have discussed thus far have been linear in the parameters (i.e., linear in the $\beta$'s). For example, polynomial regression was used to model curvature in our data by using higher-order terms of the predictors. However, the final regression model was still a linear combination of those higher-order terms.
Now we are interested in studying the nonlinear regression model:

$$Y = f(\mathbf{X}, \boldsymbol{\beta}) + \epsilon,$$

where $\mathbf{X}$ is a vector of $p$ predictors, $\boldsymbol{\beta}$ is a vector of $k$ parameters, $f(\cdot)$ is a known regression function, and $\epsilon$ is a random error term. Notice that the dimension of the parameter vector is no longer necessarily one greater than the number of predictors.
However, there are some nonlinear models which are actually called intrinsically linear because they can be made linear in the parameters by a simple transformation. For example:

$$Y = \frac{\beta_0 X}{\beta_1 + X}$$

can be rewritten as

$$\frac{1}{Y} = \frac{1}{\beta_0} + \frac{\beta_1}{\beta_0} \cdot \frac{1}{X} = \theta_0 + \theta_1 \cdot \frac{1}{X},$$

which is linear in the transformed parameters $\theta_0 = 1/\beta_0$ and $\theta_1 = \beta_1/\beta_0$. Keep in mind, however, that such a transformation also changes the error structure of the model, which affects inference on the original scale.
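To make this concrete, the following minimal sketch (the data, parameter values, and noise level are invented for illustration) fits the transformed model by ordinary linear least squares and back-transforms the estimates to recover $\beta_0$ and $\beta_1$:

```python
# Minimal sketch: fit the intrinsically linear model Y = b0*X / (b1 + X)
# by ordinary least squares on the transformed variables 1/X and 1/Y.
# The "true" parameter values and noise level below are invented.
import numpy as np

rng = np.random.default_rng(0)
b0_true, b1_true = 2.0, 0.5
x = rng.uniform(0.2, 5.0, size=50)
# Multiplicative noise keeps y positive so that 1/y is well defined.
y = b0_true * x / (b1_true + x) * np.exp(rng.normal(0.0, 0.05, size=50))

# Simple linear regression of 1/y on 1/x: slope = theta1, intercept = theta0.
theta1, theta0 = np.polyfit(1 / x, 1 / y, deg=1)

# Back-transform: theta0 = 1/b0 and theta1 = b1/b0.
b0_hat = 1 / theta0
b1_hat = theta1 * b0_hat
print(b0_hat, b1_hat)  # should be close to 2.0 and 0.5
```

Note that the fit here minimizes squared error on the reciprocal scale rather than the original scale, which is one consequence of the transformed error structure mentioned above.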
Nonlinear Least Squares
Returning to cases in which it is not possible to transform the model to a linear form, consider the setting

$$Y_i = f(\mathbf{X}_i, \boldsymbol{\beta}) + \epsilon_i,$$

where the $\epsilon_i$ are independent, identically distributed normal errors with mean 0 and constant variance $\sigma^2$.
First, let

$$Q = \sum_{i=1}^{n} \left(y_i - f(\mathbf{X}_i, \boldsymbol{\beta})\right)^2.$$
In order to find

$$\hat{\boldsymbol{\beta}} = \arg\min_{\boldsymbol{\beta}} Q,$$

we first find each of the partial derivatives of $Q$ with respect to the $\beta_j$ and then set each of them equal to 0. Unlike in linear regression, the resulting normal equations are themselves nonlinear in the parameters and generally cannot be solved in closed form, so the estimates must be computed by iterative numerical algorithms.
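As a concrete (hypothetical) illustration, the sketch below writes out $Q$ for an assumed exponential-decay regression function $f(x, \boldsymbol{\beta}) = \beta_0 e^{-\beta_1 x}$ and minimizes it numerically with SciPy; the data and starting values are simulated for the example:

```python
# Minimal sketch: nonlinear least squares as numerical minimization of Q.
# The regression function, data, and starting values are all assumptions.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
x = np.linspace(0.0, 4.0, 40)
beta_true = np.array([3.0, 1.2])
y = beta_true[0] * np.exp(-beta_true[1] * x) + rng.normal(0.0, 0.1, size=x.size)

def f(x, beta):
    """Assumed regression function f(x, beta) = beta0 * exp(-beta1 * x)."""
    return beta[0] * np.exp(-beta[1] * x)

def Q(beta):
    """Sum of squared residuals: Q = sum_i (y_i - f(x_i, beta))^2."""
    return np.sum((y - f(x, beta)) ** 2)

# Iterative minimization from a starting value; as with all nonlinear
# least squares problems, good starting values matter.
fit = minimize(Q, x0=np.array([1.0, 1.0]))
print(fit.x)  # approximately [3.0, 1.2]
```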
Algorithms for nonlinear least squares estimation include the following (a short numerical sketch follows the list):
- Newton's method: a classical method based on a gradient approach, but one that can be computationally challenging and is heavily dependent on good starting values.
- Gauss-Newton algorithm: a modification of Newton's method that gives a good approximation of the solution that Newton's method should have arrived at, but which is not guaranteed to converge.
- Levenberg-Marquardt method: can take care of the computational difficulties that arise with the other methods, but can require a tedious search for the optimal value of a tuning parameter.
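The sketch below, reusing the same assumed exponential-decay model as above, implements a bare-bones Gauss-Newton iteration and compares it with SciPy's Levenberg-Marquardt solver; everything about the example (model, data, starting values) is invented for illustration:

```python
# Minimal sketch: Gauss-Newton iteration vs. SciPy's Levenberg-Marquardt
# on the assumed model f(x, beta) = beta0 * exp(-beta1 * x).
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(2)
x = np.linspace(0.0, 4.0, 40)
y = 3.0 * np.exp(-1.2 * x) + rng.normal(0.0, 0.1, size=x.size)

def resid(beta):
    """Residual vector r_i(beta) = y_i - f(x_i, beta)."""
    return y - beta[0] * np.exp(-beta[1] * x)

def jac(beta):
    """Jacobian of the residuals with respect to (beta0, beta1)."""
    e = np.exp(-beta[1] * x)
    return np.column_stack([-e, beta[0] * x * e])

# Gauss-Newton: repeatedly solve the linearized normal equations
# (J'J) delta = -J'r and update beta by the step delta.
beta = np.array([1.0, 1.0])  # starting values matter here
for _ in range(20):
    J, r = jac(beta), resid(beta)
    delta = np.linalg.solve(J.T @ J, -J.T @ r)
    beta = beta + delta
print("Gauss-Newton:", beta)

# Levenberg-Marquardt (method='lm' calls MINPACK) damps the Gauss-Newton
# step, trading the choice of a damping parameter for added robustness.
fit = least_squares(resid, x0=[1.0, 1.0], jac=jac, method="lm")
print("Levenberg-Marquardt:", fit.x)
```

In practice one rarely hand-codes these updates; library routines such as scipy.optimize.least_squares handle step control and convergence checks.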