Algorithms using Derivatives

The root finding algorithms described in this section make use of both the function and its derivative. They require an initial guess for the location of the root, but there is no absolute guarantee of convergence -- the function must be suitable for this technique and the initial guess must be sufficiently close to the root for it to work. When these conditions are satisfied, convergence is quadratic.
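For concreteness, a minimal driver sketch using the gsl_multiroot_fdfsolver interface is shown below. The system being solved (the unit circle intersected with the line @math{x = y}), the starting point and the function names my_f, my_df and my_fdf are made-up illustrations, not taken from this manual:

  #include <stdio.h>
  #include <gsl/gsl_errno.h>
  #include <gsl/gsl_vector.h>
  #include <gsl/gsl_matrix.h>
  #include <gsl/gsl_multiroots.h>

  /* f_1 = x^2 + y^2 - 1, f_2 = x - y; a root is (1/sqrt(2), 1/sqrt(2)). */
  static int my_f (const gsl_vector *x, void *params, gsl_vector *f)
  {
    double x0 = gsl_vector_get (x, 0), x1 = gsl_vector_get (x, 1);
    gsl_vector_set (f, 0, x0 * x0 + x1 * x1 - 1.0);
    gsl_vector_set (f, 1, x0 - x1);
    return GSL_SUCCESS;
  }

  static int my_df (const gsl_vector *x, void *params, gsl_matrix *J)
  {
    double x0 = gsl_vector_get (x, 0), x1 = gsl_vector_get (x, 1);
    gsl_matrix_set (J, 0, 0, 2.0 * x0);  gsl_matrix_set (J, 0, 1, 2.0 * x1);
    gsl_matrix_set (J, 1, 0, 1.0);       gsl_matrix_set (J, 1, 1, -1.0);
    return GSL_SUCCESS;
  }

  static int my_fdf (const gsl_vector *x, void *params,
                     gsl_vector *f, gsl_matrix *J)
  {
    my_f (x, params, f);
    my_df (x, params, J);
    return GSL_SUCCESS;
  }

  int main (void)
  {
    const size_t n = 2;
    gsl_multiroot_function_fdf fdf = { &my_f, &my_df, &my_fdf, n, NULL };
    gsl_vector *x = gsl_vector_alloc (n);
    gsl_multiroot_fdfsolver *s =
      gsl_multiroot_fdfsolver_alloc (gsl_multiroot_fdfsolver_hybridsj, n);
    int status;
    size_t iter = 0;

    gsl_vector_set (x, 0, 0.5);   /* initial guess */
    gsl_vector_set (x, 1, 0.1);
    gsl_multiroot_fdfsolver_set (s, &fdf, x);

    do
      {
        iter++;
        status = gsl_multiroot_fdfsolver_iterate (s);
        if (status)               /* the solver failed or is stuck */
          break;
        status = gsl_multiroot_test_residual (s->f, 1e-7);
      }
    while (status == GSL_CONTINUE && iter < 100);

    printf ("status = %s, root = (%g, %g)\n", gsl_strerror (status),
            gsl_vector_get (s->x, 0), gsl_vector_get (s->x, 1));

    gsl_multiroot_fdfsolver_free (s);
    gsl_vector_free (x);
    return 0;
  }

The same driver works for any of the derivative solvers in this section; only the solver type passed to gsl_multiroot_fdfsolver_alloc changes.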

Derivative Solver: gsl_multiroot_fdfsolver_hybridsj
This is a modified version of Powell's Hybrid method as implemented in the HYBRJ algorithm in MINPACK. MINPACK was written by Jorge J. Moré, Burton S. Garbow and Kenneth E. Hillstrom. The Hybrid algorithm retains the fast convergence of Newton's method but will also reduce the residual when Newton's method is unreliable.

The algorithm uses a generalized trust region to keep each step under control. In order to be accepted a proposed new position @math{x'} must satisfy the condition @math{|D (x' - x)| < \delta}, where @math{D} is a diagonal scaling matrix and @math{\delta} is the size of the trust region. The components of @math{D} are computed internally, using the column norms of the Jacobian to estimate the sensitivity of the residual to each component of @math{x}. This improves the behavior of the algorithm for badly scaled functions.
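As a rough illustration of the scaling (the helper below is hypothetical, not part of the library; the solver maintains these norms internally), the diagonal of @math{D} can be built from the column norms of the Jacobian:

  #include <gsl/gsl_matrix.h>
  #include <gsl/gsl_blas.h>

  /* Illustrative only: column j of the Jacobian measures how strongly
     the residual responds to x_j, so its norm is a natural scale for
     that component. */
  static void scale_from_column_norms (const gsl_matrix *J, gsl_vector *diag)
  {
    size_t j;
    for (j = 0; j < J->size2; j++)
      {
        gsl_vector_const_view col = gsl_matrix_const_column (J, j);
        gsl_vector_set (diag, j, gsl_blas_dnrm2 (&col.vector));
      }
  }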

On each iteration the algorithm first determines the standard Newton step by solving the system @math{J dx = - f}. If this step falls inside the trust region it is used as a trial step in the next stage. If not, the algorithm uses the linear combination of the Newton and gradient directions which is predicted to minimize the norm of the function while staying inside the trust region. This combination of Newton and gradient directions is referred to as a dogleg step.
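To make the dogleg construction concrete, here is an illustrative unscaled dogleg step for a @math{2 \times 2} system. The helpers dogleg2 and norm2 are hypothetical sketches, not the solver's internal code; the scaled variant used by hybridsj measures lengths with @math{|D p|} instead of @math{|p|}:

  #include <math.h>
  #include <stdio.h>

  static double norm2 (const double p[2])
  {
    return sqrt (p[0]*p[0] + p[1]*p[1]);
  }

  /* Compute a dogleg step p for the 2x2 Jacobian J (row-major),
     residual f and trust-region radius delta. */
  static void dogleg2 (const double J[4], const double f[2],
                       double delta, double p[2])
  {
    /* Newton step: solve J p_n = -f by Cramer's rule. */
    double det = J[0]*J[3] - J[1]*J[2];
    double pn[2] = { (-f[0]*J[3] + f[1]*J[1]) / det,
                     ( f[0]*J[2] - f[1]*J[0]) / det };

    if (norm2 (pn) <= delta)          /* Newton step fits: use it */
      { p[0] = pn[0]; p[1] = pn[1]; return; }

    /* Gradient direction g = -J^T f, and the Cauchy point
       p_c = (|g|^2/|J g|^2) g, the minimizer of |f + J p| along g. */
    double g[2]  = { -(J[0]*f[0] + J[2]*f[1]), -(J[1]*f[0] + J[3]*f[1]) };
    double Jg[2] = { J[0]*g[0] + J[1]*g[1], J[2]*g[0] + J[3]*g[1] };
    double alpha = (g[0]*g[0] + g[1]*g[1]) / (Jg[0]*Jg[0] + Jg[1]*Jg[1]);
    double pc[2] = { alpha*g[0], alpha*g[1] };

    if (norm2 (pc) >= delta)          /* even the Cauchy point is outside */
      {
        double s = delta / norm2 (g); /* truncated gradient step */
        p[0] = s*g[0]; p[1] = s*g[1];
        return;
      }

    /* Walk from p_c towards p_n until the trust-region boundary:
       solve |p_c + t (p_n - p_c)| = delta for t in (0,1]. */
    double d[2] = { pn[0] - pc[0], pn[1] - pc[1] };
    double a = d[0]*d[0] + d[1]*d[1];
    double b = 2.0 * (pc[0]*d[0] + pc[1]*d[1]);
    double c = pc[0]*pc[0] + pc[1]*pc[1] - delta*delta;
    double t = (-b + sqrt (b*b - 4.0*a*c)) / (2.0*a);
    p[0] = pc[0] + t*d[0];
    p[1] = pc[1] + t*d[1];
  }

  int main (void)
  {
    double J[4] = { 4.0, 1.0, 2.0, 3.0 };  /* example values only */
    double f[2] = { 1.0, -2.0 };
    double p[2];
    dogleg2 (J, f, 0.1, p);
    printf ("dogleg step = (%g, %g)\n", p[0], p[1]);
    return 0;
  }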

The proposed step is now tested by evaluating the function at the resulting point, @math{x'}. If the step reduces the norm of the function sufficiently then it is accepted and the size of the trust region is increased. If the proposed step fails to improve the solution then the size of the trust region is decreased and another trial step is computed.
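Schematically the accept/reject logic behaves like the following sketch. The function name and the growth/shrink factors are made up for illustration; the actual HYBRJ rule compares the achieved reduction of @math{|f|} against the reduction predicted by the linear model:

  /* Return the new trust-region radius and set *accept accordingly.
     Illustrative constants only, not MINPACK's update rule. */
  static double update_radius (double delta, double fnorm,
                               double fnorm_trial, int *accept)
  {
    if (fnorm_trial < fnorm)    /* the trial step reduced |f| ... */
      {
        *accept = 1;
        return 2.0 * delta;     /* ... so accept it and grow the region */
      }
    *accept = 0;
    return 0.5 * delta;         /* reject: shrink and compute a new step */
  }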

The speed of the algorithm is increased by computing the changes to the Jacobian approximately, using a rank-1 update. If two successive attempts fail to reduce the residual then the full Jacobian is recomputed. The algorithm also monitors the progress of the solution and returns an error if several steps fail to make any improvement,

GSL_ENOPROG
the iteration is not making any progress, preventing the algorithm from continuing.
GSL_ENOPROGJ
re-evaluations of the Jacobian indicate that the iteration is not making any progress, preventing the algorithm from continuing.
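Both codes are returned by gsl_multiroot_fdfsolver_iterate, so a driver loop such as the one sketched earlier can detect a stalled solver and react, for example:

  status = gsl_multiroot_fdfsolver_iterate (s);
  if (status == GSL_ENOPROG || status == GSL_ENOPROGJ)
    {
      /* the solver is stuck: restart from a different initial guess,
         or fall back to another solver type */
      fprintf (stderr, "no progress: %s\n", gsl_strerror (status));
    }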

Derivative Solver: gsl_multiroot_fdfsolver_hybridj
This algorithm is an unscaled version of hybridsj. The steps are controlled by a spherical trust region @math{|x' - x| < \delta}, instead of a generalized region. This can be useful if the generalized region estimated by hybridsj is inappropriate.
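Switching between the scaled and unscaled variants is only a matter of choosing a different solver type at allocation time; the rest of the driver loop is unchanged:

  const gsl_multiroot_fdfsolver_type *T = gsl_multiroot_fdfsolver_hybridj;
  gsl_multiroot_fdfsolver *s = gsl_multiroot_fdfsolver_alloc (T, n);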

Derivative Solver: gsl_multiroot_fdfsolver_newton

Newton's Method is the standard root-polishing algorithm. The algorithm begins with an initial guess for the location of the solution. On each iteration a linear approximation to the function @math{F} is used to estimate the step which will zero all the components of the residual. The iteration is defined by the following sequence,

@math{x \to x' = x - J^{-1} f(x)}

where the Jacobian matrix @math{J} is computed from the derivative functions provided by f. The step @math{dx} is obtained by solving the linear system,

@math{J dx = - f(x)}

using LU decomposition.
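The same step can be formed directly with the library's linear algebra routines. A self-contained sketch, with made-up numerical values standing in for @math{J} and @math{f(x)}:

  #include <stdio.h>
  #include <gsl/gsl_linalg.h>

  int main (void)
  {
    double J_data[] = { 4.0, 1.0,
                        2.0, 3.0 };    /* Jacobian at the current point */
    double f_data[] = { 1.0, -2.0 };   /* residual f(x) */

    gsl_matrix_view J = gsl_matrix_view_array (J_data, 2, 2);
    gsl_vector_view f = gsl_vector_view_array (f_data, 2);
    gsl_vector *dx = gsl_vector_alloc (2);
    gsl_permutation *p = gsl_permutation_alloc (2);
    int signum;

    gsl_vector_scale (&f.vector, -1.0);            /* right-hand side -f */
    gsl_linalg_LU_decomp (&J.matrix, p, &signum);  /* J becomes its LU form */
    gsl_linalg_LU_solve (&J.matrix, p, &f.vector, dx);

    printf ("dx = (%g, %g)\n",
            gsl_vector_get (dx, 0), gsl_vector_get (dx, 1));

    gsl_permutation_free (p);
    gsl_vector_free (dx);
    return 0;
  }

The new point is then @math{x' = x + dx}.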

Derivative Solver: gsl_multiroot_fdfsolver_gnewton
This is a modified version of Newton's method which attempts to improve global convergence by requiring every step to reduce the Euclidean norm of the residual, @math{|f(x)|}. If the Newton step leads to an increase in the norm then a reduced step of relative size,

@math{t = (\sqrt{1 + 6 r} - 1)/(3 r)}

is proposed, with @math{r} being the ratio of norms @math{|f(x')|^2/|f(x)|^2}. This procedure is repeated until a suitable step size is found.
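The reduction factor is cheap to evaluate; a direct transcription of the formula above (the helper name and parameters are hypothetical):

  #include <math.h>

  /* t = (sqrt(1 + 6 r) - 1) / (3 r) with r = |f(x')|^2 / |f(x)|^2.
     Note t -> 1 as r -> 0, so steps which already shrink the residual
     are left essentially untouched. */
  static double gnewton_factor (double fnorm, double fnorm_trial)
  {
    double r = (fnorm_trial * fnorm_trial) / (fnorm * fnorm);
    return (sqrt (1.0 + 6.0 * r) - 1.0) / (3.0 * r);
  }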

