fmin_ncg(f, x0, fprime, fhess_p=None, fhess=None, args=(), avextol=1e-05, epsilon=1.4901161193847656e-08, maxiter=None, full_output=0, disp=1, retall=0, callback=None)
Only one of fhess_p or fhess need be given. If fhess is provided, then fhess_p is ignored. If neither fhess nor fhess_p is provided, then the Hessian product is approximated using finite differences on fprime. fhess_p must compute the Hessian times an arbitrary vector; if it is not given, finite differences on fprime are used to compute it.
Newton-CG methods are also called truncated Newton methods. This function differs from scipy.optimize.fmin_tnc because scipy.optimize.fmin_ncg is written purely in Python using NumPy and SciPy, while scipy.optimize.fmin_tnc calls a C function. scipy.optimize.fmin_ncg handles only unconstrained minimization, while scipy.optimize.fmin_tnc handles unconstrained or box-constrained minimization. (Box constraints give lower and upper bounds for each variable separately.)
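To make the contrast concrete, here is a small box-constrained problem solved with fmin_tnc. The objective and bounds are invented for this sketch, assuming the standard fmin_tnc return value of (x, nfeval, rc):

```python
import numpy as np
from scipy.optimize import fmin_tnc

# Illustrative: minimize (x0 - 2)**2 + (x1 - 3)**2 subject to 0 <= x_i <= 1.
def f(x):
    return (x[0] - 2)**2 + (x[1] - 3)**2

def fprime(x):
    return np.array([2 * (x[0] - 2), 2 * (x[1] - 3)])

x, nfeval, rc = fmin_tnc(f, np.array([0.5, 0.5]), fprime=fprime,
                         bounds=[(0, 1), (0, 1)], disp=0)
# The unconstrained minimum (2, 3) lies outside the box,
# so the constrained solution sits on the boundary at (1, 1).
```

fmin_ncg has no bounds argument; a problem like this would have to be reformulated (or a different solver used) to respect the box.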
f : callable f(x, *args)
    Objective function to be minimized.
x0 : ndarray
    Initial guess.
fprime : callable f'(x, *args)
    Gradient of f.
fhess_p : callable fhess_p(x, p, *args), optional
    Function which computes the Hessian of f times an arbitrary vector, p.
fhess : callable fhess(x, *args), optional
    Function to compute the Hessian matrix of f.
args : tuple, optional
    Extra arguments passed to f, fprime, fhess_p, and fhess (the same set of extra arguments is supplied to all of these functions).
epsilon : float or ndarray, optional
    If fhess is approximated, use this value for the step size.
callback : callable, optional
    An optional user-supplied function which is called after each iteration. Called as callback(xk), where xk is the current parameter vector.
avextol : float, optional
    Convergence is assumed when the average relative error in the minimizer falls below this amount.
maxiter : int, optional
    Maximum number of iterations to perform.
full_output : bool, optional
    If True, return the optional outputs.
disp : bool, optional
    If True, print convergence message.
retall : bool, optional
    If True, return a list of results at each iteration.
xopt : ndarray
    Parameters which minimize f, i.e., f(xopt) == fopt.
fopt : float
    Value of the function at xopt, i.e., fopt = f(xopt).
fcalls : int
    Number of function calls made.
gcalls : int
    Number of gradient calls made.
hcalls : int
    Number of Hessian calls made.
warnflag : int
    Warnings generated by the algorithm. 1 : Maximum number of iterations exceeded. 2 : Line search failure (precision loss). 3 : NaN result encountered.
allvecs : list
    The result at each iteration, if retall is True (see below).
Unconstrained minimization of a function using the Newton-CG method.
minimize
    Interface to minimization algorithms for multivariate functions. See the 'Newton-CG' method in particular.
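A minimal end-to-end sketch using the Rosenbrock helpers that SciPy ships (rosen, rosen_der, rosen_hess_prod), assuming a SciPy version where the legacy fmin_ncg interface is still available:

```python
import numpy as np
from scipy.optimize import fmin_ncg, rosen, rosen_der, rosen_hess_prod

# Standard starting point for the Rosenbrock test problem.
x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])

# rosen_hess_prod(x, p) returns the Hessian of rosen at x times p,
# matching the fhess_p calling convention described above.
xopt = fmin_ncg(rosen, x0, rosen_der, fhess_p=rosen_hess_prod,
                avextol=1e-8, disp=0)
```

The Rosenbrock minimum is at all ones, so xopt should be close to np.ones(5). Supplying fhess=rosen_hess instead would work equally well; fhess_p is preferred when the full Hessian is expensive to form.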