fmin_bfgs(f, x0, fprime=None, args=(), gtol=1e-05, norm=inf, epsilon=1.4901161193847656e-08, maxiter=None, full_output=0, disp=1, retall=0, callback=None)

Optimize the function `f`, whose gradient is given by `fprime`, using the quasi-Newton method of Broyden, Fletcher, Goldfarb, and Shanno (BFGS).
Parameters
----------
f : callable ``f(x, *args)``
    Objective function to be minimized.
x0 : ndarray
    Initial guess.
fprime : callable ``f'(x, *args)``, optional
    Gradient of `f`.
args : tuple, optional
    Extra arguments passed to `f` and `fprime`.
gtol : float, optional
    Gradient norm must be less than `gtol` before successful
    termination.
norm : float, optional
    Order of norm (Inf is max, -Inf is min).
epsilon : int or ndarray, optional
    If `fprime` is approximated, use this value for the step size.
callback : callable, optional
    An optional user-supplied function to call after each iteration.
    Called as ``callback(xk)``, where ``xk`` is the current parameter
    vector.
maxiter : int, optional
    Maximum number of iterations to perform.
full_output : bool, optional
    If True, return ``fopt``, ``func_calls``, ``grad_calls``, and
    ``warnflag`` in addition to ``xopt``.
disp : bool, optional
    Print convergence message if True.
retall : bool, optional
    Return a list of results at each iteration if True.
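A minimal sketch (not part of the original docstring) of how ``callback`` and ``full_output`` fit together; the objective function, the ``record`` helper, and all variable names are illustrative choices, not part of the API:

```python
import numpy as np
from scipy.optimize import fmin_bfgs

def simple_quadratic(x):
    # convex quadratic with minimum at (1, -2), so BFGS converges quickly
    return (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 2.0) ** 2

iterates = []

def record(xk):
    # called once per iteration with the current parameter vector
    iterates.append(xk.copy())

# with full_output=True (and retall left at 0), seven values are returned
xopt, fopt, gopt, Bopt, func_calls, grad_calls, warnflag = fmin_bfgs(
    simple_quadratic,
    np.zeros(2),
    full_output=True,
    disp=False,          # suppress the convergence message
    callback=record,
)
print(warnflag)          # 0 indicates successful termination
print(len(iterates))     # one entry per iteration
```

Because ``fprime`` is omitted here, the gradient is approximated by finite differences using the ``epsilon`` step size.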
Returns
-------
xopt : ndarray
    Parameters which minimize f, i.e., ``f(xopt) == fopt``.
fopt : float
    Minimum value.
gopt : ndarray
    Value of gradient at minimum, f'(xopt), which should be near 0.
Bopt : ndarray
    Value of 1/f''(xopt), i.e., the inverse Hessian matrix.
func_calls : int
    Number of function calls made.
grad_calls : int
    Number of gradient calls made.
warnflag : int
    1 : Maximum number of iterations exceeded.
    2 : Gradient and/or function calls not changing.
    3 : NaN result encountered.
allvecs : list
    The value of `xopt` at each iteration. Only returned if `retall`
    is True.
Minimize a function using the BFGS algorithm.

See Also
--------
minimize : Interface to minimization algorithms for multivariate
    functions. See ``method='BFGS'`` in particular.
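A short sketch (not from the original docstring) of the equivalence the See Also entry describes: ``minimize(..., method='BFGS')`` runs the same algorithm and returns an ``OptimizeResult`` instead of a bare array. The cost function and names below are illustrative:

```python
import numpy as np
from scipy.optimize import fmin_bfgs, minimize

def quadratic_cost(x, Q):
    return x @ Q @ x

x0 = np.array([-3.0, -4.0])
Q = np.diag([1.0, 10.0])

# legacy interface: returns the minimizing parameters directly
xopt = fmin_bfgs(quadratic_cost, x0, args=(Q,), disp=False)

# modern interface: returns an OptimizeResult; res.x plays the role of xopt
res = minimize(quadratic_cost, x0, args=(Q,), method='BFGS')

print(np.allclose(xopt, res.x, atol=1e-4))
```

New code should generally prefer ``minimize``, which also exposes convergence information through attributes such as ``res.success`` and ``res.nit``.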
Examples
--------
>>> import numpy as np
>>> from scipy.optimize import fmin_bfgs
>>> def quadratic_cost(x, Q):
...     return x @ Q @ x
...
>>> x0 = np.array([-3, -4])
>>> cost_weight = np.diag([1., 10.])
>>> # Note that a trailing comma is necessary for a tuple with single element
>>> fmin_bfgs(quadratic_cost, x0, args=(cost_weight,))
Optimization terminated successfully.
        Current function value: 0.000000
        Iterations: 7                   # may vary
        Function evaluations: 24        # may vary
        Gradient evaluations: 8         # may vary
array([ 2.85169950e-06, -4.61820139e-07])

>>> def quadratic_cost_grad(x, Q):
...     return 2 * Q @ x
...
>>> fmin_bfgs(quadratic_cost, x0, quadratic_cost_grad, args=(cost_weight,))
Optimization terminated successfully.
        Current function value: 0.000000
        Iterations: 7
        Function evaluations: 8
        Gradient evaluations: 8
array([ 2.85916637e-06, -4.54371951e-07])
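When supplying an analytic ``fprime``, it is worth checking it against the finite-difference approximation that ``fmin_bfgs`` otherwise falls back to. A sketch (not part of the original docstring) using ``scipy.optimize.approx_fprime`` with the same illustrative quadratic as above:

```python
import numpy as np
from scipy.optimize import approx_fprime

def quadratic_cost(x, Q):
    return x @ Q @ x

def quadratic_cost_grad(x, Q):
    return 2 * Q @ x

Q = np.diag([1.0, 10.0])
x = np.array([-3.0, -4.0])

# forward-difference gradient with the default epsilon of fmin_bfgs;
# extra positional arguments are forwarded to the objective
numeric = approx_fprime(x, quadratic_cost, 1.4901161193847656e-08, Q)
analytic = quadratic_cost_grad(x, Q)

print(np.allclose(numeric, analytic, atol=1e-4))
```

A mismatch here usually points at a bug in the hand-written gradient, which would otherwise show up as slow or failed convergence in ``fmin_bfgs``.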