scipy 1.8.0
_prepare_scalar_function(fun, x0, jac=None, args=(), bounds=None, epsilon=None, finite_diff_rel_step=None, hess=None)

Parameters

fun : callable

The objective function to be minimized.

fun(x, *args) -> float

where x is a 1-D array with shape (n,) and args is a tuple of the fixed parameters needed to completely specify the function.
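A minimal sketch of a valid objective with fixed parameters (the quadratic form and the `a`, `b` parameters are illustrative, not from scipy):

```python
import numpy as np

# A valid objective: takes a 1-D ndarray x plus fixed parameters, returns a float.
def fun(x, a, b):
    return a * np.sum(x**2) + b

x0 = np.array([1.0, 2.0])
args = (3.0, 0.5)      # fixed parameters, passed through unchanged as *args
print(fun(x0, *args))  # 3*(1 + 4) + 0.5 = 15.5
```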

x0 : ndarray, shape (n,)

Initial guess. Array of real elements of size (n,), where 'n' is the number of independent variables.

jac : {callable, '2-point', '3-point', 'cs', None}, optional

Method for computing the gradient vector. If it is a callable, it should be a function that returns the gradient vector:

jac(x, *args) -> array_like, shape (n,)

If one of `{'2-point', '3-point', 'cs'}` is selected, the gradient is calculated with a relative step for finite differences. If `None`, two-point finite differences with an absolute step are used.
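The two gradient options can be contrasted with a hand-rolled sketch: an analytic `jac` callable versus a two-point forward difference with an absolute step. This mimics, but does not exactly reproduce, scipy's internal step selection:

```python
import numpy as np

def fun(x):
    return np.sum(x**2)

# Analytic gradient callable, matching jac(x, *args) -> array_like of shape (n,)
def jac(x):
    return 2.0 * x

# Two-point forward difference with an absolute step eps (illustrative only;
# scipy's actual implementation chooses and adjusts the step differently).
def fd_grad(f, x, eps=1e-8):
    g = np.empty_like(x)
    f0 = f(x)
    for i in range(x.size):
        xp = x.copy()
        xp[i] += eps
        g[i] = (f(xp) - f0) / eps
    return g

x0 = np.array([1.0, -2.0])
print(jac(x0))           # [ 2. -4.]
print(fd_grad(fun, x0))  # close to [ 2. -4.]
```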

args : tuple, optional

Extra arguments passed to the objective function and its derivatives (the `fun` and `jac` callables).

bounds : sequence, optional

Bounds on variables. 'new-style' bounds are required.

epsilon : float or ndarray, optional

If `jac is None`, the absolute step size used for numerical approximation of the Jacobian via forward differences.

finite_diff_rel_step : None or array_like, optional

If `jac in ['2-point', '3-point', 'cs']`, the relative step size to use for numerical approximation of the Jacobian. The absolute step size is computed as `h = rel_step * sign(x0) * max(1, abs(x0))`, possibly adjusted to fit into the bounds. For method='3-point' the sign of h is ignored. If None (default), the step is selected automatically.
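The documented step formula can be sketched directly (scipy additionally adjusts h to stay within any bounds; here sign(0) is taken as +1, which matches the behavior scipy's implementation uses for the sign term):

```python
import numpy as np

# Absolute step from a relative step: h = rel_step * sign(x0) * max(1, abs(x0))
def abs_step(x0, rel_step):
    sign = np.where(x0 >= 0, 1.0, -1.0)   # sign(0) treated as +1
    return rel_step * sign * np.maximum(1.0, np.abs(x0))

x0 = np.array([0.5, -3.0, 0.0])
print(abs_step(x0, 1e-6))  # [ 1.e-06 -3.e-06  1.e-06]
```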

hess : {callable, '2-point', '3-point', 'cs', None}

Computes the Hessian matrix. If it is callable, it should return the Hessian matrix:

hess(x, *args) -> {LinearOperator, spmatrix, array}, (n, n)

Alternatively, the keywords {'2-point', '3-point', 'cs'} select a finite difference scheme for numerical estimation. Whenever the gradient is estimated via finite differences, the Hessian cannot be estimated with options {'2-point', '3-point', 'cs'} and needs to be estimated using one of the quasi-Newton strategies.

Returns

sf : ScalarFunction

A ScalarFunction object for use with scalar minimizers (BFGS/L-BFGS-B/SLSQP/TNC/CG/etc.).

Examples

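A hedged usage sketch: `_prepare_scalar_function` is a private helper (its import path below follows this page's source location for scipy 1.8 and may change between releases). The returned ScalarFunction exposes `fun` and `grad` methods:

```python
import numpy as np
from scipy.optimize._optimize import _prepare_scalar_function  # private API

# Rosenbrock function in two variables
def rosen(x):
    return 100.0 * (x[1] - x[0]**2)**2 + (1 - x[0])**2

x0 = np.array([1.3, 0.7])

# jac='2-point': the gradient is estimated by relative-step finite differences.
sf = _prepare_scalar_function(rosen, x0, jac='2-point')
print(sf.fun(x0))   # objective value at x0
print(sf.grad(x0))  # finite-difference gradient, shape (2,)
```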



GitHub : /scipy/optimize/_optimize.py#175
type: <class 'function'>