scipy 1.8.0

Scalar function and its derivatives.

This class defines a scalar function F: R^n -> R and methods for computing or approximating its first and second derivatives.

Notes

This class implements memoization logic. There are methods `fun`, `grad` and `hess`, with corresponding attributes `f`, `g` and `H`. The following things should be considered:

  1. Use only the public methods `fun`, `grad` and `hess`.

  2. After one of the methods is called, the corresponding attribute is set. However, a subsequent call of any of the methods with a different argument may overwrite the attribute.
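A minimal sketch of this caching behaviour, assuming the class documented here is ScalarFunction from the private module scipy.optimize._differentiable_functions (scipy 1.8.0), constructed as ScalarFunction(fun, x0, args, grad, hess, finite_diff_rel_step, finite_diff_bounds); the quadratic test function below is invented for illustration::

    import numpy as np
    from scipy.optimize._differentiable_functions import ScalarFunction

    def quad(x):
        # invented test function: 0.5 * ||x||^2
        return 0.5 * np.dot(x, x)

    def quad_grad(x):
        # its analytic gradient
        return np.array(x, dtype=float)

    x0 = np.array([1.0, -2.0])
    sf = ScalarFunction(quad, x0, (), quad_grad, '2-point',
                        None, (-np.inf, np.inf))

    sf.fun(x0)              # evaluates (or reuses) the function at x0, stored in sf.f
    sf.grad(x0)             # same for the gradient, stored in sf.g
    print(sf.f, sf.g)       # cached values for x0

    sf.fun(np.array([3.0, 0.0]))
    print(sf.f)             # sf.f now refers to the new point, not to x0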

Parameters

fun : callable

Evaluates the scalar function. Must be of the form fun(x, *args), where x is the argument in the form of a 1-D array and args is a tuple of any additional fixed parameters needed to completely specify the function. Should return a scalar.

x0 : array_like

Provides an initial set of variables for evaluating fun. Array of real elements of size (n,), where 'n' is the number of independent variables.

args : tuple, optional

Any additional fixed parameters needed to completely specify the scalar function.

grad : {callable, '2-point', '3-point', 'cs'}

Method for computing the gradient vector. If it is a callable, it should be a function that returns the gradient vector:

grad(x, *args) -> array_like, shape (n,)

where x is an array with shape (n,) and args is a tuple with the fixed parameters. Alternatively, the keywords {'2-point', '3-point', 'cs'} can be used to select a finite difference scheme for numerical estimation of the gradient with a relative step size. These finite difference schemes obey any specified `bounds`.

hess : {callable, '2-point', '3-point', 'cs', HessianUpdateStrategy}

Method for computing the Hessian matrix. If it is callable, it should return the Hessian matrix:

hess(x, *args) -> {LinearOperator, spmatrix, array}, (n, n)

where x is an (n,) ndarray and `args` is a tuple with the fixed parameters. Alternatively, the keywords {'2-point', '3-point', 'cs'} select a finite difference scheme for numerical estimation, or an object implementing the HessianUpdateStrategy interface can be used to approximate the Hessian. Whenever the gradient is estimated via finite differences, the Hessian cannot be estimated with the options {'2-point', '3-point', 'cs'} and must be estimated using one of the quasi-Newton strategies.
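As a sketch of this requirement (same assumptions about ScalarFunction and its constructor as in the Notes above; the test function is invented), a finite-difference gradient can be paired with a quasi-Newton Hessian approximation such as scipy.optimize.BFGS::

    import numpy as np
    from scipy.optimize import BFGS
    from scipy.optimize._differentiable_functions import ScalarFunction

    def f(x):
        # invented test function
        return x[0]**2 + 10.0 * x[1]**2

    x0 = np.array([2.0, 1.0])

    # grad is finite-differenced, so hess must be a callable or a
    # HessianUpdateStrategy; passing '2-point'/'3-point'/'cs' for both
    # grad and hess is rejected at construction time.
    sf = ScalarFunction(f, x0, (), '2-point', BFGS(),
                        None, (-np.inf, np.inf))

    g = sf.grad(x0)   # finite-difference gradient estimate
    H = sf.hess(x0)   # the BFGS object serving as the Hessian approximation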

finite_diff_rel_step : None or array_like

Relative step size to use. The absolute step size is computed as h = finite_diff_rel_step * sign(x0) * max(1, abs(x0)), possibly adjusted to fit into the bounds. For method='3-point' the sign of h is ignored. If None, then finite_diff_rel_step is selected automatically. A numeric sketch of this formula follows the parameter list.

finite_diff_bounds : tuple of array_like

Lower and upper bounds on independent variables. Defaults to no bounds, (-np.inf, np.inf). Each bound must match the size of `x0` or be a scalar; in the latter case the bound is the same for all variables. Use it to limit the range of function evaluation.

epsilon : None or array_like, optional

Absolute step size to use, possibly adjusted to fit into the bounds. For method='3-point' the sign of `epsilon` is ignored. Relative steps are used by default; absolute steps are used only if epsilon is not None.
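The absolute-step formula given for finite_diff_rel_step can be checked numerically; the relative step of 1e-6 below is an arbitrary illustrative value, not a default taken from the source::

    import numpy as np

    x0 = np.array([0.5, -20.0, 3.0])
    rel_step = 1e-6   # arbitrary illustrative value

    # h = finite_diff_rel_step * sign(x0) * max(1, abs(x0))
    h = rel_step * np.sign(x0) * np.maximum(1.0, np.abs(x0))
    print(h)   # per-variable absolute steps: 1e-06, -2e-05, 3e-06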


Examples

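A minimal usage sketch, assuming the class documented here is ScalarFunction from the private module scipy.optimize._differentiable_functions (scipy 1.8.0) with the constructor ScalarFunction(fun, x0, args, grad, hess, finite_diff_rel_step, finite_diff_bounds, epsilon=None). Because the module is private, the import path and signature may change without deprecation; the Rosenbrock function and its gradient are standard test functions written out for illustration::

    import numpy as np
    from scipy.optimize._differentiable_functions import ScalarFunction

    def rosen(x):
        # two-variable Rosenbrock function
        return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

    def rosen_grad(x):
        # analytic gradient of the two-variable Rosenbrock function
        return np.array([
            -400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
            200.0 * (x[1] - x[0]**2),
        ])

    x0 = np.array([1.3, 0.7])
    sf = ScalarFunction(
        rosen, x0, (),        # fun, x0, args
        rosen_grad,           # analytic gradient callable
        '2-point',            # Hessian from finite differences of the gradient
        None,                 # finite_diff_rel_step chosen automatically
        (-np.inf, np.inf),    # finite_diff_bounds: unbounded
    )

    print(sf.fun(x0))    # scalar value, also cached in sf.f
    print(sf.grad(x0))   # shape (2,) gradient, also cached in sf.g
    print(sf.hess(x0))   # shape (2, 2) Hessian estimate, also cached in sf.H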



GitHub : /scipy/optimize/_differentiable_functions.py#11