scipy.optimize.check_grad (scipy 1.8.0)

check_grad(func, grad, x0, *args, epsilon=1.4901161193847656e-08, direction='all', seed=None)

Parameters

func : callable ``func(x0, *args)``

Function whose derivative is to be checked.

grad : callable ``grad(x0, *args)``

Gradient of `func`.

x0 : ndarray

Point(s) at which to check `grad` against the forward difference approximation of the gradient computed from `func`.

args : \*args, optional

Extra arguments passed to `func` and `grad`.

epsilon : float, optional

Step size used for the finite difference approximation. It defaults to ``sqrt(np.finfo(float).eps)``, which is approximately 1.49e-08.
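A quick way to see where that default comes from: it is the square root of double-precision machine epsilon.

```python
import numpy as np

# The default step size is the square root of machine epsilon for float64.
eps = np.sqrt(np.finfo(float).eps)
print(eps)  # 1.4901161193847656e-08
```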

direction : str, optional

If set to ``'random'``, the gradient along a single random direction vector is used to check `grad` against the forward difference approximation computed from `func`. By default it is ``'all'``, in which case all one-hot direction vectors are considered when checking `grad`.

seed : {None, int, `numpy.random.Generator`, `numpy.random.RandomState`}, optional

If `seed` is None (or `np.random`), the `numpy.random.RandomState` singleton is used. If `seed` is an int, a new `RandomState` instance is used, seeded with `seed`. If `seed` is already a `Generator` or `RandomState` instance, that instance is used. Specify `seed` to reproduce the return value of this function. The random numbers generated with this seed determine the random vector along which gradients are computed to check `grad`. Note that `seed` is used only when `direction` is set to ``'random'``.

Returns

err : float

The square root of the sum of squares (i.e., the 2-norm) of the difference between ``grad(x0, *args)`` and the finite difference approximation of the gradient computed from `func` at the points `x0`.

Check the correctness of a gradient function by comparing it against a (forward) finite-difference approximation of the gradient.
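The comparison above can be sketched by hand. The following is a minimal, illustrative reimplementation of the "all directions" check (not scipy's actual code): a forward difference of `func` along each one-hot direction, then the 2-norm of its difference from the analytic gradient.

```python
import numpy as np

def func(x):
    return x[0]**2 - 0.5 * x[1]**3

def grad(x):
    return np.array([2 * x[0], -1.5 * x[1]**2])

x0 = np.array([1.5, -1.5])
eps = np.sqrt(np.finfo(float).eps)

# Forward difference approximation along each coordinate (one-hot) direction.
fd = np.array([(func(x0 + eps * e) - func(x0)) / eps
               for e in np.eye(x0.size)])

# 2-norm of the difference between the analytic and approximate gradients.
err = np.linalg.norm(grad(x0) - fd)
print(err)  # small, on the order of 1e-7 or less
```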

See Also

approx_fprime

Examples

>>> import numpy as np
>>> def func(x):
...     return x[0]**2 - 0.5 * x[1]**3
>>> def grad(x):
...     return [2 * x[0], -1.5 * x[1]**2]
>>> from scipy.optimize import check_grad
>>> check_grad(func, grad, [1.5, -1.5])
2.9802322387695312e-08  # may vary
>>> rng = np.random.default_rng()
>>> check_grad(func, grad, [1.5, -1.5],
...            direction='random', seed=rng)
2.9802322387695312e-08
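Because the error along a random direction depends on the sampled vector, passing the same integer seed twice should reproduce the same value. A small sketch, assuming `check_grad` behaves as documented above:

```python
import numpy as np
from scipy.optimize import check_grad

def func(x):
    return x[0]**2 - 0.5 * x[1]**3

def grad(x):
    return [2 * x[0], -1.5 * x[1]**2]

# The same integer seed yields the same random direction, so the
# returned error is reproducible across calls.
e1 = check_grad(func, grad, [1.5, -1.5], direction='random', seed=42)
e2 = check_grad(func, grad, [1.5, -1.5], direction='random', seed=42)
assert e1 == e2
```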

Back References

The following pages refer to this document explicitly or contain code examples using it.

scipy.optimize._optimize.check_grad scipy.optimize._optimize.approx_fprime



GitHub : /scipy/optimize/_optimize.py#983