scipy 1.8.0
line_search_wolfe2(f, myfprime, xk, pk, gfk=None, old_fval=None, old_old_fval=None, args=(), c1=0.0001, c2=0.9, amax=None, extra_condition=None, maxiter=10)

Notes

Uses the line search algorithm to enforce strong Wolfe conditions. See Wright and Nocedal, 'Numerical Optimization', 1999, pp. 59-61.
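Concretely, the strong Wolfe conditions can be checked directly at a trial step length. The sketch below is not part of SciPy; it is a minimal checker, written for illustration, that uses the same default c1 and c2 values as this function:

```python
import numpy as np

def satisfies_strong_wolfe(f, grad, xk, pk, alpha, c1=1e-4, c2=0.9):
    """Check the two strong Wolfe conditions at step length alpha."""
    phi0 = f(xk)
    dphi0 = np.dot(grad(xk), pk)        # directional derivative at alpha = 0
    x_new = xk + alpha * pk
    phi_a = f(x_new)
    dphi_a = np.dot(grad(x_new), pk)    # directional derivative at alpha
    # Sufficient decrease (Armijo): f(xk + a*pk) <= f(xk) + c1*a*<grad(xk), pk>
    armijo = phi_a <= phi0 + c1 * alpha * dphi0
    # Curvature: |<grad(xk + a*pk), pk>| <= c2 * |<grad(xk), pk>|
    curvature = abs(dphi_a) <= c2 * abs(dphi0)
    return bool(armijo and curvature)
```

Note that for a descent direction dphi0 is negative, so the Armijo bound lies slightly below f(xk).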

Parameters

f : callable f(x,*args)

Objective function.

myfprime : callable f'(x,*args)

Objective function gradient.

xk : ndarray

Starting point.

pk : ndarray

Search direction.

gfk : ndarray, optional

Gradient value for x=xk (xk being the current parameter estimate). Will be recomputed if omitted.

old_fval : float, optional

Function value for x=xk. Will be recomputed if omitted.

old_old_fval : float, optional

Function value for the point preceding x=xk.

args : tuple, optional

Additional arguments passed to the objective function.

c1 : float, optional

Parameter for Armijo condition rule.

c2 : float, optional

Parameter for curvature condition rule.

amax : float, optional

Maximum step size.

extra_condition : callable, optional

A callable of the form extra_condition(alpha, x, f, g) that returns a boolean. The arguments are the proposed step alpha and the corresponding x, f and g values. The line search accepts the value of alpha only if this callable returns True; if it returns False for the step length, the algorithm continues with new iterates. The callable is only called for iterates satisfying the strong Wolfe conditions.

maxiter : int, optional

Maximum number of iterations to perform.
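As an illustration of extra_condition, the sketch below rejects any trial point with a negative coordinate. The predicate keep_nonnegative is an assumption invented for this example, not a SciPy helper; scipy.optimize.line_search (an alias of this function) is used as the entry point:

```python
import numpy as np
from scipy.optimize import line_search

def f(x):
    return x[0]**2 + x[1]**2

def grad(x):
    return np.array([2.0 * x[0], 2.0 * x[1]])

# Illustrative constraint (invented for this sketch): only accept step
# lengths whose trial point x = xk + alpha*pk has no negative coordinate.
def keep_nonnegative(alpha, x, fval, g):
    return bool(np.all(x >= 0))

xk = np.array([1.8, 1.7])
pk = np.array([-1.0, -1.0])
result = line_search(f, grad, xk, pk, extra_condition=keep_nonnegative)
alpha = result[0]  # None if no acceptable step length was found
```

Any accepted alpha therefore yields a nonnegative trial point, while a return of None signals that the search failed to find one within maxiter iterations.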

Returns

alpha : float or None

Alpha for which x_new = x0 + alpha * pk, or None if the line search algorithm did not converge.

fc : int

Number of function evaluations made.

gc : int

Number of gradient evaluations made.

new_fval : float or None

New function value f(x_new) = f(x0 + alpha*pk), or None if the line search algorithm did not converge.

old_fval : float

Old function value f(x0).

new_slope : float or None

The local slope along the search direction at the new value, <myfprime(x_new), pk>, or None if the line search algorithm did not converge.

This function finds an alpha that satisfies the strong Wolfe conditions. It is exported as scipy.optimize.line_search, which is the name used in the examples below.

Examples

>>> import numpy as np
>>> from scipy.optimize import line_search

An objective function and its gradient are defined.

>>> def obj_func(x):
...     return (x[0])**2 + (x[1])**2
>>> def obj_grad(x):
...     return [2*x[0], 2*x[1]]

We can find alpha that satisfies strong Wolfe conditions.

>>> start_point = np.array([1.8, 1.7])
>>> search_gradient = np.array([-1.0, -1.0])
>>> line_search(obj_func, obj_grad, start_point, search_gradient)
(1.0, 2, 1, 1.1300000000000001, 6.13, [1.6, 1.4])
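The returned alpha can then be used to take the accepted step. The following standalone continuation mirrors the names from the example above (with the gradient returned as an array for convenience):

```python
import numpy as np
from scipy.optimize import line_search

def obj_func(x):
    return x[0]**2 + x[1]**2

def obj_grad(x):
    return np.array([2.0 * x[0], 2.0 * x[1]])

start_point = np.array([1.8, 1.7])
search_gradient = np.array([-1.0, -1.0])

# Unpack the six return values documented above.
alpha, fc, gc, new_fval, old_fval, new_slope = line_search(
    obj_func, obj_grad, start_point, search_gradient)

if alpha is not None:
    # Take the accepted step; new_fval equals obj_func(x_new).
    x_new = start_point + alpha * search_gradient
```

Checking alpha against None before using it matters: a failed line search returns None for alpha, new_fval and new_slope.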




GitHub: /scipy/optimize/_linesearch.py#181