basinhopping(func, x0, niter=100, T=1.0, stepsize=0.5, minimizer_kwargs=None, take_step=None, accept_test=None, callback=None, interval=50, disp=False, niter_success=None, seed=None, *, target_accept_rate=0.5, stepwise_factor=0.9)
Basin-hopping is a two-phase method that combines a global stepping algorithm with local minimization at each step. Designed to mimic the natural process of energy minimization of clusters of atoms, it works well for similar problems with "funnel-like, but rugged" energy landscapes.
As the step-taking, step acceptance, and minimization methods are all customizable, this function can also be used to implement other two-phase methods.
Basin-hopping is a stochastic algorithm which attempts to find the global minimum of a smooth scalar function of one or more variables. The algorithm in its current form was described by David Wales and Jonathan Doye (http://www-wales.ch.cam.ac.uk/).
The algorithm is iterative, with each cycle composed of the following steps (a minimal sketch of one cycle follows this list):

1. random perturbation of the coordinates
2. local minimization
3. accept or reject the new coordinates based on the minimized function value
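The sketch below illustrates one such cycle; the helper name is hypothetical, and this is not the scipy implementation:

    import numpy as np
    from scipy.optimize import minimize

    def one_cycle(func, x_old, f_old, stepsize=0.5, T=1.0, rng=None):
        # Sketch of a single basin-hopping cycle, under the assumptions above.
        rng = np.random.default_rng() if rng is None else rng
        # 1) random perturbation of the coordinates
        x_trial = x_old + rng.uniform(-stepsize, stepsize, np.shape(x_old))
        # 2) local minimization starting from the perturbed point
        res = minimize(func, x_trial)
        # 3) accept or reject based on the minimized function value
        if res.fun < f_old or rng.uniform() < np.exp(-(res.fun - f_old) / T):
            return res.x, res.fun  # accepted: move to the new minimum
        return x_old, f_old        # rejected: stay at the old minimum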
The acceptance test used here is the Metropolis criterion of standard Monte Carlo algorithms, although there are many other possibilities.
This global minimization method has been shown to be extremely efficient for a wide variety of problems in physics and chemistry. It is particularly useful when the function has many minima separated by large barriers. See the Cambridge Cluster Database http://www-wales.ch.cam.ac.uk/CCD.html for databases of molecular systems that have been optimized primarily using basin-hopping. This database includes minimization problems exceeding 300 degrees of freedom.
See the free software program GMIN (http://www-wales.ch.cam.ac.uk/GMIN) for a Fortran implementation of basin-hopping. That implementation has many variations of the procedure described above, including more advanced step-taking algorithms and alternative acceptance criteria.
For stochastic global optimization there is no way to determine if the true global minimum has actually been found. Instead, as a consistency check, the algorithm can be run from a number of different random starting points to ensure the lowest minimum found in each example has converged to the global minimum. For this reason, basinhopping will, by default, simply run for the number of iterations niter and return the lowest minimum found. It is left to the user to ensure that this is in fact the global minimum.
Choosing stepsize: This is a crucial parameter in basinhopping and depends on the problem being solved. The step is chosen uniformly in the region from x0-stepsize to x0+stepsize, in each dimension. Ideally, it should be comparable to the typical separation (in argument values) between local minima of the function being optimized. basinhopping will, by default, adjust stepsize to find an optimal value, but this may take many iterations. You will get quicker results if you set a sensible initial value for stepsize.
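As a sketch of this advice, consider a toy objective whose local minima are spaced roughly 2 apart; starting stepsize near that separation lets a single hop reach a neighboring basin (basinhopping will still adapt the value from there):

    import numpy as np
    from scipy.optimize import basinhopping

    # Toy objective: the cosine term creates local minima spaced about
    # 2*pi/3 (roughly 2.1) apart along x.
    func = lambda x: np.cos(3 * x[0]) + 0.05 * x[0]**2
    ret = basinhopping(func, x0=[1.0], stepsize=2.0, niter=100)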
Choosing T: The parameter T is the "temperature" used in the Metropolis criterion. Basin-hopping steps are always accepted if func(xnew) < func(xold). Otherwise, they are accepted with probability:

exp(-(func(xnew) - func(xold)) / T)

So, for best results, T should be comparable to the typical difference (in function values) between local minima. (The height of "walls" between local minima is irrelevant.) If T is 0, the algorithm becomes Monotonic Basin-Hopping, in which all steps that increase energy are rejected.
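A minimal sketch of this acceptance rule, including the T = 0 special case (illustrative only, not the scipy internals):

    import numpy as np

    def metropolis_accept(f_new, f_old, T, rng=None):
        # Sketch of the Metropolis criterion described above.
        rng = np.random.default_rng() if rng is None else rng
        if f_new < f_old:
            return True   # downhill steps are always accepted
        if T == 0:
            return False  # monotonic basin-hopping: reject every uphill step
        # uphill steps are accepted with probability exp(-(f_new - f_old) / T)
        return rng.uniform() < np.exp(-(f_new - f_old) / T)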
func : callable f(x, *args)
    Function to be optimized. args can be passed as an optional item in the dict minimizer_kwargs.
x0 : array_like
    Initial guess.
niter : integer, optional
    The number of basin-hopping iterations. There will be a total of niter + 1 runs of the local minimizer.
The "temperature" parameter for the accept or reject criterion. Higher "temperatures" mean that larger jumps in function value will be accepted. For best results T
should be comparable to the separation (in function value) between local minima.
stepsize : float, optional
    Maximum step size for use in the random displacement.
minimizer_kwargs : dict, optional
    Extra keyword arguments to be passed to the local minimizer scipy.optimize.minimize(). Some important options could be:

        method : str
            The minimization method (e.g. "L-BFGS-B").
        args : tuple
            Extra arguments passed to the objective function (func) and its derivatives (Jacobian, Hessian).
take_step : callable take_step(x), optional
    Replace the default step-taking routine with this routine. The default step-taking routine is a random displacement of the coordinates, but other step-taking algorithms may be better for some systems. take_step can optionally have the attribute take_step.stepsize. If this attribute exists, then basinhopping will adjust take_step.stepsize in order to try to optimize the global minimum search.
accept_test : callable accept_test(f_new=f_new, x_new=x_new, f_old=f_old, x_old=x_old), optional
    Define a test which will be used to judge whether or not to accept the step. This will be used in addition to the Metropolis test based on "temperature" T. The acceptable return values are True, False, or "force accept". If any of the tests return False then the step is rejected. If the return value is "force accept", then this overrides any other tests in order to accept the step. This can be used, for example, to forcefully escape from a local minimum that basinhopping is trapped in.
callback : callable, callback(x, f, accept), optional
    A callback function which will be called for all minima found. x and f are the coordinates and function value of the trial minimum, and accept is whether or not that minimum was accepted. This can be used, for example, to save the lowest N minima found. Also, callback can be used to specify a user-defined stop criterion by optionally returning True to stop the basinhopping routine.
interval : integer, optional
    The interval for how often to update the stepsize.
disp : bool, optional
    Set to True to print status messages.
niter_success : integer, optional
    Stop the run if the global minimum candidate remains the same for this number of iterations.
seed : {None, int, numpy.random.Generator, numpy.random.RandomState}, optional
    If seed is None (or np.random), the numpy.random.RandomState singleton is used. If seed is an int, a new RandomState instance is used, seeded with seed. If seed is already a Generator or RandomState instance then that instance is used. Specify seed for repeatable minimizations. The random numbers generated with this seed only affect the default Metropolis accept_test and the default take_step. If you supply your own take_step and accept_test, and these functions use random number generation, then those functions are responsible for the state of their random number generator.
target_accept_rate : float, optional
    The target acceptance rate that is used to adjust the stepsize. If the current acceptance rate is greater than the target, then the stepsize is increased. Otherwise, it is decreased. Range is (0, 1). Default is 0.5.
stepwise_factor : float, optional
    The stepsize is multiplied or divided by this stepwise factor upon each update. Range is (0, 1). Default is 0.9.
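Together, target_accept_rate and stepwise_factor define the stepsize adaptation performed every interval steps; a sketch of that rule (names are illustrative, not the scipy internals):

    def adjust_stepsize(stepsize, accept_rate,
                        target_accept_rate=0.5, stepwise_factor=0.9):
        # Sketch of the update described above. stepwise_factor is in (0, 1),
        # so dividing by it grows the step and multiplying shrinks it.
        if accept_rate > target_accept_rate:
            return stepsize / stepwise_factor  # accepting too often: step farther
        return stepsize * stepwise_factor      # accepting too rarely: step shorter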
res : OptimizeResult
    The optimization result represented as an OptimizeResult object. Important attributes are: x the solution array, fun the value of the function at the solution, and message which describes the cause of the termination. The OptimizeResult object returned by the selected minimizer at the lowest minimum is also contained within this object and can be accessed through the lowest_optimization_result attribute. See OptimizeResult for a description of other attributes.
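For example, these attributes can be read off the returned object like this (using the 1-D objective from the Examples below):

    import numpy as np
    from scipy.optimize import basinhopping

    func = lambda x: np.cos(14.5 * x - 0.3) + (x + 0.2) * x
    ret = basinhopping(func, [1.0], niter=100)
    print(ret.x, ret.fun)                  # solution array and function value there
    print(ret.message)                     # cause of termination
    print(ret.lowest_optimization_result)  # local-minimizer result at the best minimum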
Find the global minimum of a function using the basin-hopping algorithm.
minimize : The local minimization function called once for each basinhopping step. minimizer_kwargs is passed to this routine.
The following example is a 1-D minimization problem, with many local minima superimposed on a parabola.
>>> import numpy as np
>>> from scipy.optimize import basinhopping
>>> func = lambda x: np.cos(14.5 * x - 0.3) + (x + 0.2) * x
>>> x0 = [1.]
basinhopping internally uses a local minimization algorithm. We will use the parameter minimizer_kwargs to tell basinhopping which algorithm to use and how to set up that minimizer. This parameter will be passed to scipy.optimize.minimize().
>>> minimizer_kwargs = {"method": "BFGS"}
... ret = basinhopping(func, x0, minimizer_kwargs=minimizer_kwargs,
... niter=200)
... print("global minimum: x = %.4f, f(x0) = %.4f" % (ret.x, ret.fun)) global minimum: x = -0.1951, f(x0) = -1.0009
Next consider a 2-D minimization problem. This time, we will also use gradient information to significantly speed up the search.
>>> def func2d(x):
... f = np.cos(14.5 * x[0] - 0.3) + (x[1] + 0.2) * x[1] + (x[0] +
... 0.2) * x[0]
... df = np.zeros(2)
... df[0] = -14.5 * np.sin(14.5 * x[0] - 0.3) + 2. * x[0] + 0.2
... df[1] = 2. * x[1] + 0.2
... return f, df
We'll also use a different local minimization algorithm and tell the minimizer that our function returns both energy and gradient (Jacobian).
>>> minimizer_kwargs = {"method":"L-BFGS-B", "jac":True}
... x0 = [1.0, 1.0]
... ret = basinhopping(func2d, x0, minimizer_kwargs=minimizer_kwargs,
... niter=200)
... print("global minimum: x = [%.4f, %.4f], f(x0) = %.4f" % (ret.x[0],
... ret.x[1],
... ret.fun)) global minimum: x = [-0.1951, -0.1000], f(x0) = -1.0109
Here is an example using a custom step-taking routine. Imagine you want the first coordinate to take larger steps than the rest of the coordinates. This can be implemented like so:
>>> class MyTakeStep:
... def __init__(self, stepsize=0.5):
... self.stepsize = stepsize
... self.rng = np.random.default_rng()
... def __call__(self, x):
... s = self.stepsize
... x[0] += self.rng.uniform(-2.*s, 2.*s)
... x[1:] += self.rng.uniform(-s, s, x[1:].shape)
... return x
Since MyTakeStep.stepsize exists, basinhopping will adjust the magnitude of stepsize to optimize the search. We'll use the same 2-D function as before:
>>> mytakestep = MyTakeStep()
>>> ret = basinhopping(func2d, x0, minimizer_kwargs=minimizer_kwargs,
...                    niter=200, take_step=mytakestep)
>>> print("global minimum: x = [%.4f, %.4f], f(x0) = %.4f" % (ret.x[0],
...                                                           ret.x[1],
...                                                           ret.fun))
global minimum: x = [-0.1951, -0.1000], f(x0) = -1.0109
Now, let's do an example using a custom callback function which prints the value of every minimum found:
>>> def print_fun(x, f, accepted):
... print("at minimum %.4f accepted %d" % (f, int(accepted)))
We'll run it for only 10 basinhopping steps this time.
>>> rng = np.random.default_rng()
>>> ret = basinhopping(func2d, x0, minimizer_kwargs=minimizer_kwargs,
...                    niter=10, callback=print_fun, seed=rng)
at minimum 0.4159 accepted 1
at minimum -0.4317 accepted 1
at minimum -1.0109 accepted 1
at minimum -0.9073 accepted 1
at minimum -0.4317 accepted 0
at minimum -0.1021 accepted 1
at minimum -0.7425 accepted 1
at minimum -0.9073 accepted 1
at minimum -0.4317 accepted 0
at minimum -0.7425 accepted 1
at minimum -0.9073 accepted 1
The minimum at -1.0109 is actually the global minimum, found already on the 3rd iteration.
Now let's implement bounds on the problem using a custom accept_test:
>>> class MyBounds:
...     def __init__(self, xmax=[1.1, 1.1], xmin=[-1.1, -1.1]):
... self.xmax = np.array(xmax)
... self.xmin = np.array(xmin)
... def __call__(self, **kwargs):
... x = kwargs["x_new"]
... tmax = bool(np.all(x <= self.xmax))
... tmin = bool(np.all(x >= self.xmin))
... return tmax and tmin
>>> mybounds = MyBounds()
>>> ret = basinhopping(func2d, x0, minimizer_kwargs=minimizer_kwargs,
...                    niter=10, accept_test=mybounds)
The following pages refer to this document either explicitly or contain code examples using it.
scipy.optimize._optimize.brute
scipy.optimize._minimize.minimize
scipy.optimize._basinhopping.basinhopping