This allows multiple clients to share futures and data with each other through a single mutable variable. All metadata is sequentialized through the scheduler, but race conditions can still occur when several clients set the variable concurrently.
Values must be either Futures or msgpack-encodable data (ints, lists, strings, etc.). All data is kept on and sent through the scheduler, so it is wise not to store too much. If you want to share a large amount of data, scatter
it and share the resulting future instead.
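A minimal sketch of that scatter-then-share pattern (the local Client() and the variable name "big-data" are illustrative assumptions, not part of the docstring):

    from dask.distributed import Client, Variable

    client = Client()                      # local cluster for illustration
    big = bytes(100_000_000)               # ~100 MB payload, too large to push through the scheduler

    future = client.scatter(big)           # place the data on a worker, get a small Future back
    shared = Variable("big-data")
    shared.set(future)                     # only the Future travels through the scheduler

    # Any client can later look up the same variable and resolve the Future:
    result = Variable("big-data").get().result()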
This object is experimental and has known issues in Python 2.
name : string (optional)
    Name used by other clients and the scheduler to identify the variable. If not given, a random name will be generated.
client : Client (optional)
    Client used for communication with the scheduler. If not given, the default global client will be used.
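For example, two Variable handles created with the same name refer to the same shared value, regardless of which client created them. A sketch, assuming a LocalCluster and the illustrative name "flag":

    from dask.distributed import Client, LocalCluster, Variable

    cluster = LocalCluster(n_workers=1)
    producer = Client(cluster)
    consumer = Client(cluster)              # second client on the same scheduler

    a = Variable("flag", client=producer)   # explicit client
    b = Variable("flag", client=consumer)   # same name -> same shared variable

    a.set(42)
    assert b.get() == 42                    # the value is visible to every client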
Distributed Global Variable
Queue
shared multi-producer/multi-consumer queue between clients
>>> from dask.distributed import Client, Variable  # doctest: +SKIP
>>> client = Client()  # doctest: +SKIP
>>> x = Variable('x')  # doctest: +SKIP
>>> x.set(123)  # doctest: +SKIP
>>> x.get()  # doctest: +SKIP
123

>>> future = client.submit(f, x)  # doctest: +SKIP
>>> x.set(future)  # doctest: +SKIP
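A Variable can also be handed to tasks as a lightweight control channel, for example as a stop flag polled by a long-running task. A sketch under illustrative names ("stop_flag", poll_until_stopped); the Variable handle is passed to submit just like any other argument:

    import time
    from dask.distributed import Client, Variable

    client = Client()                       # local cluster for illustration
    stop = Variable("stop_flag")
    stop.set(False)

    def poll_until_stopped(stop):
        # Runs on a worker; the Variable handle travels like any other argument.
        n = 0
        while not stop.get():
            time.sleep(0.1)                 # pretend to do a unit of work
            n += 1
        return n

    future = client.submit(poll_until_stopped, stop)
    time.sleep(1)
    stop.set(True)                          # any client can flip the flag
    print(future.result())                  # iterations completed before the stop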
The following pages refer to this document explicitly or contain code examples that use it.
distributed.queues.Queue
distributed.variable.Variable