get(dsk, keys, num_workers=None, func_loads=None, func_dumps=None, optimize_graph=True, pool=None, chunksize=None, **kwargs)

Multiprocessed get function appropriate for Bags.

Parameters
----------
dsk : dict
    Dask graph.
keys : object or list
    Desired results from the graph.
num_workers : int, optional
    Number of worker processes (defaults to the number of cores).
func_dumps : callable, optional
    Function to use for function serialization (defaults to ``cloudpickle.dumps``).
func_loads : callable, optional
    Function to use for function deserialization (defaults to ``cloudpickle.loads``).
optimize_graph : bool, optional
    If True (the default), ``fuse`` is applied to the graph before computation.
pool : Executor or Pool, optional
    An ``Executor`` or ``Pool`` to use for dispatching work.
chunksize : int, optional
    Size of the chunks used when dispatching work. Defaults to 5, since some
    batching is helpful. If -1, the chunk size is computed so that ready work
    is divided evenly across the workers.
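The graph format consumed by ``get`` can be illustrated with a minimal, stdlib-only sketch: a dask graph is a dict mapping keys to either literal values or task tuples of the form ``(callable, *args)``, where arguments may themselves be keys. The ``simple_get`` evaluator below is a hypothetical sequential stand-in for illustration only; the real multiprocessing ``get`` serializes tasks with ``func_dumps``/``func_loads`` and dispatches them to a worker pool in chunks.

```python
import operator

# A dask-style graph: keys map to literal values or to task tuples
# of the form (callable, arg1, arg2, ...), where args may be keys.
dsk = {
    "x": 1,
    "y": 2,
    "z": (operator.add, "x", "y"),   # z = x + y
    "w": (operator.mul, "z", "z"),   # w = z * z
}

def simple_get(dsk, key):
    """Hypothetical sequential evaluator -- illustration only.

    The real multiprocessing get dispatches tasks to a pool of worker
    processes instead of evaluating them recursively in one process.
    """
    task = dsk[key]
    if isinstance(task, tuple) and task and callable(task[0]):
        func, *args = task
        # Resolve arguments that are themselves keys in the graph.
        resolved = [
            simple_get(dsk, a) if (isinstance(a, str) and a in dsk) else a
            for a in args
        ]
        return func(*resolved)
    return task

print(simple_get(dsk, "z"))  # 3
print(simple_get(dsk, "w"))  # 9
```

With the real scheduler, the equivalent call would be ``get(dsk, "w")``, which evaluates the same graph across worker processes.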