dask 2021.10.0

get(dsk, keys, num_workers=None, func_loads=None, func_dumps=None, optimize_graph=True, pool=None, chunksize=None, **kwargs)

Parameters

dsk : dict

dask graph

keys : object or list

Desired results from graph

num_workers : int

Number of worker processes (defaults to number of cores)

func_dumps : function

Function to use for function serialization (defaults to cloudpickle.dumps)

func_loads : function

Function to use for function deserialization (defaults to cloudpickle.loads)

optimize_graph : bool

If True [default], `fuse` is applied to the graph before computation.

pool : Executor or Pool

Some sort of `Executor` or `Pool` to use

chunksize : int, optional

Size of chunks to use when dispatching work. Defaults to 5 as some batching is helpful. If -1, will be computed to evenly divide ready work across workers.
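The `pool` and `chunksize` parameters above can be combined to reuse an existing executor rather than letting `get` create one per call. The sketch below assumes the dask 2021.10.0 API described in this entry; the graph, key names, and worker count are illustrative.

```python
from concurrent.futures import ProcessPoolExecutor
from operator import mul

import dask.multiprocessing

# Toy task graph: c = a * b
dsk = {"a": 4, "b": 5, "c": (mul, "a", "b")}

if __name__ == "__main__":
    # Supply an existing executor via `pool` instead of letting get()
    # create its own; chunksize=-1 asks get() to divide ready tasks
    # evenly across the workers.
    with ProcessPoolExecutor(max_workers=2) as pool:
        result = dask.multiprocessing.get(dsk, "c", pool=pool, chunksize=-1)
        print(result)  # 20
```

Passing your own pool avoids the startup cost of spawning worker processes on every `get` call when many graphs are computed in sequence.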

Multiprocessing-based get function, appropriate for Bags.

Examples

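The examples for this entry did not survive extraction; the following is a minimal, hedged sketch of calling `get` on a hand-written task graph (the graph contents and key names are illustrative, not from the original docstring).

```python
from operator import add

import dask.multiprocessing

# A dask graph is a plain dict: keys name results, tuples are tasks.
# Here z = x + y.
dsk = {"x": 1, "y": 2, "z": (add, "x", "y")}

if __name__ == "__main__":
    # Compute the desired key with the multiprocessing scheduler.
    result = dask.multiprocessing.get(dsk, "z", num_workers=2)
    print(result)  # 3
```

Because the scheduler ships tasks to worker processes, the functions and data in the graph must be serializable (by default via `cloudpickle`, per the `func_dumps`/`func_loads` parameters above).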



File: /dask/multiprocessing.py#144
type: <class 'function'>