pandas 1.4.2

read_pickle(filepath_or_buffer: 'FilePath | ReadPickleBuffer', compression: 'CompressionOptions' = 'infer', storage_options: 'StorageOptions' = None)
Warning

Loading pickled data received from untrusted sources can be unsafe. See https://docs.python.org/3/library/pickle.html.
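The warning above is worth taking literally: unpickling can execute arbitrary code, because a pickle payload may name any callable plus arguments via `__reduce__`. A minimal, harmless illustration (using `print` where an attacker would use something like `os.system`):

```python
import pickle

class Evil:
    def __reduce__(self):
        # A pickle payload can smuggle in any callable plus its arguments;
        # unpickling invokes that callable.
        return (print, ("arbitrary code ran during unpickling",))

payload = pickle.dumps(Evil())
pickle.loads(payload)  # prints the message above -- no Evil instance is returned
```

This is why `read_pickle` should only be used on files from trusted sources.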

Notes

read_pickle is only guaranteed to be backwards compatible to pandas 0.20.3.

Parameters

filepath_or_buffer : str, path object, or file-like object

String, path object (implementing os.PathLike[str] ), or file-like object implementing a binary readlines() function.

Changed in an earlier version: URLs are accepted, and are not limited to S3 and GCS.

compression : str or dict, default 'infer'

For on-the-fly decompression of on-disk data. If 'infer' and 'filepath_or_buffer' is path-like, then detect compression from the following extensions: '.gz', '.bz2', '.zip', '.xz', or '.zst' (otherwise no decompression). If using 'zip', the ZIP file must contain only one data file to be read in. Set to None for no decompression. Can also be a dict with the key 'method' set to one of {'zip', 'gzip', 'bz2', 'zstd'}; other key-value pairs are forwarded to zipfile.ZipFile, gzip.GzipFile, bz2.BZ2File, or zstandard.ZstdDecompressor, respectively. As an example, the following could be passed for Zstandard decompression using a custom compression dictionary: compression={'method': 'zstd', 'dict_data': my_compression_dict}.
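A minimal sketch of the two compression forms described above (file names are illustrative):

```python
import os
import pandas as pd

df = pd.DataFrame({"foo": range(5), "bar": range(5, 10)})

# String form: the '.gz' suffix lets the default compression='infer'
# detect gzip when the file is read back.
df.to_pickle("demo.pkl.gz", compression="gzip")
roundtrip = pd.read_pickle("demo.pkl.gz")

# Dict form: extra key-value pairs are forwarded to gzip.GzipFile.
df.to_pickle("demo2.pkl.gz", compression={"method": "gzip", "compresslevel": 1})
roundtrip2 = pd.read_pickle("demo2.pkl.gz")

os.remove("demo.pkl.gz")
os.remove("demo2.pkl.gz")
```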

storage_options : dict, optional

Extra options that make sense for a particular storage connection, e.g. host, port, username, password, etc. For HTTP(S) URLs the key-value pairs are forwarded to urllib as header options. For other URLs (e.g. starting with "s3://" or "gcs://") the key-value pairs are forwarded to fsspec. Please see fsspec and urllib for more details.
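A hedged sketch of the storage_options shape (the URL below is hypothetical, so the actual read call is left commented out):

```python
import pandas as pd

# For http(s) URLs, each key-value pair becomes a request header.
headers = {"User-Agent": "pandas-read-pickle-demo"}
# df = pd.read_pickle("https://example.com/data.pkl", storage_options=headers)

# For s3:// or gcs:// URLs, the same dict is forwarded to fsspec instead,
# e.g. storage_options={"anon": True} for anonymous S3 access.
```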


Returns

unpickled : same type as object stored in file

The pickled pandas object (or any object) loaded from file.

See Also

DataFrame.to_pickle

Pickle (serialize) DataFrame object to file.

Series.to_pickle

Pickle (serialize) Series object to file.

read_hdf

Read HDF5 file into a DataFrame.

read_parquet

Load a parquet object, returning a DataFrame.

read_sql

Read SQL query or database table into a DataFrame.

Examples

>>> original_df = pd.DataFrame({"foo": range(5), "bar": range(5, 10)})
>>> original_df
   foo  bar
0    0    5
1    1    6
2    2    7
3    3    8
4    4    9
>>> pd.to_pickle(original_df, "./dummy.pkl")
>>> unpickled_df = pd.read_pickle("./dummy.pkl")
>>> unpickled_df
   foo  bar
0    0    5
1    1    6
2    2    7
3    3    8
4    4    9
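The round trip in the examples can be written as a self-contained script, with the temporary file removed afterwards:

```python
import os
import pandas as pd

original_df = pd.DataFrame({"foo": range(5), "bar": range(5, 10)})
pd.to_pickle(original_df, "./dummy.pkl")

unpickled_df = pd.read_pickle("./dummy.pkl")

os.remove("./dummy.pkl")  # clean up the temporary file
```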


File: /pandas/io/pickle.py#115
type: <class 'function'>