pandas 1.4.2


Either "fixed" or "table" format.

Warning

Pandas uses PyTables for reading and writing HDF5 files, which allows serializing object-dtype data with pickle when using the "fixed" format. Loading pickled data received from untrusted sources can be unsafe.

See: https://docs.python.org/3/library/pickle.html for more.

Parameters

path : str

File path to HDF5 file.

mode : {'a', 'w', 'r', 'r+'}, default 'a'

'r'

Read-only; no data can be modified.

'w'

Write; a new file is created (an existing file with the same name would be deleted).

'a'

Append; an existing file is opened for reading and writing, and if the file does not exist it is created.

'r+'

Similar to 'a', but the file must already exist.
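The modes above can be sketched as follows. This is a minimal example, not from the original docstring; it assumes PyTables (the optional 'tables' dependency) is installed and guards for its absence, and the file name 'modes.h5' is arbitrary:

```python
import os
import tempfile

import numpy as np
import pandas as pd

df = pd.DataFrame(np.random.randn(5, 2), columns=["x", "y"])

with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "modes.h5")
    try:
        # 'w': create a fresh file (an existing file would be clobbered)
        with pd.HDFStore(path, mode="w") as store:
            store["first"] = df
        # 'a' (the default): open the existing file for read/write
        with pd.HDFStore(path, mode="a") as store:
            store["second"] = df
        # 'r': read-only; writing to the store would raise an error
        with pd.HDFStore(path, mode="r") as store:
            keys = sorted(store.keys())
    except ImportError:  # PyTables not installed
        keys = None

print(keys)
```

`HDFStore` is a context manager, so each `with` block closes the file on exit; keys are stored as paths with a leading slash, e.g. '/first'.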

complevel : int, 0-9, default None

Specifies a compression level for data. A value of 0 or None disables compression.

complib : {'zlib', 'lzo', 'bzip2', 'blosc'}, default 'zlib'

Specifies the compression library to be used. As of v0.20.2 these additional compressors for Blosc are supported (default if no compressor specified: 'blosc:blosclz'): {'blosc:blosclz', 'blosc:lz4', 'blosc:lz4hc', 'blosc:snappy', 'blosc:zlib', 'blosc:zstd'}. Specifying a compression library which is not available raises a ValueError.

fletcher32 : bool, default False

If applying compression, use the fletcher32 checksum.

**kwargs :

These parameters will be passed to the PyTables open_file method.
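Putting the compression parameters together, a sketch (not from the original docstring; it assumes PyTables is installed with the blosc compressor available, and guards for either being absent — the file name 'compressed.h5' is arbitrary):

```python
import os
import tempfile

import numpy as np
import pandas as pd

df = pd.DataFrame(np.random.randn(100, 3), columns=list("abc"))

with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "compressed.h5")
    try:
        # blosc compression at the maximum level, with fletcher32 checksums
        with pd.HDFStore(path, mode="w", complevel=9,
                         complib="blosc", fletcher32=True) as store:
            store["df"] = df
        # data round-trips transparently; compression is handled by PyTables
        roundtrip = pd.read_hdf(path, "df")
        ok = roundtrip.equals(df)
    except (ImportError, ValueError):  # PyTables missing, or complib unavailable
        ok = None

print(ok)
```

Compression settings given to the `HDFStore` constructor apply to every object subsequently written to that store.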

Dict-like IO interface for storing pandas objects in PyTables.

Examples

>>> bar = pd.DataFrame(np.random.randn(10, 4))
>>> store = pd.HDFStore('test.h5')
>>> store['foo'] = bar  # write to HDF5
>>> bar = store['foo']  # retrieve
>>> store.close()

Create or load an HDF5 file in memory

When passing the 'driver' option to the PyTables open_file method through **kwargs, the HDF5 file is loaded or created in memory and is only written to disk when closed:

>>> bar = pd.DataFrame(np.random.randn(10, 4))
>>> store = pd.HDFStore('test.h5', driver='H5FD_CORE')
>>> store['foo'] = bar
>>> store.close()  # only now, data is written to disk

Back References

The following pages refer to this document explicitly or contain code examples using it.

pandas.io.pytables.read_hdf



File: /pandas/io/pytables.py#488