
I am writing an HDF5 file using the HDF5 C++ API and performing a few comparisons against the h5py Python library.

In the h5py Python library, auto-chunking is applied by default when a compression algorithm such as GZIP or LZF is used.

Does the same behavior apply to the HDF5 C++ API? If so, how can I prove that chunks were automatically created when a compression algorithm, such as GZIP, was applied to the datasets?


1 Answer


According to https://www.hdfgroup.org/HDF5/faq/compression.html:

To apply a compression filter to a dataset it must be created with a chunked dataset storage layout.

And you have to do it manually, as in the sketch after these steps:

1. Create a dataset creation property list and modify it to enable chunking and compression.
2. Create the dataset.
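
Here is a minimal sketch of those two steps using the HDF5 C++ API (H5Cpp.h). The file name, dataset name, shapes, and deflate level are illustrative choices, not anything prescribed by the library. The read-back at the end addresses the "how can I prove it" part of the question: `DSetCreatPropList::getLayout()` reports `H5D_CHUNKED` only when chunking was actually enabled on the dataset.

```cpp
#include <iostream>
#include "H5Cpp.h"

int main() {
    using namespace H5;

    // Illustrative shapes: a 1000 x 1000 dataset stored in 100 x 100 chunks.
    const hsize_t dims[2]   = {1000, 1000};
    const hsize_t chunks[2] = {100, 100};

    // Step 1: creation property list with chunking enabled, plus GZIP
    // (deflate) at level 6. A filter such as deflate requires a chunked
    // layout, so setChunk() must be called as well.
    DSetCreatPropList plist;
    plist.setChunk(2, chunks);
    plist.setDeflate(6);

    // Step 2: create the dataset with that property list.
    H5File file("compressed.h5", H5F_ACC_TRUNC);
    DataSpace space(2, dims);
    DataSet dset = file.createDataSet("data", PredType::NATIVE_DOUBLE,
                                      space, plist);

    // Proof: read the creation property list back from the dataset and
    // check its layout; it is H5D_CHUNKED only if chunking was enabled.
    DSetCreatPropList actual = dset.getCreatePlist();
    if (actual.getLayout() == H5D_CHUNKED) {
        hsize_t out[2];
        actual.getChunk(2, out);
        std::cout << "chunked: " << out[0] << " x " << out[1] << "\n";
    }
    return 0;
}
```

Outside the program, `h5dump -p -H compressed.h5` prints the same information: the storage layout and the filter pipeline of every dataset. In other words, the C++ API never picks a chunk shape for you the way h5py does when compression is requested; both the chunking and the compression are opt-in.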