
My objective is to convert an HDF5 file (~2.5 GB) into a CSV file. I thought I could do it with Python.

I tried this code:

import pandas

df_test = pandas.read_hdf("event.h5", start=1100000, stop=1150000)

But I got an error: "MemoryError: Unable to allocate 17.2 GiB for an array with shape (2101, 1100000)". Can anyone suggest an alternative way to read the HDF5 file and convert it to CSV?
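One way around the MemoryError is to stream the file in chunks instead of loading it at once. This is a minimal sketch, not the asker's actual data: it builds a small demo file in "table" format (chunked reads via `chunksize` require table format, not the default "fixed" format), and the key `"events"` is hypothetical; list the real keys with `pandas.HDFStore("event.h5").keys()`.

```python
import pandas as pd

# Build a small demo file in "table" format so it supports chunked reads.
# (The key "events" is made up -- inspect your own file's keys first.)
df = pd.DataFrame({"a": range(1000), "b": range(1000)})
df.to_hdf("demo.h5", key="events", format="table", mode="w")

# Stream the file in 250-row chunks, appending each chunk to one CSV.
# Only the first chunk writes the header; later chunks append without it,
# so at no point is the whole table in memory.
first = True
for chunk in pd.read_hdf("demo.h5", key="events", chunksize=250):
    chunk.to_csv("demo.csv", mode="w" if first else "a",
                 header=first, index=False)
    first = False
```

If the real file was saved in "fixed" format, `chunksize` is not supported; in that case a loop over `start`/`stop` windows (as in the question's snippet) over a table-format copy, or one of the libraries suggested in the comments, would be needed.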

Martin Gergov
  • If you want to use Python, there are 2 other modules you can use to open and read HDF5 files. They are **h5py** or **PyTables** (aka tables). If you have a 17.2 GB HDF5 file, why would you want to export to csv? Likely that format will take even more disk space. – kcw78 Feb 27 '20 at 22:40
  • What do/don’t you understand from that error message? Have you done anything to try to resolve this? – AMC Feb 27 '20 at 23:52
  • I am not sure if you have resolved your issue. The following links might be helpful: https://stackoverflow.com/questions/23758893/converting-hdf5-to-csv-or-tsv-files or https://github.com/h5py/h5py/issues/636 – caot Apr 12 '20 at 22:55
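Following the first comment's suggestion, h5py can read a dataset one slab of rows at a time, so only one slab is ever in memory. This is a sketch under assumptions: the dataset name `"data"` is invented (list the real names with `list(f.keys())`), and a small demo file is created here so the snippet is self-contained.

```python
import h5py
import numpy as np

# Build a small demo dataset; a real event.h5 may contain several
# datasets -- the name "data" here is hypothetical.
with h5py.File("demo_events.h5", "w") as f:
    f.create_dataset("data", data=np.arange(10000).reshape(1000, 10))

# Copy the dataset to CSV one slab of rows at a time; slicing an h5py
# dataset reads only the requested rows from disk.
rows_per_slab = 200
with h5py.File("demo_events.h5", "r") as f, open("events.csv", "w") as out:
    dset = f["data"]
    for start in range(0, dset.shape[0], rows_per_slab):
        slab = dset[start:start + rows_per_slab]  # only this slice in memory
        np.savetxt(out, slab, fmt="%d", delimiter=",")
```

Unlike `pandas.read_hdf`, this works regardless of how the file was written by pandas or any other HDF5 producer, at the cost of losing column names (raw HDF5 datasets have none unless stored separately).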

0 Answers