
I am analysing a huge number of files to extract the important statistical information. For every analysed file, the analysis program produces approx. 3000 double arrays of length n (approx. 100), each paired with a string that names the content of the respective array. I want to write the results into an HDF5 file, where each array is written into a table whose name is the respective string. For that I use the following function:

#include "hdf5.h"
#include "hdf5_hl.h"
 hid_t       file_id;
 hsize_t     dims[RANK]={1,n};
 herr_t      status;

....

void hdf5_write ( double& array , string arrayname )
{     
 const char * tablename = arrayname.c_str();
 status = H5LTmake_dataset(file_id,tablename,RANK,dims,H5T_NATIVE_DOUBLE,array);
}

This works fine for the first file. However, when I analyse multiple files one after another, the existing tables are simply overwritten by the new arrays, whereas I want the new arrays to be appended to the already existing tables. Is there an HDF5 function for that case?


1 Answer


I'm afraid you can't append using the high level (H5LT) interface.

The low-level interface can do it: create a chunked dataset with an unlimited maximum dimension, then grow it with H5Dset_extent and write the new rows into the extended region via a hyperslab selection. It is much more complex, but it gives you total control.
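Below is a minimal sketch of that approach, assuming each call appends one row of length n to a 2-D dataset. The helper name append_row and the constant NCOLS are made up for illustration; only the H5* calls are real API:

#include "hdf5.h"
#include <string>

const hsize_t NCOLS = 100;            /* n from the question */

void append_row(hid_t file_id, const std::string& name, const double* row)
{
    hid_t dset;
    if (H5Lexists(file_id, name.c_str(), H5P_DEFAULT) <= 0)
    {
        /* First call for this name: create a chunked dataset that can
           grow without limit along dimension 0. */
        hsize_t dims[2]    = {0, NCOLS};
        hsize_t maxdims[2] = {H5S_UNLIMITED, NCOLS};
        hsize_t chunk[2]   = {1, NCOLS};

        hid_t space = H5Screate_simple(2, dims, maxdims);
        hid_t plist = H5Pcreate(H5P_DATASET_CREATE);
        H5Pset_chunk(plist, 2, chunk);

        dset = H5Dcreate2(file_id, name.c_str(), H5T_NATIVE_DOUBLE,
                          space, H5P_DEFAULT, plist, H5P_DEFAULT);
        H5Pclose(plist);
        H5Sclose(space);
    }
    else
    {
        dset = H5Dopen2(file_id, name.c_str(), H5P_DEFAULT);
    }

    /* Query the current extent. */
    hid_t filespace = H5Dget_space(dset);
    hsize_t curdims[2];
    H5Sget_simple_extent_dims(filespace, curdims, NULL);
    H5Sclose(filespace);

    /* Grow by one row. */
    hsize_t newdims[2] = {curdims[0] + 1, NCOLS};
    H5Dset_extent(dset, newdims);

    /* Write the new row into the freshly added slab. */
    hsize_t offset[2] = {curdims[0], 0};
    hsize_t count[2]  = {1, NCOLS};

    filespace = H5Dget_space(dset);
    H5Sselect_hyperslab(filespace, H5S_SELECT_SET, offset, NULL, count, NULL);

    hid_t memspace = H5Screate_simple(2, count, NULL);
    H5Dwrite(dset, H5T_NATIVE_DOUBLE, memspace, filespace, H5P_DEFAULT, row);

    H5Sclose(memspace);
    H5Sclose(filespace);
    H5Dclose(dset);
}

The essential parts are the unlimited maximum dimension and the chunked dataset creation property list; without chunking, H5Dset_extent cannot grow the dataset.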

Or if you think this is overkill, you can ask yourself if you really need a single large dataset vs multiple small ones. Depending on the application you have in mind, multiple datasets might simply be a better design.
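For example, here is a minimal sketch of that layout, assuming one group per analysed input file. The group naming scheme and the helper name hdf5_write_grouped are made up for illustration; this keeps the H5LT one-liner you already have:

#include "hdf5.h"
#include "hdf5_hl.h"
#include <string>

void hdf5_write_grouped(hid_t file_id, int file_index,
                        const std::string& arrayname,
                        const double* array, hsize_t n)
{
    /* One group per analysed file, e.g. "file_000", "file_001", ... */
    std::string groupname = "file_" + std::to_string(file_index);

    hid_t group_id;
    if (H5Lexists(file_id, groupname.c_str(), H5P_DEFAULT) > 0)
        group_id = H5Gopen2(file_id, groupname.c_str(), H5P_DEFAULT);
    else
        group_id = H5Gcreate2(file_id, groupname.c_str(),
                              H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);

    /* Same high-level call as before, just rooted in the group. */
    hsize_t dims[2] = {1, n};
    H5LTmake_dataset(group_id, arrayname.c_str(), 2, dims,
                     H5T_NATIVE_DOUBLE, array);

    H5Gclose(group_id);
}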
