I am analysing a large number of files to extract the important statistical information. For every analysed file, the analysis program creates approximately 3000 double arrays of length n (approx. 100), each paired with a string naming the content of the respective array. I want to write the results into an HDF5 file, where each array is written into a table whose name is the respective string. For that I use the following function:
#include <string>
#include "hdf5.h"
#include "hdf5_hl.h"

#define RANK 2

hid_t file_id;
hsize_t dims[RANK] = {1, n};   // n is the array length (approx. 100)
herr_t status;
....

void hdf5_write(const double* array, const std::string& arrayname)
{
    const char* tablename = arrayname.c_str();
    status = H5LTmake_dataset(file_id, tablename, RANK, dims,
                              H5T_NATIVE_DOUBLE, array);
}
This works fine for the first file. However, when analysing multiple files one after another, the existing tables are simply overwritten by the new arrays, whereas I want the new arrays to be appended to the already existing tables. Is there an HDF5 function for that case?