I am processing a number of files; processing each file outputs several thousand one-dimensional arrays of floats, and I want to store the data from all the files in one huge dataset in a single HDF5 file for further processing.
The thing is, I am currently confused about how to append my data into the HDF5 file (see the comment in the code above). In the two for loops above, as you can see, I want to append one 1-dimensional array of floats to the HDF5 file at a time, not the whole thing at once. My data totals terabytes, so we can only append the data into the file.
There are several questions:
- How do I append the data in this case? Which function should I use?
- Right now I have fdim[0] = 928347543. I have tried putting HDF5's unlimited flag there instead, but the program complains at runtime. Is there a way around this? I don't want to recount how much data I have each time; is there a way to simply keep adding data in without caring about the value of fdim? Or is this not possible?
EDIT:
I've been following Simon's suggestion; here is my updated code:
hid_t desFi5;
hid_t fid1;
hid_t propList;
hsize_t fdim[2];
desFi5 = H5Fcreate(saveFilePath, H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
fdim[0] = 3;
fdim[1] = 1;//H5S_UNLIMITED;
fid1 = H5Screate_simple(2, fdim, NULL);
cout << "----------------------------------Space done\n";
propList = H5Pcreate( H5P_DATASET_CREATE);
H5Pset_layout( propList, H5D_CHUNKED );
int ndims = 2;
hsize_t chunk_dims[2];
chunk_dims[0] = 3;
chunk_dims[1] = 1;
H5Pset_chunk( propList, ndims, chunk_dims );
cout << "----------------------------------Property done\n";
hid_t dataset1 = H5Dcreate( desFi5, "des", H5T_NATIVE_FLOAT, fid1, H5P_DEFAULT, propList, H5P_DEFAULT);
cout << "----------------------------------Dataset done\n";
bufi = new float*[1];
bufi[0] = new float[3];
bufi[0][0] = 0;
bufi[0][1] = 1;
bufi[0][2] = 2;
//hyperslab
hsize_t start[2] = {0,0};
hsize_t stride[2] = {1,1};
hsize_t count[2] = {1,1};
hsize_t block[2] = {1,3};
H5Sselect_hyperslab( fid1, H5S_SELECT_OR, start, stride, count, block);
cout << "----------------------------------hyperslab done\n";
H5Dwrite(dataset1, H5T_NATIVE_FLOAT, H5S_ALL, H5S_ALL, H5P_DEFAULT, *bufi);
fdim[0] = 3;
fdim[1] = H5S_UNLIMITED; // COMPLAINS HERE
H5Dset_extent( dataset1, fdim );
cout << "----------------------------------extent done\n";
//hyperslab2
hsize_t start2[2] = {1,0};
hsize_t stride2[2] = {1,1};
hsize_t count2[2] = {1,1};
hsize_t block2[2] = {1,3};
H5Sselect_hyperslab( fid1, H5S_SELECT_OR, start2, stride2, count2, block2);
cout << "----------------------------------hyperslab2 done\n";
H5Dwrite(dataset1, H5T_NATIVE_FLOAT, H5S_ALL, H5S_ALL, H5P_DEFAULT, *bufi);
cout << "----------------------------------H5Dwrite done\n";
H5Dclose(dataset1);
cout << "----------------------------------dataset closed\n";
H5Pclose( propList );
cout << "----------------------------------property list closed\n";
H5Sclose(fid1);
cout << "----------------------------------dataspace fid1 closed\n";
H5Fclose(desFi5);
cout << "----------------------------------desFi5 closed\n";
My current output is:
bash-3.2$ ./hdf5AppendTest.out
----------------------------------Space done
----------------------------------Property done
----------------------------------Dataset done
----------------------------------hyperslab done
HDF5-DIAG: Error detected in HDF5 (1.8.10) thread 0:
#000: /home/hdftest/snapshots-bin-hdf5_1_8_10/current/src/H5D.c line 1103 in H5Dset_extent(): unable to set extend dataset
major: Dataset
minor: Unable to initialize object
#001: /home/hdftest/snapshots-bin-hdf5_1_8_10/current/src/H5Dint.c line 2179 in H5D__set_extent(): unable to modify size of data space
major: Dataset
minor: Unable to initialize object
#002: /home/hdftest/snapshots-bin-hdf5_1_8_10/current/src/H5S.c line 1874 in H5S_set_extent(): dimension cannot exceed the existing maximal size (new: 18446744073709551615 max: 1)
major: Dataspace
minor: Bad value
----------------------------------extent done
----------------------------------hyperslab2 done
----------------------------------H5Dwrite done
----------------------------------dataset closed
----------------------------------property list closed
----------------------------------dataspace fid1 closed
----------------------------------desFi5 closed
Currently, I see that setting the dimension to unlimited with H5Dset_extent still causes a problem at runtime (the offending call is marked with //COMPLAINS HERE in the code above). I already made the dataset chunked as Simon specified, so what is the problem here?
On the other hand, without H5Dset_extent I can write a test array of [0, 1, 2] just fine, but how can I make the code above write the test array to the file repeatedly, like this:
[0, 1, 2]
[0, 1, 2]
[0, 1, 2]
[0, 1, 2]
...
...
Recall: this is just a test array; the real data is much bigger, and I cannot hold the whole thing in RAM, so I must write the data in parts, one piece at a time.
EDIT 2:
I've followed more of Simon's suggestions. Here is the critical part:
hsize_t n = 3, p = 1;
float *bufi_data = new float[n * p];
float ** bufi = new float*[n];
for (hsize_t i = 0; i < n; ++i){
bufi[i] = &bufi_data[i * n];
}
bufi[0][0] = 0.1;
bufi[0][1] = 0.2;
bufi[0][2] = 0.3;
//hyperslab
hsize_t start[2] = {0,0};
hsize_t count[2] = {3,1};
H5Sselect_hyperslab( fid1, H5S_SELECT_SET, start, NULL, count, NULL);
cout << "----------------------------------hyperslab done\n";
H5Dwrite(dataset1, H5T_NATIVE_FLOAT, H5S_ALL, fid1, H5P_DEFAULT, *bufi);
bufi[0][0] = 0.4;
bufi[0][1] = 0.5;
bufi[0][2] = 0.6;
hsize_t fdimNew[2];
fdimNew[0] = 3;
fdimNew[1] = 2;
H5Dset_extent( dataset1, fdimNew );
cout << "----------------------------------extent done\n";
//hyperslab2
hsize_t start2[2] = {0,0}; //PROBLEM
hsize_t count2[2] = {3,1};
H5Sselect_hyperslab( fid1, H5S_SELECT_SET, start2, NULL, count2, NULL);
cout << "----------------------------------hyperslab2 done\n";
H5Dwrite(dataset1, H5T_NATIVE_FLOAT, H5S_ALL, fid1, H5P_DEFAULT, *bufi);
From the above, I got the following output in the HDF5 file:
0.4 0.5 0.6
0 0 0
After further experiments with start2 and count2, I see that these variables only affect the starting index and increment index for bufi. They do not move the writing position in my dataset at all.
Recall: the final result must be:
0.1 0.2 0.3
0.4 0.5 0.6
Also, it must be bufi instead of *bufi for H5Dwrite, Simon, because *bufi gives me completely random numbers.
UPDATE 3:
For the selection part suggested by Simon:
// selection for the first write
hsize_t start[2] = {0, 0};
hsize_t count[2] = {1, 3};
// selection for the second write
hsize_t start[2] = {1, 0};
hsize_t count[2] = {1, 3};
These give the following error:
HDF5-DIAG: Error detected in HDF5 (1.8.10) thread 0:
#000: /home/hdftest/snapshots-bin-hdf5_1_8_10/current/src/H5Dio.c line 245 in H5Dwrite(): file selection+offset not within extent
major: Dataspace
minor: Out of range
I suppose count[2] should be {3,1} rather than {1,3}? And for start[2], if I don't set it to {0,0}, it always throws the error above.
Are you sure this is correct?