One of the easiest ways to solve this is to abstract away the low-level details of handling HDF5 files in Java. You can do this with HDFql (Hierarchical Data Format query language - http://www.hdfql.com).
Assuming that your HDF5 file is named my_file.h5 and it stores a dataset named my_dataset (dimensions 264x264x1024 of uint32), you could read it into a Java array using HDFql as follows:
// import HDFql package (make sure it can be found by the Java compiler/JVM)
import as.hdfql.*;

public class Example
{
    public static void main(String args[])
    {
        int data[][][] = new int[264][264][1024];

        // select (i.e. read) a dataset named "my_dataset" from HDF5 file "my_file.h5" and store it in variable "data"
        HDFql.execute("SELECT FROM my_file.h5 my_dataset INTO MEMORY " + HDFql.variableTransientRegister(data));

        // display content of variable "data"
        for(int x = 0; x < 264; x++)
        {
            for(int y = 0; y < 264; y++)
            {
                for(int z = 0; z < 1024; z++)
                {
                    System.out.println(data[x][y][z]);
                }
            }
        }
    }
}
This example was successfully run using Java 8 and HDFql version 2.1.0. Additionally, please keep in mind that:

Java does not support unsigned datatypes such as uint32, so you will have to perform the conversion yourself;

You may run into an OutOfMemoryError if there is not enough heap space for Java to store an array with these dimensions (to solve this, increase the maximum heap size with the -Xmx parameter when launching Java, e.g. -Xmx2g).
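Regarding the unsigned conversion: since Java's int is signed, a raw uint32 value read from the file will appear negative whenever its high bit is set. A minimal sketch of recovering the unsigned value with the standard Integer.toUnsignedLong method (Java 8+) — the class name and variable here are illustrative, not part of HDFql:

```java
// Sketch: reinterpreting a signed Java int as the uint32 value it stores.
public class UnsignedExample
{
    public static void main(String args[])
    {
        int raw = -1;  // bit pattern 0xFFFFFFFF, i.e. uint32 value 4294967295

        // Integer.toUnsignedLong widens to long without sign extension
        long value = Integer.toUnsignedLong(raw);

        System.out.println(value);  // prints 4294967295
    }
}
```

You would apply this conversion element by element when consuming the array, storing the results in a long (or long[][][]) if you need the full unsigned range.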