

Using Compound Data Types

HDF-5 Compound Datatypes are records of heterogeneous elements (possibly including arrays and other compound data types). These are essentially like C 'structs', and are stored and retrieved as packed arrays of bytes. The Java language cannot directly represent such structured records, but it can store and retrieve HDF-5 compound datatypes in two ways: as arrays of bytes (which must be interpreted) and by reading individual fields of the records.

Using Compound Datatypes as Bytes

An array of compound datatypes can be defined, written, and read by copying the data to and from appropriately sized arrays of bytes. It is up to the Java application to ensure that the data is copied correctly.

For instance, consider an array of Java objects representing complex numbers, where each complex number has two double values, the real and imaginary components. An array of complex numbers can be stored as an HDF-5 Compound Datatype with two fields of 8 bytes each. Each record is 16 bytes long, and the data can be written as an array of bytes.

byte[] data = new byte[100 * 16];  // An array of 100 'complex' numbers
// ... fill in the array with complex numbers: it is up to
// the Java program to pack the bytes correctly ...
long[] dims = { 100 };             // one dimension of 100 records
int rank = 1;
int dataspace = -1;
int datatype = -1;
int dataset = -1;
try {
    dataspace = H5.H5Screate_simple( rank, dims, null );
    // create a compound datatype with two fields of type double:
    // complex {
    //     double real;      // bytes 0-7
    //     double imaginary; // bytes 8-15
    // }
    datatype = H5.H5Tcreate( HDF5Constants.H5T_COMPOUND, 16 );
    H5.H5Tinsert( datatype, "real", 0,
        HDF5Constants.H5T_NATIVE_DOUBLE );
    H5.H5Tinsert( datatype, "imaginary", 8,
        HDF5Constants.H5T_NATIVE_DOUBLE );
    // create the dataset with this dataspace and compound datatype
    dataset = H5.H5Dcreate( file, datasetName,
        datatype, dataspace, HDF5Constants.H5P_DEFAULT );
    // write the packed bytes to the dataset
    H5.H5Dwrite( dataset, datatype, dataspace,
        HDF5Constants.H5S_ALL, HDF5Constants.H5P_DEFAULT,
        data );
} catch ( HDF5Exception ex ) {
    ex.printStackTrace();
    System.exit(1);
}
In this example, an HDF-5 Compound Datatype is declared, and a dataset of 100 records is created. The dataset is written from the Java array of bytes. Assuming that the array 'data' is correctly packed, this will create the correct data in the HDF-5 file.

The programmer should be aware that the proper layout of the bytes (field offsets and byte order) depends on the platform and C compiler. It is up to the program to construct the records correctly.
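
One way to pack the records is with java.nio.ByteBuffer, which makes the byte order explicit. The following is a minimal sketch, not part of the HDF-5 library: it assumes the 16-byte record layout defined above (a double 'real' at offset 0 and a double 'imaginary' at offset 8), and it uses native byte order to match H5T_NATIVE_DOUBLE.

import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Assumed source data for the sketch: 100 real and 100 imaginary components.
double[] real = new double[100];
double[] imaginary = new double[100];

// Pack the components into the 16-byte-per-record layout used above.
ByteBuffer buf = ByteBuffer.allocate( 100 * 16 );
buf.order( ByteOrder.nativeOrder() );  // match H5T_NATIVE_DOUBLE
for ( int i = 0; i < 100; i++ ) {
    buf.putDouble( real[i] );          // bytes 0-7 of record i
    buf.putDouble( imaginary[i] );     // bytes 8-15 of record i
}
byte[] data = buf.array();             // buffer to pass to H5Dwrite

Using a ByteBuffer rather than hand-computed array offsets keeps the byte order and field offsets in one place, which is where mistakes in this approach usually occur.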

Using Compound Datatypes by Elements

A second approach to accessing Compound Datatypes is to read each field individually. This results in reading an array of a single datatype. In the example above, the dataset could be read by reading the 'real' component into one array and then reading the 'imaginary' component into a second array. The following example illustrates this.



// An array of 100 'real' components
double[] realValues = new double[100];
// An array of 100 'imaginary' components
double[] imaginaryValues = new double[100];
long[] dims = { 100 };   // one dimension of 100 records
int rank = 1;
int dataspace = -1;
int datatype = -1;
int datatype2 = -1;
int dataset = -1;
try {
    dataspace = H5.H5Screate_simple( rank, dims, null );
    // create a compound datatype to read the 'real' field of type double
    datatype = H5.H5Tcreate( HDF5Constants.H5T_COMPOUND, 8 );
    H5.H5Tinsert( datatype, "real", 0, HDF5Constants.H5T_NATIVE_DOUBLE );
    // open the existing dataset
    dataset = H5.H5Dopen( file, datasetName, HDF5Constants.H5P_DEFAULT );
    // read only the 'real' field of each record
    H5.H5Dread( dataset, datatype, dataspace,
        HDF5Constants.H5S_ALL, HDF5Constants.H5P_DEFAULT,
        realValues );
    // create a compound datatype to read the 'imaginary' field of type double
    datatype2 = H5.H5Tcreate( HDF5Constants.H5T_COMPOUND, 8 );
    H5.H5Tinsert( datatype2, "imaginary", 0,
        HDF5Constants.H5T_NATIVE_DOUBLE );
    // read only the 'imaginary' field of each record
    H5.H5Dread( dataset, datatype2, dataspace,
        HDF5Constants.H5S_ALL, HDF5Constants.H5P_DEFAULT,
        imaginaryValues );
} catch ( HDF5Exception ex ) {
    ex.printStackTrace();
    System.exit(1);
}
In this example, all the records are read field by field. The first field, "real", is read into an array of 100 doubles, and the second field, "imaginary", is read into a second array.

In this approach, the HDF-5 library ensures that the correct data is read, as long as the compound datatype describing each field is correct.
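
Once the two field arrays have been read, the application can recombine them into complex-number objects. The Complex class below is hypothetical and shown only for illustration; it is not part of the HDF-5 Java library.

// Hypothetical value class used only for this sketch.
class Complex {
    final double real;
    final double imaginary;
    Complex( double real, double imaginary ) {
        this.real = real;
        this.imaginary = imaginary;
    }
}

// Combine the two field arrays read above into 100 complex numbers.
Complex[] numbers = new Complex[100];
for ( int i = 0; i < 100; i++ ) {
    numbers[i] = new Complex( realValues[i], imaginaryValues[i] );
}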


Last modified: 06 May 2016