Package hdf.object.h4

Class H4SDS

All Implemented Interfaces:
DataFormat, MetaDataContainer, Serializable

public class H4SDS extends ScalarDS implements MetaDataContainer
H4SDS describes HDF4 Scientific Data Sets (SDS) and the operations performed on an SDS. An SDS is a group of data structures used to store and describe multidimensional arrays of scientific data. The data contained in an SDS array has a data type associated with it. The standard data types supported by the SD interface include 32- and 64-bit floating-point numbers, 8-, 16- and 32-bit signed integers, 8-, 16- and 32-bit unsigned integers, and 8-bit characters.

How to Select a Subset

Dataset defines APIs for reading, writing and subsetting a dataset. No function is defined to select a subset of a data array; the selection is done in an implicit way. Function calls for dimension information such as getSelectedDims() return an array of dimension values, which is a reference to the array held in the dataset object. Changes made to the array outside the dataset object directly change the values of the array in the dataset object, much like pointers in C.

The following is an example of how to make a subset. In the example, the dataset is 4-dimensional with dimension sizes [200][100][50][10], i.e. dims[0]=200; dims[1]=100; dims[2]=50; dims[3]=10.
We want to select every other data point in dims[1] and dims[2]
     int rank = dataset.getRank();   // number of dimensions of the dataset
     long[] dims = dataset.getDims(); // the dimension sizes of the dataset
     long[] selected = dataset.getSelectedDims(); // the selected size of the dataset
     long[] start = dataset.getStartDims(); // the offset of the selection
     long[] stride = dataset.getStride(); // the stride of the dataset
     int[]  selectedIndex = dataset.getSelectedIndex(); // the selected dimensions for display

     // select dim1 and dim2 as 2D data for display, and slice through dim0
     selectedIndex[0] = 1;
     selectedIndex[1] = 2;
     selectedIndex[2] = 0;

     // reset the selection arrays
     for (int i=0; i<rank; i++) {
         start[i] = 0;
         selected[i] = 1;
         stride[i] = 1;
     }

     // set stride to 2 on dim1 and dim2 so that every other data point is selected.
     stride[1] = 2;
     stride[2] = 2;

     // set the selection size of dim1 and dim2
     selected[1] = dims[1]/stride[1];
     selected[2] = dims[2]/stride[2];

     // when dataset.read() is called, the selection above will be used since
     // the dimension arrays are passed by reference. Changes of these arrays
     // outside the dataset object directly change the values of these arrays
     // in the dataset object.

 
Version:
1.1 9/4/2007
Author:
Peter X. Cao
See Also:
  • Field Details

    • DFTAG_NDG_NETCDF

      public static final int DFTAG_NDG_NETCDF
      Tag for netCDF datasets. The HDF4 library supports netCDF version 2.3.2, but only through the SDS APIs.
      See Also:
  • Constructor Details

    • H4SDS

      public H4SDS(FileFormat theFile, String name, String path)
      Creates an H4SDS object with specific name and path.
      Parameters:
      theFile - the HDF file.
      name - the name of this H4SDS.
      path - the full path of this H4SDS.
    • H4SDS

      public H4SDS(FileFormat theFile, String name, String path, long[] oid)
      Creates an H4SDS object with specific name, path and oid.
      Parameters:
      theFile - the HDF file.
      name - the name of this H4SDS.
      path - the full path of this H4SDS.
      oid - the unique identifier of this data object.
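      These constructors are normally invoked by the library itself while a file is being opened; application code usually obtains an H4SDS from the file instead. A minimal sketch, assuming "h4file" is an H4File that has already been opened; the dataset path is illustrative only:

          // retrieve an existing SDS by its full path
          H4SDS sds = (H4SDS) h4file.get("/Data/Temperature");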
  • Method Details

    • hasAttribute

      public boolean hasAttribute()
      Description copied from interface: MetaDataContainer
      Check if the object has any attributes attached.
      Specified by:
      hasAttribute in interface MetaDataContainer
      Returns:
      true if it has any attributes, false otherwise.
    • getDatatype

      public Datatype getDatatype()
      Returns the datatype of the data object.
      Specified by:
      getDatatype in interface DataFormat
      Overrides:
      getDatatype in class Dataset
      Returns:
      the datatype of the data object.
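      A minimal usage sketch, assuming "sds" is an initialized H4SDS:

          // inspect the datatype of the SDS
          Datatype dtype = sds.getDatatype();
          System.out.println(dtype.getDescription()); // e.g. "32-bit integer"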
    • copy

      public Dataset copy(Group pgroup, String dname, long[] dims, Object buff) throws Exception
      Description copied from class: Dataset
      Creates a new dataset and writes the data buffer to the new dataset. This function allows applications to create a new dataset for a given data buffer. For example, users can select a specific interesting part from a large image and create a new image with the selection. The new dataset retains the datatype and dataset creation properties of this dataset.
      Specified by:
      copy in class Dataset
      Parameters:
      pgroup - the group which the dataset is copied to.
      dname - the name of the new dataset.
      dims - the dimension sizes of the new dataset.
      buff - the data values of the subset to be copied.
      Returns:
      the new dataset.
      Throws:
      Exception - if dataset can not be copied
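      A minimal sketch, assuming "sds" is an initialized H4SDS and "targetGroup" is a Group in the same file:

          // read the current selection and copy it into a new dataset
          Object buff = sds.read();
          long[] newDims = sds.getSelectedDims().clone();
          Dataset copied = sds.copy(targetGroup, "sds_copy", newDims, buff);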
    • readBytes

      public byte[] readBytes() throws hdf.hdflib.HDFException
      Description copied from class: Dataset
      Reads the raw data of the dataset from file to a byte array. readBytes() reads raw data to an array of bytes instead of an array of its datatype. For example, for a one-dimensional 32-bit integer dataset of size 5, readBytes() returns a byte array of size 20 instead of an int array of 5. readBytes() can be used to copy data from one dataset to another efficiently because the raw data is not converted to its native type, which saves memory space and CPU time.
      Specified by:
      readBytes in class Dataset
      Returns:
      the byte array of the raw data.
      Throws:
      hdf.hdflib.HDFException
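      A minimal sketch, assuming "sds" is an initialized H4SDS:

          // read the raw bytes of the current selection; for a one-dimensional
          // 32-bit integer SDS with 5 selected elements this returns 20 bytes
          byte[] raw = sds.readBytes();
          System.out.println("raw byte count: " + raw.length);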
    • read

      public Object read() throws hdf.hdflib.HDFException, OutOfMemoryError
      Reads the data from file. read() reads the data from file to a memory buffer and returns the memory buffer. The dataset object does not hold the memory buffer. To store the memory buffer in the dataset object, one must call getData(). By default, the whole dataset is read into memory. Users can also select a subset to read. Subsetting is done in an implicit way.
      Specified by:
      read in interface DataFormat
      Returns:
      the data read from file.
      Throws:
      hdf.hdflib.HDFException - if object can not be read
      OutOfMemoryError - if memory is exhausted
      See Also:
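      A minimal sketch of an implicit subset read, assuming "sds" is an H4SDS obtained from an open H4File and that dim0 has more than 10 planes:

          // set up the selection arrays, then read one plane along dim0
          sds.init();                                // populates rank, dims and the selection arrays
          long[] start    = sds.getStartDims();
          long[] selected = sds.getSelectedDims();
          start[0]    = 10;                          // skip the first 10 planes of dim0
          selected[0] = 1;                           // read a single plane
          Object data = sds.read();                  // returns a 1D Java array of the SDS datatype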
    • write

      public void write(Object buf) throws hdf.hdflib.HDFException
      Writes a memory buffer to the object in the file.
      Specified by:
      write in interface DataFormat
      Parameters:
      buf - the data to write
      Throws:
      hdf.hdflib.HDFException - if data can not be written
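      A minimal sketch, assuming "sds" is an initialized 32-bit integer H4SDS from a file opened for writing:

          // read the current selection, scale the values and write them back
          Object data = sds.read();
          int[] values = (int[]) data;               // assumes a 32-bit integer SDS
          for (int i = 0; i < values.length; i++)
              values[i] *= 2;
          sds.write(values);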
    • getMetadata

      public List getMetadata() throws hdf.hdflib.HDFException
      Retrieves the object's metadata, such as attributes, from the file. The metadata is returned as a List.
      Specified by:
      getMetadata in interface MetaDataContainer
      Returns:
      the list of metadata objects.
      Throws:
      hdf.hdflib.HDFException - if the metadata can not be retrieved
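      A minimal sketch, assuming "sds" is an initialized H4SDS:

          // list the attributes attached to this SDS
          List metadata = sds.getMetadata();
          for (Object info : metadata)
              System.out.println(info);              // each entry describes one attribute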
    • writeMetadata

      public void writeMetadata(Object info) throws Exception
      Writes a specific piece of metadata (such as an attribute) into the file. If an HDF(4 or 5) attribute exists in the file, this method updates its value. If the attribute does not exist in the file, it creates the attribute in the file and attaches it to the object. It will fail to write a new attribute to the object if an attribute with the same name already exists. To update the value of an existing attribute in the file, first get the attribute instance with getMetadata(), change its values, and then call writeMetadata() to write the value back.
      Specified by:
      writeMetadata in interface MetaDataContainer
      Parameters:
      info - the metadata to write.
      Throws:
      Exception - if the metadata can not be written
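      A hedged sketch of the update pattern described above. It assumes the entries returned by getMetadata() are hdf.object.Attribute objects and that getName()/setValue() style accessors are available in the library version in use; both are assumptions, not guaranteed by this class:

          // update the value of an existing attribute and write it back
          List metadata = sds.getMetadata();
          for (Object info : metadata) {
              Attribute attr = (Attribute) info;          // assumed attribute class
              if ("units".equals(attr.getName())) {       // assumed accessor
                  attr.setValue(new String[] {"kelvin"}); // assumed accessor
                  sds.writeMetadata(attr);
              }
          }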
    • removeMetadata

      public void removeMetadata(Object info) throws hdf.hdflib.HDFException
      Deletes an existing piece of metadata from this object.
      Specified by:
      removeMetadata in interface MetaDataContainer
      Parameters:
      info - the metadata to delete.
      Throws:
      hdf.hdflib.HDFException - if the metadata can not be removed
    • updateMetadata

      public void updateMetadata(Object info) throws Exception
      Updates an existing piece of metadata attached to this object.
      Specified by:
      updateMetadata in interface MetaDataContainer
      Parameters:
      info - the metadata to update.
      Throws:
      Exception - if the metadata can not be updated
    • open

      public long open()
      Description copied from class: HObject
      Opens an existing object such as a dataset or group for access. The return value is an object identifier obtained by implementing classes such as H5.H5Dopen(). This function is needed to allow other objects to access the object. For instance, the H5File class uses the open() function to obtain an object identifier for copyAttributes(long src_id, long dst_id) and other purposes. The open() function should be used in pair with the close(long) function.
      Specified by:
      open in class HObject
      Returns:
      the object identifier if successful; otherwise returns a negative value.
      See Also:
    • close

      public void close(long id)
      Description copied from class: HObject
      Closes access to the object. Subclasses must implement this method because different data objects have their own ways of closing their data resources. For example, H5Group.close() calls the hdf.hdf5lib.H5.H5Gclose() method and closes the group resource specified by the group id.
      Specified by:
      close in class HObject
      Parameters:
      id - The object identifier.
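      A minimal sketch showing open() paired with close(long), assuming "sds" is an H4SDS from an open file:

          // obtain the SDS identifier, use it, and always release it again
          long sdsId = sds.open();
          if (sdsId >= 0) {
              try {
                  // ... pass the identifier to code that needs direct access ...
              }
              finally {
                  sds.close(sdsId);
              }
          }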
    • init

      public void init()
      Initializes the H4SDS, such as the dimension sizes of this dataset.
      Specified by:
      init in interface DataFormat
    • create

      public static H4SDS create(String name, Group pgroup, Datatype type, long[] dims, long[] maxdims, long[] chunks, int gzip, Object fillValue, Object data) throws Exception
      Creates a new dataset.
      Parameters:
      name - the name of the dataset to create.
      pgroup - the parent group of the new dataset.
      type - the datatype of the dataset.
      dims - the dimension size of the dataset.
      maxdims - the max dimension size of the dataset.
      chunks - the chunk size of the dataset.
      gzip - the level of the gzip compression.
      fillValue - the default value.
      data - the array of data values.
      Returns:
      the new dataset if successful. Otherwise returns null.
      Throws:
      Exception - if the dataset can not be created
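      A hedged sketch of creating a new SDS, assuming "h4file" is an open, writable H4File; the way the root group and the Datatype are obtained can differ between library versions, and the null/fill-value arguments below are assumptions rather than documented defaults:

          // create a 2D 32-bit integer SDS under the root group
          Group root = (Group) h4file.get("/");
          Datatype dtype = h4file.createDatatype(Datatype.CLASS_INTEGER, 4, Datatype.NATIVE, Datatype.SIGN_2);
          long[] dims = {100, 50};
          int[] data = new int[100 * 50];              // values written at creation time
          H4SDS sds = H4SDS.create("NewSDS", root, dtype, dims,
                                   null,               // maxdims: assumed to default to dims when null
                                   null,               // chunks: null means contiguous layout
                                   0,                  // gzip level 0: no compression
                                   new int[] {0},      // fill value (assumed one-element array form)
                                   data);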
    • create

      public static H4SDS create(String name, Group pgroup, Datatype type, long[] dims, long[] maxdims, long[] chunks, int gzip, Object data) throws Exception
      Creates a new dataset.
      Parameters:
      name - the name of the dataset to create.
      pgroup - the parent group of the new dataset.
      type - the datatype of the dataset.
      dims - the dimension size of the dataset.
      maxdims - the max dimension size of the dataset.
      chunks - the chunk size of the dataset.
      gzip - the level of the gzip compression.
      data - the array of data values.
      Returns:
      the new dataset if successful. Otherwise returns null.
      Throws:
      Exception - if the dataset can not be created
    • getMetadata

      public List getMetadata(int... attrPropList) throws Exception
      Retrieves the object's metadata, such as attributes, from the file. The metadata is returned as a List.
      Parameters:
      attrPropList - the list of properties to get
      Returns:
      the list of metadata objects.
      Throws:
      Exception - if the metadata can not be retrieved