- All Implemented Interfaces:
DataFormat, MetaDataContainer, Serializable
public class H4SDS extends ScalarDS
The data contained in an SDS array has a data type associated with it. The standard data types supported by the SD interface include 32- and 64-bit floating-point numbers, 8-, 16- and 32-bit signed integers, 8-, 16- and 32-bit unsigned integers, and 8-bit characters.
How to Select a Subset
Dataset defines APIs for reading, writing and subsetting a dataset. No function is defined to select a subset of a data array; the selection is done implicitly. Function calls for dimension information, such as getSelectedDims(), return an array of dimension values that is a reference to the array held inside the dataset object. Changes made to that array outside the dataset object directly change the values inside the dataset object, much like pointers in C.
The following is an example of how to make a subset. In the example, the dataset
is four-dimensional with size [200][100][50][10], i.e.
dims[0]=200; dims[1]=100; dims[2]=50; dims[3]=10;
We want to select every other data point in dims[1] and dims[2]:
    int rank = dataset.getRank();                      // number of dimensions of the dataset
    long[] dims = dataset.getDims();                   // the dimension sizes of the dataset
    long[] selected = dataset.getSelectedDims();       // the selected size of the dataset
    long[] start = dataset.getStartDims();             // the offset of the selection
    long[] stride = dataset.getStride();               // the stride of the dataset
    int[] selectedIndex = dataset.getSelectedIndex();  // the selected dimensions for display

    // select dim1 and dim2 as 2D data for display, and slice through dim0
    selectedIndex[0] = 1;
    selectedIndex[1] = 2;
    selectedIndex[2] = 0;

    // reset the selection arrays
    for (int i = 0; i < rank; i++) {
        start[i] = 0;
        selected[i] = 1;
        stride[i] = 1;
    }

    // set stride to 2 on dim1 and dim2 so that every other data point is selected
    stride[1] = 2;
    stride[2] = 2;

    // set the selection size of dim1 and dim2
    selected[1] = dims[1] / stride[1];
    selected[2] = dims[2] / stride[2];

    // when dataset.read() is called, the selection above will be used since
    // the dimension arrays are passed by reference. Changes to these arrays
    // outside the dataset object directly change the values of these arrays
    // in the dataset object.
- Version:
- 1.1 9/4/2007
- Author:
- Peter X. Cao
- See Also:
- Serialized Form
-
Field Summary
Fields
static int DFTAG_NDG_NETCDF - tag for netCDF datasets.
Fields inherited from class hdf.object.ScalarDS
fillValue, imageDataRange, interlace, INTERLACE_LINE, INTERLACE_PIXEL, INTERLACE_PLANE, isDefaultImageOrder, isFillValueConverted, isImage, isImageDisplay, isText, isTrueColor, palette, unsignedConverted
Fields inherited from class hdf.object.Dataset
chunkSize, compression, COMPRESSION_GZIP_TXT, convertByteToString, convertedBuf, data, datatype, dimNames, dims, filters, inited, isDataLoaded, maxDims, nPoints, originalBuf, rank, selectedDims, selectedIndex, selectedStride, startDims, storage, storageLayout
Fields inherited from class hdf.object.HObject
fileFormat, linkTargetObjName, oid, SEPARATOR
-
Constructor Summary
Constructors
H4SDS(FileFormat theFile, String name, String path)
H4SDS(FileFormat theFile, String name, String path, long[] oid)
Creates an H4SDS object with specific name and path.
-
Method Summary
void close(long id) - Closes access to the object.
Dataset copy(Group pgroup, String dname, long[] dims, Object buff) - Creates a new dataset and writes the data buffer to the new dataset.
static H4SDS create(String name, Group pgroup, Datatype type, long[] dims, long[] maxdims, long[] chunks, int gzip, Object data)
static H4SDS create(String name, Group pgroup, Datatype type, long[] dims, long[] maxdims, long[] chunks, int gzip, Object fillValue, Object data) - Creates a new dataset.
Datatype getDatatype() - Returns the datatype of the data object.
List getMetadata() - Retrieves the object's metadata, such as attributes, from the file.
List getMetadata(int... attrPropList)
byte[][] getPalette() - Returns the palette of this scalar dataset or null if palette does not exist.
byte[] getPaletteRefs() - Returns the byte array of palette refs.
boolean hasAttribute() - Check if the object has any attributes attached.
void init() - Initializes the H4SDS such as dimension size of this dataset.
long open() - Opens an existing object such as a dataset or group for access.
Object read() - Reads the data from file.
byte[] readBytes() - Reads the raw data of the dataset from file to a byte array.
byte[][] readPalette(int idx) - Reads a specific image palette from file.
void removeMetadata(Object info) - Deletes an existing piece of metadata from this object.
void updateMetadata(Object info) - Updates an existing piece of metadata attached to this object.
void write(Object buf) - Writes a memory buffer to the object in the file.
void writeMetadata(Object info) - Writes a specific piece of metadata (such as an attribute) into the file.
Methods inherited from class hdf.object.ScalarDS
addFilteredImageValue, clearData, convertFromUnsignedC, convertToUnsignedC, getFillValue, getFilteredImageValues, getImageDataRange, getInterlace, getPaletteName, isDefaultImageOrder, isImage, isImageDisplay, isTrueColor, setImageDataRange, setIsImage, setIsImageDisplay, setPalette
Methods inherited from class hdf.object.Dataset
byteToString, clear, convertFromUnsignedC, convertFromUnsignedC, convertToUnsignedC, convertToUnsignedC, getChunkSize, getCompression, getConvertByteToString, getData, getDimNames, getDims, getFilters, getHeight, getMaxDims, getOriginalClass, getRank, getSelectedDims, getSelectedIndex, getSize, getStartDims, getStorage, getStorageLayout, getStride, getVirtualFilename, getVirtualMaps, getWidth, isInited, isString, isVirtual, setConvertByteToString, setData, stringToByte, write
Methods inherited from class hdf.object.HObject
createFullname, debug, equals, equals, equalsOID, getFID, getFile, getFileFormat, getFullName, getLinkTargetObjName, getName, getOID, getPath, hashCode, setFullname, setLinkTargetObjName, setName, setPath, toString
Methods inherited from class java.lang.Object
clone, finalize, getClass, notify, notifyAll, wait, wait, wait
-
Field Details
-
DFTAG_NDG_NETCDF
Tag for netCDF datasets. The HDF4 library supports netCDF version 2.3.2; it only supports SDS APIs.
- See Also:
- Constant Field Values
-
-
Constructor Details
-
H4SDS
-
H4SDS
Creates an H4SDS object with specific name and path.
- Parameters:
theFile - the HDF file.
name - the name of this H4SDS.
path - the full path of this H4SDS.
oid - the unique identifier of this data object.
-
-
Method Details
-
hasAttribute
Description copied from interface: MetaDataContainer
Check if the object has any attributes attached.
- Returns:
- true if it has any attributes, false otherwise.
-
readPalette
Description copied from class: ScalarDS
Reads a specific image palette from file. A scalar dataset may have multiple palettes attached to it; readPalette(int idx) returns a specific palette identified by its index.
- Specified by:
readPalette
in class ScalarDS
- Parameters:
idx - the index of the palette to read.
- Returns:
- the image palette
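For example, a minimal sketch of reading the second attached palette; the variable sds stands for an already-initialized H4SDS, and the byte[3][256] layout is assumed from the description of getPalette() below:

    byte[][] pal = sds.readPalette(1);        // palettes are indexed from 0
    if (pal != null) {
        // pal[0][], pal[1][] and pal[2][] hold the red, green and blue components
        int red = pal[0][0] & 0xFF;           // first entry's red component as an unsigned value
    }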
-
getPaletteRefs
Description copied from class: ScalarDS
Returns the byte array of palette refs. A palette reference is an object reference that points to the palette dataset.
For example, Dataset "Iceberg" has an attribute of object reference "Palette". The attribute "Palette" has value "2538", which is the object reference of the palette dataset "Iceberg Palette".
- Specified by:
getPaletteRefs
in class ScalarDS
- Returns:
- null if there is no palette attribute attached to this dataset.
-
getDatatype
Description copied from interface: DataFormat
Returns the datatype of the data object.
- Specified by:
getDatatype
in interface DataFormat
- Overrides:
getDatatype
in class Dataset
- Returns:
- the datatype of the data object.
-
copy
Description copied from class: Dataset
Creates a new dataset and writes the data buffer to the new dataset. This function allows applications to create a new dataset for a given data buffer. For example, users can select an interesting part of a large image and create a new image from the selection.
The new dataset retains the datatype and dataset creation properties of this dataset.
- Specified by:
copy
in class Dataset
- Parameters:
pgroup - the group to which the dataset is copied.
dname - the name of the new dataset.
dims - the dimension sizes of the new dataset.
buff - the data values of the subset to be copied.
- Returns:
- the new dataset.
- Throws:
Exception
- if dataset can not be copied
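As an illustration, a hedged sketch of copying the current in-memory selection into a new dataset; sds and pgroup (the target group) are assumptions of this sketch, and dims2 is assumed to match the selection that produced buff:

    long[] dims2 = {100, 50};                          // dimension sizes of the copied data
    Object buff = sds.read();                          // data of the current selection
    Dataset subsetCopy = sds.copy(pgroup, "subset_copy", dims2, buff);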
-
readBytes
Description copied from class: Dataset
Reads the raw data of the dataset from file to a byte array. readBytes() reads raw data to an array of bytes instead of an array of its datatype. For example, for a one-dimensional 32-bit integer dataset of size 5, readBytes() returns a byte array of size 20 instead of an int array of size 5.
readBytes() can be used to copy data from one dataset to another efficiently: because the raw data is not converted to its native type, it saves memory space and CPU time.
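A minimal usage sketch (sds is assumed to be an initialized H4SDS):

    byte[] raw = sds.readBytes();     // raw bytes, not converted to the native datatype
    // for a one-dimensional 32-bit integer dataset of size 5, raw.length is 20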
-
read
Description copied from interface: DataFormat
Reads the data from file. read() reads the data from file to a memory buffer and returns the memory buffer. The dataset object does not hold the memory buffer. To store the memory buffer in the dataset object, one must call getData().
By default, the whole dataset is read into memory. Users can also select a subset to read. Subsetting is done in an implicit way.
- Returns:
- the data read from file.
- Throws:
OutOfMemoryError - if memory is exhausted
hdf.hdflib.HDFException
- See Also:
DataFormat.getData()
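For example, a short sketch, assuming sds is an H4SDS whose selection has been set up as described in the class overview:

    sds.init();                    // establish rank, dims and the default (full) selection
    Object buf = sds.read();       // e.g. an int[] or float[], depending on the SDS datatype
    // the dataset object does not keep this buffer; hold the reference or use getData()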
-
write
Description copied from interface: DataFormat
Writes a memory buffer to the object in the file.
- Parameters:
buf - the data to write
- Throws:
hdf.hdflib.HDFException
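For example, a hedged read-modify-write sketch, assuming the dataset holds 32-bit integers (the cast would differ for other datatypes):

    int[] values = (int[]) sds.read();   // read the current selection into memory
    values[0] = 42;                      // modify the buffer
    sds.write(values);                   // write the buffer back to the object in the file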
-
getMetadata
Description copied from interface: MetaDataContainer
Retrieves the object's metadata, such as attributes, from the file. Metadata, such as attributes, is stored in a List.
- Returns:
- the list of metadata objects.
- Throws:
hdf.hdflib.HDFException
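For example, a minimal sketch of listing attribute names; the list elements are assumed to be hdf.object.Attribute instances, which is how the HDF object model stores attributes:

    List metadata = sds.getMetadata();
    for (Object info : metadata) {
        hdf.object.Attribute attr = (hdf.object.Attribute) info;
        System.out.println(attr.getName());
    }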
-
writeMetadata
Description copied from interface: MetaDataContainer
Writes a specific piece of metadata (such as an attribute) into the file. If an HDF(4&5) attribute exists in the file, this method updates its value. If the attribute does not exist in the file, it creates the attribute in the file and attaches it to the object. It will fail to write a new attribute to the object where an attribute with the same name already exists. To update the value of an existing attribute in the file, one needs to get the instance of the attribute by getMetadata(), change its values, then use writeMetadata() to write the value.
- Parameters:
info - the metadata to write.
- Throws:
Exception
- if the metadata can not be written
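Following the update flow described above, a hedged sketch; the exact way an attribute's value is changed in memory depends on the Attribute class version and is left as a comment:

    List metadata = sds.getMetadata();                 // get the existing attributes
    Object attr = metadata.get(0);                     // pick the attribute to update
    // ... change the attribute's value in memory ...
    sds.writeMetadata(attr);                           // write the updated value back to the file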
-
removeMetadata
Description copied from interface: MetaDataContainer
Deletes an existing piece of metadata from this object.
- Parameters:
info - the metadata to delete.
- Throws:
hdf.hdflib.HDFException
-
updateMetadata
Description copied from interface: MetaDataContainer
Updates an existing piece of metadata attached to this object.
- Parameters:
info - the metadata to update.
- Throws:
Exception
- if the metadata can not be updated
-
open
Description copied from class: HObject
Opens an existing object such as a dataset or group for access. The return value is an object identifier obtained by implementing classes such as H5.H5Dopen(). This function is needed to allow other objects to access the object. For instance, the H5File class uses the open() function to obtain an object identifier for copyAttributes(long src_id, long dst_id) and other purposes. The open() function should be used in pair with the close(long) function.
- Specified by:
open
in class HObject
- Returns:
- the object identifier if successful; otherwise returns a negative value.
- See Also:
HObject.close(long)
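For example, a minimal sketch of the open()/close() pairing:

    long id = sds.open();            // obtain an object identifier
    if (id >= 0) {
        try {
            // ... work with the low-level identifier ...
        } finally {
            sds.close(id);           // always release the identifier
        }
    }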
-
close
Description copied from class: HObject
Closes access to the object. Sub-classes must implement this interface because different data objects have their own ways of closing their data resources.
For example, H5Group.close() calls the hdf.hdf5lib.H5.H5Gclose() method and closes the group resource specified by the group id.
-
init
Initializes the H4SDS, such as the dimension sizes of this dataset.
-
getPalette
Description copied from class: ScalarDS
Returns the palette of this scalar dataset or null if no palette exists. A scalar dataset can be displayed as spreadsheet data or as an image. When a scalar dataset is displayed as an image, the palette or color table may be needed to translate a pixel value to color components (for example, red, green, and blue). Some scalar datasets have no palette and some have one or more palettes. If an associated palette exists but is not loaded, this interface retrieves the palette from the file and returns it. If the palette is already loaded, it returns the loaded palette. It returns null if there is no palette associated with the dataset.
The current implementation only supports the palette model of indexed RGB with 256 colors. Other models, such as "YUV", "CMY", "CMYK", "YCbCr", and "HSV", will be supported in the future.
The palette values are stored in a two-dimensional byte array and are arranged by color components of red, green and blue: palette[][] = byte[3][256], where palette[0][], palette[1][] and palette[2][] are the red, green and blue components respectively.
Sub-classes have to implement this interface. HDF4 and HDF5 images use different libraries to retrieve the associated palette.
- Specified by:
getPalette
in class ScalarDS
- Returns:
- the 2D palette byte array.
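As an illustration, a sketch of reading the color components of one palette entry, assuming an indexed RGB palette as described above:

    byte[][] palette = sds.getPalette();    // byte[3][256], or null if no palette exists
    if (palette != null) {
        int i = 10;                         // a hypothetical palette index
        int r = palette[0][i] & 0xFF;       // red component as an unsigned value
        int g = palette[1][i] & 0xFF;       // green component
        int b = palette[2][i] & 0xFF;       // blue component
    }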
-
create
public static H4SDS create(String name, Group pgroup, Datatype type, long[] dims, long[] maxdims, long[] chunks, int gzip, Object fillValue, Object data) throws Exception
Creates a new dataset.
- Parameters:
name - the name of the dataset to create.
pgroup - the parent group of the new dataset.
type - the datatype of the dataset.
dims - the dimension size of the dataset.
maxdims - the max dimension size of the dataset.
chunks - the chunk size of the dataset.
gzip - the level of the gzip compression.
fillValue - the default value.
data - the array of data values.
- Returns:
- the new dataset if successful. Otherwise returns null.
- Throws:
Exception
- if the dataset can not be created
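For example, a hedged sketch of creating a 2D 32-bit integer SDS. The H4Datatype constructor arguments, the use of null for maxdims and chunks (taken here to mean "same as dims" and "no chunking"), and the variable pgroup are assumptions of this sketch and may differ between hdf.object versions:

    // assumes: hdf.object.h4.H4Datatype is available and pgroup is a Group in an open HDF4 file
    Datatype dtype = new H4Datatype(Datatype.CLASS_INTEGER, 4, Datatype.NATIVE, Datatype.NATIVE);
    long[] dims = {20, 10};              // a 20 x 10 dataset
    int[] data = new int[20 * 10];       // values written at creation time
    H4SDS sds = H4SDS.create("ints_2D", pgroup, dtype, dims, null, null, 0, null, data);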
-
create
public static H4SDS create(String name, Group pgroup, Datatype type, long[] dims, long[] maxdims, long[] chunks, int gzip, Object data) throws Exception
- Throws:
Exception
-
getMetadata
- Throws:
Exception
-