Compiling HDF-JAVA Source on Unix

Source Files

After you download and extract the source package, you will have the HDF Java native C code and the Java source code for the JNI wrappers, the HDF-Object package, and HDFView. You need to compile the C source and the Java source separately. The following instructions describe how to build both on Unix with the supplied configure script and make files.

HDF Java Product Source

    Source directory               Description
    hdf-java/native/hdflib/        C header and source files for the HDF4 Java Native Interface
    hdf-java/native/hdf5lib/       C header and source files for the HDF5 Java Native Interface
    hdf-java/ncsa/hdf/hdflib/      Java source files for the HDF4 Java Native Interface
    hdf-java/ncsa/hdf/hdf5lib/     Java source files for the HDF5 Java Native Interface
    hdf-java/ncsa/hdf/object/      Java source files for the HDF-Object package
    hdf-java/ncsa/hdf/view/        Java source files for HDFView


To compile the HDF Java products you need the HDF4 and/or HDF5 libraries, plus the libraries they depend on (JPEG and GZIP/zlib for HDF4, and optionally SZIP); the paths to these are supplied to configure as described below.

The HDF Java products source includes configure and make files to build the Java products. The source code includes the Java code for the HDF native interfaces to HDF4 and HDF5, the ncsa.hdf.object package, the ncsa.hdf.io packages, and the HDFView packages. HDF4 and HDF5 can each be selected or deselected. The source does not include the HDF4 or HDF5 libraries or any dependent libraries.

To build from source requires a C compiler and a Java compiler, JDK 1.5.x or above. The autoconf and make files work with GNU tools and gmake.
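Before running configure, it can save time to confirm that the needed tools are actually on the PATH. The helper below is a sketch, not part of the distribution; the tool names (cc, javac, gmake) are the common defaults and may differ locally.

```shell
# check_tools: report which of the given commands are not on PATH.
# (Helper name is illustrative, not part of the HDF Java distribution.)
check_tools() {
  missing=""
  for tool in "$@"; do
    command -v "$tool" >/dev/null 2>&1 || missing="$missing $tool"
  done
  if [ -n "$missing" ]; then
    echo "missing:$missing"
    return 1
  fi
  echo "all tools found"
}

# Typical pre-flight check before running configure:
check_tools cc javac gmake || echo "install the missing tools first"
```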

Configure script

The first step is to run the configure script. The configure script checks the dependent libraries and other requirements, and generates Makefiles.

The C code for the native interface requires the static HDF libraries and the external filters for HDF. It is difficult to autodetect these libraries, so configure requires that the paths be specified. The configure script checks and reports any missing libraries.

The source distribution includes the file "runconfig_example.sh", which shows an example of how to set the paths and call configure (Figure 1). In many cases, this script can be edited to fill in the paths and then used to run configure.


## This is an example script to set the configure parameters for
## the HDF Java products.
## The paths need to be set according to the local configuration
## May need to adjust the arguments to configure

## IMPORTANT NOTE: The make files require 'gmake'
## Be sure to 'setenv MAKE gmake' if necessary

INSTDIR= # FILL IN where to install the hdfview.

JAVAINC= # FILL IN path to java includes (jni.h, etc.)
JAVALIB= # FILL IN path to java lib (the rt.jar, etc.)

HDF5= # path to HDF5 installation, e.g., /usr/local/hdf5-1.6.2
HDF4= # path to HDF4 installation (if used)
HDF45= # path to HDF4 to HDF5 installation (if used)

## Autoconf detects shared libraries, but we need static versions
## must set these paths for external libraries needed for HDF libraries.

# JPEG is required by HDF4. If HDF4 is used, _must_ set JPEG
JPEG= # FILL IN path to JPEG installation (the path to libjpeg.a is needed)

# GZIP is required by HDF4 and optional for HDF5.
# If HDF4 is used, _must_ set GZIP
# If HDF5 is used and zlib is used, _must_ set GZIP
GZIP= # FILL IN path to GZIP installation (the path to libz.a is needed)

# SZIP is optional for HDF4 and HDF5.
# If szip is used in one or both HDF libraries, _must_ set SZIP
SZIP= # FILL IN path to SZIP installation (the path to libsz.a is needed)


./configure --prefix=$INSTDIR --with-jdk=$JAVAINC,$JAVALIB \
--with-hdf5=$HDF5/include,$HDF5/lib \
--with-hdf4=$HDF4/include,$HDF4/lib \
--with-libsz=$SZIP/include,$SZIP/lib \
--with-libz=$GZIP/include,$GZIP/lib

# other options
# --without-hdf4 -- omit HDF4
# --without-hdf5 -- omit HDF5
# --without-libsz -- omit SZIP
# --without-libz -- omit GZIP

# Some options required only for Mac OS X
# --build=powerpc-apple
# --with-jdkclasses= # path to classes if not in 'jdk/lib'
# --with-javabin= # path to java bin, if not in 'jdk/bin'

Figure 1.
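Since configure needs the static library files named in the comments above (libjpeg.a, libz.a, libsz.a), a quick existence check before running it can catch path mistakes early. The helper below is hypothetical, not part of the distribution.

```shell
# Hypothetical helper (not part of the distribution): verify that a
# static library configure will need actually exists at the given path.
check_static_lib() {
  if [ -f "$1" ]; then
    echo "found: $1"
  else
    echo "MISSING: $1"
  fi
}

# Example usage with the variables from Figure 1:
check_static_lib "${GZIP:-/usr/local}/lib/libz.a"
```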


After configure runs successfully, the source is built with 'make'.

By default, make builds all the Java classes and all the C code for the HDF4 and/or HDF5 libraries, if selected. The Java classes are organized into six jar files:

The Java interface to the HDF4 library (requires libjhdf.so)
The Java interface to the HDF5 library (requires libjhdf5.so)
The generic data object package (interfaces), ncsa.hdf.object
The implementation of the object package for HDF4 (requires jhdf.jar)
The implementation of the object package for HDF5 (requires jhdf5.jar)
The Java GUI, which requires all of the jars and C libraries

The C code is organized into two libraries:

The JNI wrapper for HDF4, plus the HDF4 library (implements native calls for jhdf.jar)
The JNI wrapper for HDF5, plus the HDF5 library (implements native calls for jhdf5.jar)

By default, the 'make' builds the C and Java code to create all the jars and libraries.

Optionally, HDF4 or HDF5 may be omitted (--without-hdf4, --without-hdf5). In this case, the corresponding library and jar files will not be built.

The 'make install' target installs the jar files and the C libraries in the directory specified by the "--prefix" parameter to configure.

See the Makefile for other options.
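Putting the steps together, a typical build from a fresh source tree looks like the sequence below (a sketch; runconfig_example.sh must first be edited with your local paths, as described above).

```shell
sh runconfig_example.sh   # edited copy with local paths filled in; runs configure
gmake                     # builds the C libraries and all the jar files
gmake install             # installs under the --prefix directory
```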

Running the HDF Java code

The HDF Java products have two parts: the Java classes and the C libraries. The Java classes are executed by a Java VM, just like any other Java classes. The C libraries are loaded when the Java classes for the JNI are initialized, so the C libraries must be in the library search path of the JVM.

Note that while the Java classes are the same for all platforms, there is a different C library for each platform.
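The environment variable naming the library search path also differs by platform; the snippet below shows the standard names (a sketch for the platforms commonly used with HDF Java).

```shell
# The environment variable that controls the JVM's native-library search
# path differs by platform; the names below are the standard ones.
case "$(uname -s)" in
  Darwin) libvar=DYLD_LIBRARY_PATH ;;   # Mac OS X
  *)      libvar=LD_LIBRARY_PATH  ;;    # Linux, Solaris, and other Unix
esac
echo "$libvar"
```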

Figure 2 shows an example shell script for running HDFView on Linux. The search paths are set, and then the program is invoked with the correct paths. Other Unix systems, Mac OS X, and Windows follow a similar procedure, with details differing for each platform.


# NOTE: paths and jar names below are examples; adjust to your installation.

# where the HDFView is installed
HDFVIEW_HOME=/usr/local/hdf-java

# where Java is installed (requires jdk1.5.0 or above)
JAVAPATH=/usr/local/jdk/bin

# all the jar files
# Example: includes netcdf and fits support
CPATH=$HDFVIEW_HOME/lib/jhdf.jar:$HDFVIEW_HOME/lib/jhdf5.jar

if test -z "$CLASSPATH" ; then
    CLASSPATH=""
fi
CLASSPATH=$CPATH:$CLASSPATH
export CLASSPATH

if test -n "$JAVAPATH" ; then
    PATH=$JAVAPATH:$PATH
    export PATH
fi

if test -z "$LD_LIBRARY_PATH" ; then
    LD_LIBRARY_PATH=""
fi

# example: set path for linux
LD_LIBRARY_PATH=$HDFVIEW_HOME/lib/linux:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH

$JAVAPATH/java -mx512m ncsa.hdf.view.HDFView -root $HDFVIEW_HOME

Figure 2. Example shell script to launch the hdfview

In order to use the JNI interfaces in other products, the jhdf.jar and/or jhdf5.jar (HDF4, HDF5) archives must be in the class path for the Java compilation and run time. The libjhdf.so and/or libjhdf5.so must be in the LD_LIBRARY_PATH at run time.
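For example, the environment for building another product against the JNI jars might be set up as below. This is a sketch: the install prefix and the 'lib' subdirectory layout are assumptions, so substitute the directories where 'make install' actually placed the files.

```shell
# Assumption: INSTDIR is the --prefix given to configure, and the jars
# and native libraries were installed under its 'lib' subdirectory.
INSTDIR=/usr/local/hdf-java

CLASSPATH="$INSTDIR/lib/jhdf.jar:$INSTDIR/lib/jhdf5.jar:."
export CLASSPATH

# Prepend the native libraries, preserving any existing search path.
LD_LIBRARY_PATH="$INSTDIR/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
export LD_LIBRARY_PATH

echo "$CLASSPATH"
```

With these variables exported, 'javac' and 'java' invocations in the same shell will find both the Java classes and the native libraries.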


Troubleshooting

The configure and make process has many dependencies, so it is prone to problems.

First, compiling the C code into libjhdf and libjhdf5 requires the correct parameters to the C compiler. These depend on the Java Virtual Machine that will be used, as well as on the platform. The C code is very portable, but different JVMs have different requirements for how loadable libraries must be built. See the documentation for your JVM.

Second, there are many different versions of Java, and the code may not compile with some of them. The HDF Java products have been tested with Sun's Java Development Kit 1.3.1 and 1.4.1.

The most common problem is difficulty running the code after compilation. This is usually caused by problems with the class path or library path. If one of the jar files is missing from the class path, the program will fail with a "Class not found" exception. A similar error occurs if the library with the native implementation is not found in the LD_LIBRARY_PATH.

The source distribution includes two small test programs that test the link paths (using the install directory specified to configure). These are in the directory test/linktest. To check the installation, change to that directory and type 'make'. The 'testlink' program runs TestHDF5Link.java, and 'testlink4' runs TestHDF4Link.java. These Java programs check the loading and invocation of the HDF libraries. If the link test succeeds, the paths in the script are correct for using the Java HDF products.
