NCSA

Frequently Asked Questions about HDF



Last updated on January 23, 2001

Contents

 1) What is HDF?
    1a) Copyright Information
 2) What is in the HDF library?
 3) What are the HDF command line utilities?
 4) What is the latest official release of HDF, and what platforms does it support?
 5) What are the new features included in the current release?
 6) Is there a Java Interface?
    6a) Is HDF Y2K compliant?
 7) Where can I get the HDF source code and information relevant to HDF?
 8) What documentation for HDF is available on the ftp server?
 9) How do I get hard copies of HDF documentation?
10) How do I install HDF (4)?
11) How do I compile application programs that call HDF functions?
12) Are there any conversion programs available to convert non-HDF image files into HDF files or vice versa?
13) Which NCSA tools can I use to view HDF objects?
14) Is there any commercial or public domain visualization software that accepts HDF files?
15) Can new versions of HDF read HDF files written using older versions of the HDF library?
16) Can application programs which work with old versions of the HDF library always be compiled with new versions of HDF?
17) How does the 'integration of netCDF with HDF' affect application programmers?
18) Does HDF support data compression?
19) Is there a mailing list for HDF discussions and questions?
20) How do I contribute my software to the HDF user community?
21) How do I make a bug report?

1) What is HDF?

HDF stands for Hierarchical Data Format. It is a library and multi-object file format for the transfer of graphical and numerical data between machines.

It is freely available. The distribution consists of the HDF library, the HDF command line utilities, a test suite (source code only), a Java Interface and the Java-based HDF Viewer (JHV).

Features of the HDF File Format:

1a) Copyright Information

The COPYING file at the top of the HDF source code tree provides the copyright information regarding HDF.

2) What is in the HDF library?

HDF currently supports several data structure types: Scientific data sets (multi-dimensional arrays), vdatas (binary tables), "general" raster images, text entries (annotations), 8-bit raster images, 24-bit raster images, and color palettes.

HDF contains: the base library, the multi-file (SDS) library, the jpeg library, and the gzip library. HDF library functions can be called from C or FORTRAN user application programs.

It also contains the Java Products, which include a Java HDF Interface and a Java-based HDF Viewer (JHV).

The base library contains a general purpose interface and several application level interfaces, one for each data structure type. Each application level interface is specifically designed to read, write, and manipulate one type. The general purpose interface contains functions for tasks such as file I/O, error handling, memory management, and physical storage.

The multi-file (SDS) library integrates the netCDF model with HDF Scientific data sets, and supports simultaneous access to multiple files and multiple objects. This part is referred to as the mfhdf library in the rest of this FAQ.

The jpeg and gzip libraries allow you to use jpeg and gzip compression for those application programming interfaces that support them.

3) What are the HDF command line utilities?

The HDF command line utilities are application programs that can be executed by entering them at the command level, just like UNIX commands.

There are HDF utilities to:

They let you perform operations on HDF files for which you would otherwise have to write your own programs.

The 'hdp' utility is one of the more useful HDF utilities. Following is a description of its function:

The other utilities are:

In addition, the netCDF utilities, ncdump and ncgen, COMPILED with the HDF library, are included.

4) What is the latest official release of HDF, and what platforms does it support?

HDF version 4.1 Release 4 is the latest official release of HDF.

Please refer to the following page for information on the platforms we support:

     http://hdf.ncsa.uiuc.edu/platforms.html
Also check for any patches to the current released source code:
     ftp://ftp.ncsa.uiuc.edu/HDF/HDF/HDF_Current/patches/
   

5) What are the new features included in the current release?

Details are listed in the ./release_notes/ABOUT_4.1r4 file of the release.

6) Is there a Java Interface?

Yes, there is a Java Interface for HDF. For information on this please refer to the Java Products web page,
       http://hdf.ncsa.uiuc.edu/java-hdf-html/ 
and the Java Products FAQ,
       http://hdf.ncsa.uiuc.edu/java-hdf-html/java-hdf-faq.html

6a) Is HDF Y2K compliant?

Yes, it is Y2K compliant. There are dates used in strings within the HDF library and in the HDF tests. The strings in the HDF library are for identification purposes only, and the strings in the tests are for testing the use of global attributes in the netcdf interface. These strings just happen to have dates in them, but they could very well have any value.

Please see the HDF Y2K Compliance statement under Support Issues off of the HDF home page.

7) Where can I get the HDF source code and information relevant to HDF?

For information, take a look at the HDF home page:
       http://hdf.ncsa.uiuc.edu/
You can download the source code (free of charge) from the NCSA anonymous ftp server (ftp.ncsa.uiuc.edu) under:
       HDF/HDF/ 
The HDF ftp server address (hdf.ncsa.uiuc.edu) points directly to the HDF/ directory.

Other known mirror sites:

       NASA Goddard Space Flight Center:
          Host Name: daac.gsfc.nasa.gov
          Location:  software/hdf
       In Europe:
         DLR German Remote Sensing Data Center:
          Host Name: ftp.dfd.dlr.de
          Location:  software/hdf
          Web Page:  http://auc.dfd.dlr.de/HDF
The contents of the HDF/ directory are:
 
    Documentation/ HDF documentation
    HDF4.1r3       Link to old release
    HDF4.1r4/      HDF 4.1 release 4 (latest official release)
    HDF_Current    Link to the latest official release
    README         
    contrib/       Contributions from HDF users outside NCSA
    java/          Java HDF Products (v2.5 for 4.1r3)
    newsletters    Link to HDF newsletters
    prev-releases/ Releases previous to the current release
    samples/       Example HDF applications and old HDF files. 
The HDF_Current link always points to the latest official release of HDF, which contains the following:
    bin/           Binaries for Unix and VMS 
    limits.txt     File describing the limits of the current release
    patches/       Patch files (if there are any) which can be used to 
                   patch the source code in unpacked/
    release_notes/ Link to the directory under unpacked, containing the
                   release notes and ABOUT* files.
    tar/           Packed source code for Unix and VMS
    unpacked/      UNIX unpacked source code
    zip/           Binaries for Windows NT/95 
If you are connected to the Internet (NSFNET, ARPANET, MILNET, etc.), you can log in to the NCSA or HDF ftp server by entering anonymous for the name and your local e-mail address (login@host) for the password.

If you do not have access to ftp or have problems downloading the software, contact hdfhelp@ncsa.uiuc.edu. We will try to accommodate you as best we can.

8) What documentation for HDF is available on the ftp server?

The HDF documentation is available on the NCSA ftp server (ftp.ncsa.uiuc.edu) in the directory:
   
       ftp://ftp.ncsa.uiuc.edu/HDF/HDF/Documentation/
  
If you are running HDF 4.1r4, the following documentation will be of use to you:

9) How do I get hard copies of HDF documentation?

NCSA does not provide hard copies of the HDF documentation. If you absolutely require a hard copy, please contact hdfhelp@ncsa.uiuc.edu and we will try to accommodate you.

10) How do I install HDF (4)?

To configure the distribution, type:
            ./configure -v --prefix=<install_path>
        or
            ./configure -v
If you do not specify a prefix, the binaries will be installed in the ./NewHDF directory at the top level of the HDF source code tree.

Note: On UNIX systems without FORTRAN installed, the 'FC' macro in the appropriate top-level config/mh- file will need to be defined as "NONE" for correct configuration.

To compile the library and utilities, type:

        make >& comp.out
To find out the available make targets, type:
        make help
To test the libraries, type:
        make test >& test.out
To install the libraries, utilities, include files and man pages, type:
        make install
See the README and the INSTALL files at the top level of HDF4.1r4 for detailed instructions on configuration and installation.

11) How do I compile application programs that call HDF functions?

To use HDF routines in your C program, you must include either the line #include "hdf.h" (if you do not use the mfhdf library) or #include "mfhdf.h" (if you do) near the beginning of your code.
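
For example, a minimal C program that uses the multi-file SD interface (and therefore includes "mfhdf.h") might look like the sketch below. This is an illustration only; the file name "example.hdf", the data set name, and the data values are placeholders:

        #include "mfhdf.h"    /* mfhdf.h also pulls in hdf.h */

        int main(void)
        {
            int32 sd_id, sds_id;
            int32 dims[2]  = {2, 3};                    /* a 2 x 3 array */
            int32 start[2] = {0, 0}, edges[2] = {2, 3};
            int32 data[2][3] = { {1, 2, 3}, {4, 5, 6} };

            /* Create the HDF file and initialize the SD interface. */
            sd_id = SDstart("example.hdf", DFACC_CREATE);

            /* Create a 2-dimensional, 32-bit integer scientific data set. */
            sds_id = SDcreate(sd_id, "Example SDS", DFNT_INT32, 2, dims);

            /* Write the whole array (NULL stride means contiguous). */
            SDwritedata(sds_id, start, NULL, edges, (VOIDP) data);

            /* Terminate access to the data set and to the file. */
            SDendaccess(sds_id);
            SDend(sd_id);
            return 0;
        }

Such a program is then compiled and linked as described below.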

On Unix, you must specify the following libraries when compiling and linking a program, in the following order:

            libmfhdf.a, libdf.a, libjpeg.a and libz.a
For Windows NT/98/2000, please refer to the instructions included with those releases.

Detailed information on compiling an application program on specific platforms can be found in the ../HDF4.1r4/release_notes/compile.txt file.

In general, following is how you would compile on a Unix platform:

     For C:

        cc -o myprog myprog.c -I<path of HDF include directory> \
              <path of library>libmfhdf.a <path of library>libdf.a \
              <path of library>libjpeg.a <path of library>libz.a
       or

        cc -o myprog myprog.c -I<path of HDF include directory> \
              -L<path of libraries> -lmfhdf -ldf -ljpeg -lz

     For FORTRAN:

        f77 -o myprog myprog.f  \
              <path of library>libmfhdf.a <path of library>libdf.a \
              <path of library>libjpeg.a  <path of library>libz.a
       or

        f77 -o myprog myprog.f \
            -L<path of libraries> -lmfhdf -ldf -ljpeg -lz
NOTE: The order of the libraries is important: libmfhdf.a first, then libdf.a, followed by libjpeg.a and libz.a.

Applications that need netCDF or multi-file SDS functionality must link with both 'libmfhdf.a' and 'libdf.a' IN THIS ORDER. Applications that use neither of these interfaces can just link with the 'libdf.a' library for the base level of HDF functionality.

For FORTRAN programs, if your FORTRAN compiler accepts 'include' statements, you may include hdf.inc, dffunc.inc, and netcdf.inc in your program. Otherwise, you need to declare in your program all the constants used and functions called by the program.

12) Are there any conversion programs available to convert non-HDF image files into HDF files or vice versa?

Many of the HDF command line utilities are conversion programs. See Question #3 for more information regarding them.

Take a look at the What Software uses HDF? page off of the HDF home page for more information.

13) Which NCSA tools can I use to view HDF objects?

NCSA has several tools for scientific visualization that are based on HDF.

The latest tool available is:

Please see the What Software uses HDF? section on the home page for more information regarding these tools.

Older tools are available that have not been updated to use the current version of HDF:

These tools are available from the NCSA ftp server, ftp.ncsa.uiuc.edu, in:
      Visualization/      : Collage, Datascope, XDataslice
      SGI/                : Polyview
      Mosaic/             : Mosaic    

14) Is there any commercial or public domain visualization software that accepts HDF files?

Yes, there are numerous tools that accept HDF files. Please refer to the What Software uses HDF? section on our home page for more detailed, though not complete, information.

Commercial tools that accept HDF include IDL, Matlab, HDF Explorer and Noesys.

Public domain tools that accept HDF include, among others, NCSA tools, WebWinds, and OpenDX.

15) Can new versions of HDF read HDF files written using older versions of the HDF library?

Our goal is to make HDF backward compatible in the sense that HDF files can always be read by newer versions of HDF. We have succeeded in doing so up through HDF4.1r4, and will continue to follow this principle as much as possible. In many instances, HDF is also forward compatible, at least with regard to the data. Metadata, such as attributes, may not be readable by previous releases, but the data should be. Please see the notes following the table below for information on when the data is not forward compatible.

The table below lists the backward and forward compatibility of HDF in regards to the data (not metadata). The Vdata and Vgroup interfaces have been merged into HDF since HDF3.2. Before then, they were in a separate library named Vset.

                       |        CAN READ DATA FILES CREATED WITH  
         Interface     | HDF3.1  |  HDF3.2 |  HDF3.3  |  HDF4.0   | HDF4.1
        -------------------------------------------------------------------
        HDF3.1         |         |         |          |           |
         -RIS8         |  YES    |    YES  |  YES(1)  |  YES(1)   | YES(1)  
         -RIS24        |  YES    |    YES  |  YES(1)  |  YES(1)   | YES(1)  
         -PALETTE      |  YES    |    YES  |  YES     |  YES      | YES
         -ANNOTATION   |  YES    |    YES  |  YES     |  YES      | YES
         -SDS DFSD     | Float32 | Float32 | Float32  | Float32(2)| Float32(2,3)
        Vset 2.1       |         |         |          |           |
         -VData        |  YES    |    YES  |    YES   |  YES      | YES
         -Vgroup       |  YES    |    YES  |    YES   |  YES      | YES
        -------------------------------------------------------------------
        HDF3.2         |         |         |          |           |
         -RIS8         |  YES    |    YES  |  YES(1)  | YES(1)    | YES(1)
         -RIS24        |  YES    |    YES  |  YES(1)  | YES(1)    | YES(1)
         -PALETTE      |  YES    |    YES  |  YES     | YES       | YES
         -ANNOTATION   |  YES    |    YES  |  YES     | YES       | YES
         -SDS DFSD     |  YES    |    YES  |  YES     | YES(2)    | YES(2,3)
         -VData        |  YES    |    YES  |  YES     | YES       | YES
         -Vgroup       |  YES    |    YES  |  YES     | YES       | YES
        -------------------------------------------------------------------
        HDF3.3         |         |         |          |           |
         -RIS8         |  YES    |    YES  |  YES     | YES       | YES
         -RIS24        |  YES    |    YES  |  YES     | YES       | YES
         -PALETTE      |  YES    |    YES  |  YES     | YES       | YES
         -ANNOTATION   |  YES    |    YES  |  YES     | YES       | YES
         -SDS SD       |  YES    |    YES  |  YES     | YES(2)    | YES(2,3)
         -SDS DFSD     |  YES    |    YES  |  YES     | YES(2)    | YES(2,3) 
         -VData        |  YES    |    YES  |  YES     | YES       | YES
         -Vgroup       |  YES    |    YES  |  YES     | YES       | YES
        -------------------------------------------------------------------
        HDF4.0         |         |         |          |           |
         -GR           |  YES    |    YES  |  YES     | YES       | YES
         -RIS8         |  YES    |    YES  |  YES     | YES       | YES
         -RIS24        |  YES    |    YES  |  YES     | YES       | YES
         -PALETTE      |  YES    |    YES  |  YES     | YES       | YES
         -MFAN         |  YES    |    YES  |  YES     | YES       | YES
         -ANNOTATION   |  YES    |    YES  |  YES     | YES       | YES
         -SDS SD       |  YES    |    YES  |  YES     | YES       | YES(3) 
         -SDS DFSD     |  YES    |    YES  |  YES     | YES(2)    | YES(2,3) 
         -VData        |  YES    |    YES  |  YES     | YES       | YES
         -Vgroup       |  YES    |    YES  |  YES     | YES       | YES
        -------------------------------------------------------------------
        HDF4.1         |         |         |          |           |
         -GR           |  YES    |    YES  |  YES     | YES       | YES
         -RIS8         |  YES    |    YES  |  YES     | YES       | YES
         -RIS24        |  YES    |    YES  |  YES     | YES       | YES
         -PALETTE      |  YES    |    YES  |  YES     | YES       | YES
         -MFAN         |  YES    |    YES  |  YES     | YES       | YES
         -ANNOTATION   |  YES    |    YES  |  YES     | YES       | YES
         -SDS SD       |  YES    |    YES  |  YES     | YES       | YES
         -SDS DFSD     |  YES    |    YES  |  YES     | YES       | YES
         -VData        |  YES    |    YES  |  YES     | YES       | YES
         -Vgroup       |  YES    |    YES  |  YES     | YES       | YES

   (1)  except for JPEG compression
   (2)  except for gzip and nbit compression
   (3)  except for chunking and chunking with compression

   NOTES:
     - The table above does not include the low-level compression
       interface, which was introduced in HDF 4.0.
  
     - The SD interface should always be able to read an HDF file
       that was created with the DFSD interface. 

     - With HDF 4.1r2 and later, only the SD dimension representation
       introduced in 4.0r1 is written by default.  For a file to be read
       by earlier versions of the software, SDsetdimval_comp must be
       called to store both the old and the new dimension representations
       in the HDF file (see the sketch following these notes).
 
     - Old HDF libraries will NOT always be able to read HDF data written 
       by newer version HDF libraries. For example, HDF3.1 can not read 
       16-bit integer SDS's because HDF 3.1 did not support this data type. 

     - In HDF 4.1r1, chunking and Vdata/Vgroup attributes were
       added.  Previous releases will not be able to read data which
       was created using these features.
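
As an illustration of the dimension-representation note above, the following is a rough sketch of how an application might call SDsetdimval_comp so that a newly created data set remains readable by pre-4.0r1 libraries. It assumes sds_id was returned by SDcreate and that the data set has two dimensions; the helper function name is made up for illustration:

        #include "mfhdf.h"

        /* Store both the old and the new dimension representations for
           each dimension of the given data set. */
        void make_dims_backward_compatible(int32 sds_id, intn rank)
        {
            intn  i;
            int32 dim_id;

            for (i = 0; i < rank; i++) {
                dim_id = SDgetdimid(sds_id, i);   /* i-th dimension id */
                SDsetdimval_comp(dim_id, SD_DIMVAL_BW_COMP);
            }
        }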

16) Can application programs which work with old versions of the HDF library always be compiled with new versions of HDF?

As HDF evolves, some functions have to be changed or removed. For example, in HDF3.2 some function parameters that were passed by value in HDF3.1 had to be passed by reference in order to support new number types. When this happens, old application programs need to be modified so that they can work with the new library.

Our policy is as follows: Keep existing functions unchanged as much as possible; create new functions when necessary to accommodate new features; if a new function covers the feature of an existing old function, the old function should still be callable by old application programs; should an old function be phased out, the users will be forewarned and encouraged to switch to the new function; an old function will be removed from the library only if it is in conflict with the implementation of new features.

17) How does the 'integration of netCDF with HDF' affect application programmers?

The mfhdf library was designed to be completely transparent to the programmer. HDF supports a "multi-file" SDS interface and the complete netCDF interface as defined by Unidata netCDF Release 2.3.2.

Using either interface, you are able to read XDR-based netCDF files, HDF-based netCDF files and pre-HDF4.x HDF files. The library determines what type of file is being accessed and handles it appropriately. Any of the above types of files may be modified. However, the library will only create new files based on HDF (you can't create new XDR-based netCDF files).
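
For example, the following hedged sketch reads the first data set in a file through the SD interface; the same code works whether the file is an HDF file or an XDR-based netCDF file. The file name "input.hdf" is a placeholder, and the code assumes the first data set is a two-dimensional float32 array small enough to fit in the buffer:

        #include "mfhdf.h"

        int main(void)
        {
            int32   sd_id, sds_id, rank, ntype, nattrs;
            int32   dims[MAX_VAR_DIMS];
            int32   start[2] = {0, 0}, edges[2];
            char    name[MAX_NC_NAME];
            float32 buffer[1000];            /* assumed large enough */

            /* SDstart determines what type of file is being accessed
               and handles it appropriately. */
            sd_id  = SDstart("input.hdf", DFACC_READ);
            sds_id = SDselect(sd_id, 0);             /* first data set */
            SDgetinfo(sds_id, name, &rank, dims, &ntype, &nattrs);

            edges[0] = dims[0];                      /* read the whole */
            edges[1] = dims[1];                      /* 2-D array      */
            SDreaddata(sds_id, start, NULL, edges, (VOIDP) buffer);

            SDendaccess(sds_id);
            SDend(sd_id);
            return 0;
        }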

Summary of HDF and XDR file interoperability for the HDF and netCDF application interfaces:

                 | Files created | Files created    |    Files written         |
                 | by DFSD       | by SD interface  |    by NC interface       |
                 | interface     |                  |                          |
                 |     HDF       |     HDF          | NCSA HDF | Unidata netCDF|   
                 | ----------------------------------------------------------- |
Accessed by DFSD |      Yes      |     Yes          |   Yes    |     No        |
                 |               |                  |          |               |
Accessed by SD   |      Yes      |     Yes          |   Yes    |     Yes       |
                 |               |                  |          |               |
Accessed by NC   |      Yes      |     Yes          |   Yes    |     Yes       |
                 |               |                  |          |               |
For more information, you can refer to the section entitled HDF Interface vs. netCDF Interface in the SD chapter of the User's Guide.

18) Does HDF support data compression?

HDF 4.0 (and later releases) supports a low-level compression interface, which allows any data object to be compressed using a variety of algorithms.

Currently only three compression algorithms are supported: Run-Length Encoding (RLE), adaptive Huffman, and an LZ-77 dictionary coder (the gzip 'deflation' algorithm). Plans for future algorithms include a Lempel-Ziv-78 dictionary coder, an arithmetic coder, and a faster Huffman algorithm.

HDF 4.0 (and later releases) supports n-bit compression for SDSs.

HDF 4.0 (and later releases) supports RLE (Run Length Encoding), IMCOMP, and JPEG compression for raster images.

New with HDF 4.1 is support for "chunking" and "chunking with compression". Data chunking allows an n-dimensional SDS or GR image to be stored as a series of n-dimensional chunks. See the HDF User's Guide for more information.
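
As an illustration, the following hedged sketch shows how an application using the SD interface might request gzip compression or chunking for a data set created with SDcreate (as in the example in Question 11). The compression level and chunk dimensions are arbitrary choices, and the function names wrapping the calls are made up for illustration:

        #include "mfhdf.h"

        /* Request gzip ('deflate') compression, level 6, for a data set
           before any data is written to it. */
        void request_deflate(int32 sds_id)
        {
            comp_info c_info;

            c_info.deflate.level = 6;
            SDsetcompress(sds_id, COMP_CODE_DEFLATE, &c_info);
        }

        /* Or store a 2 x 3 data set as two 1 x 3 chunks instead. */
        void request_chunking(int32 sds_id)
        {
            HDF_CHUNK_DEF c_def;

            c_def.chunk_lengths[0] = 1;
            c_def.chunk_lengths[1] = 3;
            SDsetchunk(sds_id, c_def, HDF_CHUNK);
        }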

19) Is there a mailing list for HDF discussions and questions?

If you want to broadcast HDF technical questions to other HDF users in order to solicit their assistance, the sci.data.formats newsgroup is currently the most widely used avenue for this. This newsgroup is an appropriate forum for questions regarding file formats including, but not limited to, HDF.

"hdfnews" is a mailing list for HDF and its users to share ideas and information. HDF will announce new releases, updates, known bugs, etc. via newsletters through hdfnews. Users can make their comments, criticisms and suggestions on hdfnews. However, sending email to hdfnews@ncsa.uiuc.edu will broadcast your message to everyone on the hdfnews mailing list.

If you wish to be added to or removed from the "hdfnews" mailing list, send an e-mail message to ncsalist@ncsa.uiuc.edu with the appropriate command in the body of the message. Commands in the subject line are NOT processed. Valid commands include:

 
       subscribe hdfnews [<address>]
       subscribe hdfnews
       unsubscribe hdfnews [<address>]
       help
For example,
       subscribe hdfnews johndoe@uiuc.edu
We recommend that you send the "help" command to retrieve the full list of available commands.

If you have any problems subscribing or unsubscribing, feel free to contact hdfhelp@ncsa.uiuc.edu. To unsubscribe, the e-mail address you specify must exactly match the e-mail address with which you subscribed. (If you do not specify an address, the address from which you send the e-mail must match your subscription address.)

20) How do I contribute my software to the HDF user community?

There are two ways that you can do this:

21) How do I make a bug report?

All bug reports, comments, suggestions and questions should go to hdfhelp@ncsa.uiuc.edu.

Attached below is a bug report template. It is very helpful to us in locating and fixing the bug if the reporter supplies all of the information requested in the template.

       ------------------  Template for bug report  ------------------------
       To: hdfhelp@ncsa.uiuc.edu
       Subject: [area]: [synopsis]   [replace with actual AREA and SYNOPSIS]

       VERSION:
          HDF4.1 release 4 

       USER:
              [Name, phone number and address of person reporting the bug.
               (email address if possible)]

       AREA:
              [Area of the HDF source tree affected, e.g., src, util, test, 
                toplevel. If there are bugs in more than one AREA, please use 
                a separate bug report for each AREA. ]

       SYNOPSIS:
              [Brief description of the problem and where it is located]

       MACHINE / OPERATING SYSTEM:
              [e.g. Solaris 2.7,  HP-UX 10.20 ...
              On Unix platforms, please include the output from
              "uname -a".]

       COMPILER:
               [e.g. native cc, native ANSI cc, gcc 2.6.3, MPW, ...]

       DESCRIPTION:
              [Detailed description of problem. ]

       REPEAT BUG BY:
              [What you did to get the error; include test program or session
               transcript if at all possible.  If you include a program, make
               sure it depends only on libraries in the HDF distribution, not
               on any vendor or third-party libraries.  Please be specific;
               if we can't reproduce it, we can't fix it. Tell us exactly what
               we should see when the program is run. ]

       SAMPLE FIX:
              [If available, please send context diffs (diff -c)]

       [PLEASE make your Subject(SYNOPSIS): line as descriptive as possible.]
       [Remove all the explanatory text in brackets before mailing.]
       [Send to hdfhelp@ncsa.uiuc.edu or to:

              NCSA
              HDF Group
              605 E. Springfield Ave. 
              Champaign, IL 61820 ]

       ------------------  End of Bug Report Template  ----------------------


NCSA
The National Center for Supercomputing Applications

University of Illinois at Urbana-Champaign

hdfhelp@ncsa.uiuc.edu
Last Modified: May 24, 2001