ddn-lib-hdf5

D language bindings and wrapper for HDF5.

For more details about HDF5 visit https://www.hdfgroup.org/solutions/hdf5/ .

Overview

This library provides complete D language bindings to libhdf5 and libhdf5_hl, plus a high-level idiomatic D wrapper module ddn.data.hdf5.

Quick Start

If you do not have the dub tool, install it with something like dnf install dub. You will also need a D compiler; if you do not have one, dnf install gcc-gdc or dnf install ldc should do. All of these packages are available on Debian derivatives as well as many other Linux distributions.
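
For example, on Fedora/RHEL (package names vary by distribution):

sudo dnf install dub ldc    # or gcc-gdc if you prefer the GDC compiler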

Install HDF5 development libraries (see System Requirements), then follow these typical steps:

Create a New Project

dub init tryhdf5

This creates a new project with a dub.sdl or dub.json file, depending on the format you choose during initialization.

Add the Dependency

cd tryhdf5
dub add ddn-lib-hdf5

Write a Simple Program

Dub has already created the source/app.d file for you. Modify it to the following:

import ddn.data.hdf5;
import ddn.lib.hdf5.types: hsize_t;
import ddn.lib.hdf5.h5t: H5T_NATIVE_INT;

void main() {
    auto file = File.create("quickstart.h5");
    hsize_t[1] dims = [10];
    auto space = Dataspace.createSimple(dims);
    auto dset  = Dataset.create(file.handle(), "values", H5T_NATIVE_INT, space.handle());
    int[10] values = [0,1,2,3,4,5,6,7,8,9];
    dset.write(values.ptr, H5T_NATIVE_INT);
}

Build and Run

Now we can build and run the program with:

dub run

If everything worked, you should have a quickstart.h5 file in your current directory. Let's examine it with:

$ h5dump quickstart.h5
HDF5 "quickstart.h5" {
GROUP "/" {
   DATASET "values" {
      DATATYPE  H5T_STD_I32LE
      DATASPACE  SIMPLE { ( 10 ) / ( 10 ) }
      DATA {
      (0): 0, 1, 2, 3, 4, 5, 6, 7, 8, 9
      }
   }
}
}

Architecture

The library follows a three-layer design:

┌─────────────────────────────────────────────────────────┐
│              ddn.data.hdf5 (Wrapper Layer)              │
│         Idiomatic D API, RAII, Ranges, Exceptions       │
├─────────────────────────────────────────────────────────┤
│                    ddn.lib.hdf5                         │
│  ┌────────────────────┬────────────────────────────┐    │
│  │   Core Bindings    │   ddn.lib.hdf5.hl          │    │
│  │   (libhdf5)        │   (libhdf5_hl bindings)    │    │
│  └────────────────────┴────────────────────────────┘    │
├─────────────────────────────────────────────────────────┤
│                  libhdf5 + libhdf5_hl                   │
│                    (Native Libraries)                   │
└─────────────────────────────────────────────────────────┘

Repository Layout

  • src/ddn/lib/hdf5 — Low-level C API bindings to libhdf5
  • src/ddn/lib/hdf5/hl — High-Level API bindings to libhdf5_hl
  • src/ddn/data/hdf5 — Idiomatic D wrapper (RAII, ranges, exceptions)
  • src/ddn/tests — Test code (unit/integration)
  • demo/data — Runnable example programs
  • bin/ — Utility scripts (e.g., symbol helpers)

Features

  • Low-level bindings (ddn.lib.hdf5): Direct C API mapping with extern(C) @nogc nothrow functions (see the sketch after this list)
  • High-level bindings (ddn.lib.hdf5.hl): Bindings for libhdf5_hl (Lite, Tables, Images, Packet Tables, Dimension Scales)
  • Idiomatic wrapper (ddn.data.hdf5): RAII resource management, D ranges, exception-based error handling
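
To give a feel for the binding style, here is an illustrative sketch; the actual declarations live in ddn.lib.hdf5, and the two signatures below follow the standard HDF5 C API:

// Illustrative of the extern(C) @nogc nothrow binding style
// (hid_t and herr_t are the HDF5 handle and status types from the bindings):
extern (C) @nogc nothrow
{
    hid_t H5Fopen(const(char)* filename, uint flags, hid_t fapl_id);
    herr_t H5Fclose(hid_t file_id);
}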

Wrapper Modules

Module                          Description
ddn.data.hdf5.file              File operations with RAII, mounting, SWMR support
ddn.data.hdf5.group             Group management with RAII
ddn.data.hdf5.dataset           Dataset I/O, chunking, hyperslab selection
ddn.data.hdf5.dataspace        Dataspace creation and selection
ddn.data.hdf5.datatype         Datatype management
ddn.data.hdf5.attribute        Attribute read/write with RAII
ddn.data.hdf5.filter            Compression filters (GZIP, SZIP, shuffle, etc.)
ddn.data.hdf5.visitor           Iteration over attributes, links, and objects
ddn.data.hdf5.property         Property list management
ddn.data.hdf5.link              Link operations (hard, soft, external)
ddn.data.hdf5.h5object         Generic HDF5 object operations
ddn.data.hdf5.packet_table     Packet table operations
ddn.data.hdf5.dimension_scale  Dimension scale support
ddn.data.hdf5.table             Table operations (HDF5 HL)
ddn.data.hdf5.image             Image operations (HDF5 HL)
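
Most wrapper types follow the same create/open/read/write pattern. As a purely speculative sketch (Attribute.create, attr.write, and Dataspace.createScalar are assumptions modeled on the Dataset API shown later in this README, not confirmed signatures):

import ddn.data.hdf5;
import ddn.lib.hdf5.h5t: H5T_NATIVE_INT;

void main() {
    auto file  = File.create("attrs.h5");
    auto space = Dataspace.createScalar();  // hypothetical name for scalar creation
    // Assumed to mirror Dataset.create/write:
    auto attr = Attribute.create(file.handle(), "answer", H5T_NATIVE_INT, space.handle());
    int value = 42;
    attr.write(&value, H5T_NATIVE_INT);
}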

Installation

Add to your dub.sdl:

dependency "ddn-lib-hdf5" version="~>1.14.6"

Or dub.json:

{
    "dependencies": {
        "ddn-lib-hdf5": "~>1.14.6"
    }
}

System Requirements

  • HDF5 (1.10+) development headers and libraries (libhdf5)
  • High-Level library (libhdf5_hl)

On Fedora/RHEL:

sudo dnf install hdf5-devel
# Optional for static builds
sudo dnf install hdf5-static

On Debian/Ubuntu:

sudo apt-get install libhdf5-dev
# On some releases the HL library is included; if separate, also install:
# sudo apt-get install libhdf5-hl-dev

Usage Examples

Run the Examples

This repo contains runnable example programs under demo/data:

  • demo/data/hdf5_basic_io.d
  • demo/data/hdf5_groups.d
  • demo/data/hdf5_attributes.d
  • demo/data/hdf5_datasets.d
  • demo/data/hdf5_tables.d (HL)
  • demo/data/hdf5_compound_types.d
  • demo/data/hdf5_csv_load.d

Run an example via dub by pointing --single at the file:

# Example: run the basic I/O demo
dub run --single demo/data/hdf5_basic_io.d

# Or compile only
dub build --single demo/data/hdf5_groups.d -v

Basic File and Dataset Operations

import ddn.data.hdf5;
import ddn.lib.hdf5.types: hsize_t;
import ddn.lib.hdf5.h5t: H5T_NATIVE_DOUBLE;

void main() {
    // Create a new HDF5 file (RAII - automatically closed)
    auto file = File.create("example.h5");

    // Create a group
    auto group = Group.create(file.handle(), "sensors");

    // Create a dataset
    hsize_t[2] dims = [100, 100];
    auto space = Dataspace.createSimple(dims);
    auto dataset = Dataset.create(group.handle(), "temperature",
        H5T_NATIVE_DOUBLE, space.handle());

    // Write data
    double[100 * 100] data;
    foreach (i; 0 .. data.length) data[i] = cast(double) i * 0.1;
    dataset.write(data.ptr, H5T_NATIVE_DOUBLE);
}
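
Reading the data back follows the same pattern. A minimal sketch, assuming Dataset.read is the counterpart of the write call above (the read signature is not shown elsewhere in this README):

import ddn.data.hdf5;
import ddn.lib.hdf5.h5t: H5T_NATIVE_DOUBLE;

void main() {
    auto file = File.open("example.h5", FileAccessMode.readOnly);
    auto dataset = Dataset.open(file.handle(), "sensors/temperature");
    double[100 * 100] buffer;
    dataset.read(buffer.ptr, H5T_NATIVE_DOUBLE);  // assumed mirror of write
}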

Chunked Datasets with Compression

import ddn.data.hdf5;
import ddn.lib.hdf5;  // low-level types and H5P* property-list calls used below

void main() {
    auto file = File.create("compressed.h5");

    hsize_t[2] dims = [1000, 1000];
    hsize_t[2] chunkDims = [100, 100];
    auto space = Dataspace.createSimple(dims);

    // Create property list with chunking and compression
    auto dcpl = H5Pcreate(H5P_DATASET_CREATE);
    scope(exit) H5Pclose(dcpl);

    H5Pset_chunk(dcpl, 2, chunkDims.ptr);

    if (Filter.isGzipAvailable()) {
        Filter.enableGzip(dcpl, 6);      // GZIP compression level 6
        Filter.enableShuffle(dcpl);       // Improves compression ratio
    }

    auto dataset = Dataset.create(file.handle(), "data",
        H5T_NATIVE_FLOAT, space.handle(), H5P_DEFAULT, dcpl, H5P_DEFAULT);

    // Or use the convenience method:
    auto dataset2 = Dataset.createChunked(file.handle(), "data2",
        H5T_NATIVE_FLOAT, space.handle(), chunkDims);
}

Hyperslab (Partial) I/O

import ddn.data.hdf5;
import ddn.lib.hdf5.types: hsize_t;
import ddn.lib.hdf5.h5t: H5T_NATIVE_INT;

void main() {
    auto file = File.open("data.h5", FileAccessMode.readWrite);
    auto dataset = Dataset.open(file.handle(), "matrix");

    // Read a 10x10 block starting at position (50, 50)
    int[100] buffer;
    hsize_t[2] start = [50, 50];
    hsize_t[2] count = [10, 10];

    dataset.readHyperslab(buffer.ptr, H5T_NATIVE_INT, start, count);

    // Modify and write back
    foreach (ref val; buffer) val *= 2;
    dataset.writeHyperslab(buffer.ptr, H5T_NATIVE_INT, start, count);
}

Iterating Over File Contents

import ddn.data.hdf5;

void main() {
    auto file = File.open("data.h5", FileAccessMode.readOnly);

    // Visit all objects in the file
    visitObjects(file.handle(), (ObjectInfo info) {
        import std.stdio : writefln;
        writefln("Object: %s (type: %s)", info.name, info.type);
        return true;  // Continue iteration
    });

    // Iterate over links in root group
    iterateLinks(file.handle(), (LinkInfo info) {
        import std.stdio : writefln;
        writefln("Link: %s (type: %s)", info.name, info.type);
        return true;
    });

    // Iterate over attributes on a dataset
    auto dataset = Dataset.open(file.handle(), "data");
    iterateAttributes(dataset.handle(), (AttributeInfo info) {
        import std.stdio : writefln;
        writefln("Attribute: %s (size: %d)", info.name, info.dataSize);
        return true;
    });
}

File Mounting

import ddn.data.hdf5;

void main() {
    auto parent = File.open("parent.h5", FileAccessMode.readWrite);
    auto child = File.open("child.h5", FileAccessMode.readOnly);

    // Mount child file at /external
    parent.mount("external", child);

    // Access child's data through parent
    auto dataset = Dataset.open(parent.handle(), "external/data");

    // Unmount when done
    parent.unmount("external");
}

File Information

import ddn.data.hdf5;
import std.stdio : writefln;

void main() {
    auto file = File.open("data.h5", FileAccessMode.readOnly);

    // Get file properties
    auto size = file.getFileSize();
    auto name = file.getFileName();
    auto intent = file.getIntent();  // READ_ONLY or READ_WRITE
    auto freeSpace = file.getFreeSpace();

    // Get detailed file info
    auto info = file.getInfo();
    writefln("Superblock version: %d", info.superblock.version_);
    writefln("Free space: %d bytes", info.freeSpace.totalSpace);
}

Using Low-Level Bindings

For cases where you need direct access to the HDF5 C API:

import ddn.lib.hdf5;

void main() {
    // Direct C API calls
    auto fileId = H5Fcreate("example.h5", H5F_ACC.TRUNC, H5P_DEFAULT, H5P_DEFAULT);
    scope(exit) H5Fclose(fileId);

    // Create dataspace
    hsize_t[1] dims = [100];
    auto spaceId = H5Screate_simple(1, dims.ptr, null);
    scope(exit) H5Sclose(spaceId);

    // Create dataset
    auto dsetId = H5Dcreate2(fileId, "data", H5T_NATIVE_INT, spaceId,
        H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
    scope(exit) H5Dclose(dsetId);
}
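
For completeness, writing through the raw API uses the standard C call H5Dwrite. The lines below (a sketch assuming the bindings expose H5S_ALL as in the C API) would go just before the closing brace of main above:

    // H5S_ALL selects the entire dataspace in both memory and the file
    int[100] data;
    foreach (i; 0 .. data.length) data[i] = cast(int) i;
    H5Dwrite(dsetId, H5T_NATIVE_INT, H5S_ALL, H5S_ALL, H5P_DEFAULT, data.ptr);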

Migration from Raw Bindings to Wrappers

Before (Raw Bindings)

import ddn.lib.hdf5;

void example() {
    auto fid = H5Fcreate("test.h5", H5F_ACC.TRUNC, H5P_DEFAULT, H5P_DEFAULT);
    if (fid < 0) {
        // Handle error manually
        return;
    }
    scope(exit) H5Fclose(fid);

    hsize_t[1] dims = [100];
    auto sid = H5Screate_simple(1, dims.ptr, null);
    if (sid < 0) { /* error */ }
    scope(exit) H5Sclose(sid);

    auto did = H5Dcreate2(fid, "data", H5T_NATIVE_INT, sid,
        H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
    if (did < 0) { /* error */ }
    scope(exit) H5Dclose(did);

    // ... use dataset ...
}

After (Wrappers)

import ddn.data.hdf5;
import ddn.lib.hdf5.types: hsize_t;
import ddn.lib.hdf5.h5t: H5T_NATIVE_INT;

void example() {
    // RAII handles cleanup, exceptions handle errors
    auto file = File.create("test.h5");

    hsize_t[1] dims = [100];
    auto space = Dataspace.createSimple(dims);
    auto dataset = Dataset.create(file.handle(), "data",
        H5T_NATIVE_INT, space.handle());

    // ... use dataset ...
    // All resources automatically closed when scope exits
}

Key Differences

Aspect                Raw Bindings          Wrappers
Resource Management   Manual scope(exit)    Automatic RAII
Error Handling        Check return codes    Exceptions (HDF5Exception)
API Style             C-style functions     Object methods
Type Safety           Raw hid_t handles     Typed wrapper structs
Null Checks           Manual                Built-in isValid()
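
Because wrapper failures surface as HDF5Exception (see the table above), error handling becomes ordinary try/catch. A minimal sketch, assuming File.open throws when the file does not exist:

import ddn.data.hdf5;
import std.stdio : writeln;

void main() {
    try {
        auto file = File.open("does-not-exist.h5", FileAccessMode.readOnly);
        writeln("Opened: ", file.getFileName());
    } catch (HDF5Exception e) {
        writeln("HDF5 error: ", e.msg);
    }
}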

API Coverage

Low-Level Bindings (ddn.lib.hdf5)

Module   HDF5 API               Coverage
h5f      H5F (Files)            Complete
h5g      H5G (Groups)           Complete
h5d      H5D (Datasets)         Complete
h5s      H5S (Dataspaces)       Complete
h5t      H5T (Datatypes)        Complete
h5a      H5A (Attributes)       Complete
h5p      H5P (Property Lists)   Complete
h5l      H5L (Links)            Complete
h5o      H5O (Objects)          Complete
h5r      H5R (References)       Complete
h5z      H5Z (Filters)          Complete
h5e      H5E (Errors)           Complete
h5i      H5I (Identifiers)      Complete

High-Level Bindings (ddn.lib.hdf5.hl)

Module   HDF5 HL API               Coverage
h5lt     H5LT (Lite)               Complete
h5tb     H5TB (Tables)             Complete
h5im     H5IM (Images)             Complete
h5pt     H5PT (Packet Tables)      Complete
h5ds     H5DS (Dimension Scales)   Complete

Wrapper Layer (ddn.data.hdf5)

Wrapper          Features
File             Create, open, close, flush, mount/unmount, SWMR, file info
Group            Create, open, close
Dataset          Create, open, read/write, chunking, hyperslab I/O, chunk iteration
Dataspace        Create (simple, scalar, null), dimensions, selections
Datatype         Native types, compound types
Attribute        Create, open, read/write
Filter           GZIP, SZIP, shuffle, Fletcher32, N-bit, scale-offset
Visitor          Iterate attributes, links; visit objects recursively
Property         Property list management
Link             Hard, soft, external links
PacketTable      Append, read packets
DimensionScale   Attach/detach scales, labels

Configurations

The library supports two build configurations:

  • default: Links against dynamic libhdf5 and libhdf5_hl libraries
  • static: Links against static libhdf5.a and libhdf5_hl.a libraries

To build with static linking:

dub build --config=static
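
To select the static configuration when this package is a dependency of your application, dub's standard subConfiguration directive applies; in your dub.sdl:

dependency "ddn-lib-hdf5" version="~>1.14.6"
subConfiguration "ddn-lib-hdf5" "static"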

Building and Testing

Build the library:

dub build

Run unit tests:

dub test

Notes:

  • Static linking requires the static HDF5 libraries (see dub.sdl static configuration).
  • When statically linking, symbol order matters; the project’s dub.sdl already sets appropriate lflags order.

Notes for Contributors

Some modules include a small test helper import guarded by version(unittest). This is only compiled during tests and requires no action from users. To run all tests with verbose output:

dub test -v
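
The guard is the standard D version(unittest) pattern; a minimal sketch (the helper module name is hypothetical, standing in for modules under src/ddn/tests):

// Compiled only during `dub test` builds, when -unittest is in effect:
version (unittest)
{
    import ddn.tests.helpers;  // hypothetical helper module
}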

Advanced: Symbol Helper Script

The bin/write-hdf5-symbols.sh script can be used by advanced users troubleshooting symbol resolution (e.g., static link scenarios) to dump referenced HDF5 symbols. This is optional and not required for normal library use.

License

This project is licensed under the Boost Software License 1.0 (BSL-1.0).

Contributing

Contributions are welcome! Please ensure your code follows the style guidelines in CODE_STYLE.md.

Authors:
  • DDN Team