ddn-lib-hdf5
D language bindings and wrapper for HDF5.
For more details about HDF5, visit https://www.hdfgroup.org/solutions/hdf5/.
Overview
This library provides complete D language bindings to libhdf5 and libhdf5_hl, plus a high-level idiomatic D wrapper module ddn.data.hdf5.
Quick Start
If you do not have the dub tool, install it by running something like dnf install dub. You will also need a D compiler; if you do not have one, dnf install gcc-gdc or dnf install ldc should do. All of these packages are available on Debian derivatives, as well as many other Linux distributions.
Install HDF5 development libraries (see System Requirements), then follow these typical steps:
Create a New Project
dub init tryhdf5
This will create a new project with a dub.sdl or dub.json file, depending on your preference.
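A freshly initialized project typically looks like this (the exact files depend on the format you choose):

tryhdf5/
├── dub.sdl        (or dub.json)
└── source/
    └── app.d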
Add the Dependency
cd tryhdf5
dub add ddn-lib-hdf5
Write a Simple Program
Dub has already created the source/app.d file for you. Replace its contents with the following:
import ddn.data.hdf5;
import ddn.lib.hdf5.types : hsize_t;
import ddn.lib.hdf5.h5t : H5T_NATIVE_INT;

void main() {
    // Create the file; RAII closes it when the scope exits
    auto file = File.create("quickstart.h5");

    // One-dimensional dataspace holding 10 elements
    hsize_t[1] dims = [10];
    auto space = Dataspace.createSimple(dims);

    // Integer dataset named "values" in the file's root group
    auto dset = Dataset.create(file.handle(), "values", H5T_NATIVE_INT, space.handle());

    // Write the buffer using the native int memory type
    int[10] values = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9];
    dset.write(values.ptr, H5T_NATIVE_INT);
}
Build and Run
Now we can build and run the program with:
dub run
If everything worked, you should have a quickstart.h5 file in your current directory.
Let's examine it with:
h5dump quickstart.h5
HDF5 "quickstart.h5" {
GROUP "/" {
   DATASET "values" {
      DATATYPE  H5T_STD_I32LE
      DATASPACE  SIMPLE { ( 10 ) / ( 10 ) }
      DATA {
      (0): 0, 1, 2, 3, 4, 5, 6, 7, 8, 9
      }
   }
}
}
Architecture
The library follows a three-layer design:
┌─────────────────────────────────────────────────────────┐
│ ddn.data.hdf5 (Wrapper Layer) │
│ Idiomatic D API, RAII, Ranges, Exceptions │
├─────────────────────────────────────────────────────────┤
│ ddn.lib.hdf5 │
│ ┌────────────────────┬────────────────────────────┐ │
│ │ Core Bindings │ ddn.lib.hdf5.hl │ │
│ │ (libhdf5) │ (libhdf5_hl bindings) │ │
│ └────────────────────┴────────────────────────────┘ │
├─────────────────────────────────────────────────────────┤
│ libhdf5 + libhdf5_hl │
│ (Native Libraries) │
└─────────────────────────────────────────────────────────┘
Repository Layout
- src/ddn/lib/hdf5 — Low-level C API bindings to libhdf5
- src/ddn/lib/hdf5/hl — High-Level API bindings to libhdf5_hl
- src/ddn/data/hdf5 — Idiomatic D wrapper (RAII, ranges, exceptions)
- src/ddn/tests — Test code (unit/integration)
- demo/data — Runnable example programs
- bin/ — Utility scripts (e.g., symbol helpers)
Features
- Low-level bindings (ddn.lib.hdf5): Direct C API mapping with extern(C) @nogc nothrow functions; see the sketch below
- High-level bindings (ddn.lib.hdf5.hl): Bindings for libhdf5_hl (Lite, Tables, Images, Packet Tables, Dimension Scales)
- Idiomatic wrapper (ddn.data.hdf5): RAII resource management, D ranges, exception-based error handling
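Because the core bindings are extern(C) @nogc nothrow, they can be called from @nogc nothrow D code. A minimal sketch, assuming ddn.lib.hdf5 exposes the standard C symbol H5get_libversion:

import ddn.lib.hdf5;

@nogc nothrow bool haveHdf5_1_10() {
    uint major, minor, rel;
    // Plain C call: no GC allocation, no exceptions
    if (H5get_libversion(&major, &minor, &rel) < 0)
        return false;
    return major > 1 || (major == 1 && minor >= 10); // HDF5 1.10+, per System Requirements
}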
Wrapper Modules
| Module | Description |
|---|---|
| ddn.data.hdf5.file | File operations with RAII, mounting, SWMR support |
| ddn.data.hdf5.group | Group management with RAII |
| ddn.data.hdf5.dataset | Dataset I/O, chunking, hyperslab selection |
| ddn.data.hdf5.dataspace | Dataspace creation and selection |
| ddn.data.hdf5.datatype | Datatype management |
| ddn.data.hdf5.attribute | Attribute read/write with RAII |
| ddn.data.hdf5.filter | Compression filters (GZIP, SZIP, shuffle, etc.) |
| ddn.data.hdf5.visitor | Iteration over attributes, links, and objects |
| ddn.data.hdf5.property | Property list management |
| ddn.data.hdf5.link | Link operations (hard, soft, external) |
| ddn.data.hdf5.h5object | Generic HDF5 object operations |
| ddn.data.hdf5.packet_table | Packet table operations |
| ddn.data.hdf5.dimension_scale | Dimension scale support |
| ddn.data.hdf5.table | Table operations (HDF5 HL) |
| ddn.data.hdf5.image | Image operations (HDF5 HL) |
Installation
Add to your dub.sdl:
dependency "ddn-lib-hdf5" version="~>1.14.6"
Or dub.json:
{
"dependencies": {
"ddn-lib-hdf5": "~>1.14.6"
}
}
System Requirements
- HDF5 (1.10+) development headers and libraries (libhdf5)
- High-Level library (libhdf5_hl)
On Fedora/RHEL:
sudo dnf install hdf5-devel
# Optional for static builds
sudo dnf install hdf5-static
On Debian/Ubuntu:
sudo apt-get install libhdf5-dev
# On some releases the HL library is included; if separate, also install:
# sudo apt-get install libhdf5-hl-dev
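To confirm the installation, the HDF5 command-line tools include h5dump, which can report its version (on some Debian releases the tools ship separately as hdf5-tools):

h5dump --version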
Usage Examples
Run the Examples
This repo contains runnable example programs under demo/data:
- demo/data/hdf5_basic_io.d
- demo/data/hdf5_groups.d
- demo/data/hdf5_attributes.d
- demo/data/hdf5_datasets.d
- demo/data/hdf5_tables.d (HL)
- demo/data/hdf5_compound_types.d
- demo/data/hdf5_csv_load.d
Run an example via dub by pointing --single at the file:
# Example: run the basic I/O demo
dub run --single demo/data/hdf5_basic_io.d
# Or compile only
dub build --single demo/data/hdf5_groups.d -v
Basic File and Dataset Operations
import ddn.data.hdf5;
import ddn.lib.hdf5.types : hsize_t;
import ddn.lib.hdf5.h5t : H5T_NATIVE_DOUBLE;
void main() {
// Create a new HDF5 file (RAII - automatically closed)
auto file = File.create("example.h5");
// Create a group
auto group = Group.create(file.handle(), "sensors");
// Create a dataset
hsize_t[2] dims = [100, 100];
auto space = Dataspace.createSimple(dims);
auto dataset = Dataset.create(group.handle(), "temperature",
H5T_NATIVE_DOUBLE, space.handle());
// Write data
double[100 * 100] data;
foreach (i; 0 .. data.length) data[i] = cast(double) i * 0.1;
dataset.write(data.ptr, H5T_NATIVE_DOUBLE);
}
Chunked Datasets with Compression
import ddn.data.hdf5;
import ddn.lib.hdf5; // low-level symbols used below (H5Pcreate, H5P_DEFAULT, ...)
void main() {
auto file = File.create("compressed.h5");
hsize_t[2] dims = [1000, 1000];
hsize_t[2] chunkDims = [100, 100];
auto space = Dataspace.createSimple(dims);
// Create property list with chunking and compression
auto dcpl = H5Pcreate(H5P_DATASET_CREATE);
scope(exit) H5Pclose(dcpl);
H5Pset_chunk(dcpl, 2, chunkDims.ptr);
if (Filter.isGzipAvailable()) {
Filter.enableGzip(dcpl, 6); // GZIP compression level 6
Filter.enableShuffle(dcpl); // Improves compression ratio
}
auto dataset = Dataset.create(file.handle(), "data",
H5T_NATIVE_FLOAT, space.handle(), H5P_DEFAULT, dcpl, H5P_DEFAULT);
// Or use the convenience method:
auto dataset2 = Dataset.createChunked(file.handle(), "data2",
H5T_NATIVE_FLOAT, space.handle(), chunkDims);
}
Hyperslab (Partial) I/O
import ddn.data.hdf5;
import ddn.lib.hdf5.types : hsize_t;
import ddn.lib.hdf5.h5t : H5T_NATIVE_INT;
void main() {
auto file = File.open("data.h5", FileAccessMode.readWrite);
auto dataset = Dataset.open(file.handle(), "matrix");
// Read a 10x10 block starting at position (50, 50)
int[100] buffer;
hsize_t[2] start = [50, 50];
hsize_t[2] count = [10, 10];
dataset.readHyperslab(buffer.ptr, H5T_NATIVE_INT, start, count);
// Modify and write back
foreach (ref val; buffer) val *= 2;
dataset.writeHyperslab(buffer.ptr, H5T_NATIVE_INT, start, count);
}
Iterating Over File Contents
import ddn.data.hdf5;
void main() {
auto file = File.open("data.h5", FileAccessMode.readOnly);
// Visit all objects in the file
visitObjects(file.handle(), (ObjectInfo info) {
import std.stdio : writefln;
writefln("Object: %s (type: %s)", info.name, info.type);
return true; // Continue iteration
});
// Iterate over links in root group
iterateLinks(file.handle(), (LinkInfo info) {
import std.stdio : writefln;
writefln("Link: %s (type: %s)", info.name, info.type);
return true;
});
// Iterate over attributes on a dataset
auto dataset = Dataset.open(file.handle(), "data");
iterateAttributes(dataset.handle(), (AttributeInfo info) {
import std.stdio : writefln;
writefln("Attribute: %s (size: %d)", info.name, info.dataSize);
return true;
});
}
File Mounting
import ddn.data.hdf5;
void main() {
auto parent = File.open("parent.h5", FileAccessMode.readWrite);
auto child = File.open("child.h5", FileAccessMode.readOnly);
// Mount child file at /external
parent.mount("external", child);
// Access child's data through parent
auto dataset = Dataset.open(parent.handle(), "external/data");
// Unmount when done
parent.unmount("external");
}
File Information
import ddn.data.hdf5;
import std.stdio : writefln;
void main() {
auto file = File.open("data.h5", FileAccessMode.readOnly);
// Get file properties
auto size = file.getFileSize();
auto name = file.getFileName();
auto intent = file.getIntent(); // READ_ONLY or READ_WRITE
auto freeSpace = file.getFreeSpace();
// Get detailed file info
auto info = file.getInfo();
writefln("Superblock version: %d", info.superblock.version_);
writefln("Free space: %d bytes", info.freeSpace.totalSpace);
}
Using Low-Level Bindings
For cases where you need direct access to the HDF5 C API:
import ddn.lib.hdf5;
void main() {
// Direct C API calls
auto fileId = H5Fcreate("example.h5", H5F_ACC.TRUNC, H5P_DEFAULT, H5P_DEFAULT);
scope(exit) H5Fclose(fileId);
// Create dataspace
hsize_t[1] dims = [100];
auto spaceId = H5Screate_simple(1, dims.ptr, null);
scope(exit) H5Sclose(spaceId);
// Create dataset
auto dsetId = H5Dcreate2(fileId, "data", H5T_NATIVE_INT, spaceId,
H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
scope(exit) H5Dclose(dsetId);
}
Migration from Raw Bindings to Wrappers
Before (Raw Bindings)
import ddn.lib.hdf5;
void example() {
auto fid = H5Fcreate("test.h5", H5F_ACC.TRUNC, H5P_DEFAULT, H5P_DEFAULT);
if (fid < 0) {
// Handle error manually
return;
}
scope(exit) H5Fclose(fid);
hsize_t[1] dims = [100];
auto sid = H5Screate_simple(1, dims.ptr, null);
if (sid < 0) { /* error */ }
scope(exit) H5Sclose(sid);
auto did = H5Dcreate2(fid, "data", H5T_NATIVE_INT, sid,
H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
if (did < 0) { /* error */ }
scope(exit) H5Dclose(did);
// ... use dataset ...
}
After (Wrappers)
import ddn.data.hdf5;
import ddn.lib.hdf5.types : hsize_t;
import ddn.lib.hdf5.h5t : H5T_NATIVE_INT;
void example() {
// RAII handles cleanup, exceptions handle errors
auto file = File.create("test.h5");
hsize_t[1] dims = [100];
auto space = Dataspace.createSimple(dims);
auto dataset = Dataset.create(file.handle(), "data",
H5T_NATIVE_INT, space.handle());
// ... use dataset ...
// All resources automatically closed when scope exits
}
Key Differences
| Aspect | Raw Bindings | Wrappers |
|---|---|---|
| Resource Management | Manual scope(exit) | Automatic RAII |
| Error Handling | Check return codes | Exceptions (HDF5Exception) |
| API Style | C-style functions | Object methods |
| Type Safety | Raw hid_t handles | Typed wrapper structs |
| Null Checks | Manual | Built-in isValid() |
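In practice this means the manual return-code checks above collapse into a single try/catch. A minimal sketch based on the File API shown earlier (e.msg is the standard D exception message field):

import ddn.data.hdf5;
import std.stdio : writeln;

void main() {
    try {
        // Throws HDF5Exception if the file cannot be opened
        auto file = File.open("missing.h5", FileAccessMode.readOnly);
        writeln("Opened: ", file.getFileName());
    } catch (HDF5Exception e) {
        writeln("HDF5 error: ", e.msg);
    }
}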
API Coverage
Low-Level Bindings (ddn.lib.hdf5)
| Module | HDF5 API | Coverage |
|---|---|---|
| h5f | H5F (Files) | Complete |
| h5g | H5G (Groups) | Complete |
| h5d | H5D (Datasets) | Complete |
| h5s | H5S (Dataspaces) | Complete |
| h5t | H5T (Datatypes) | Complete |
| h5a | H5A (Attributes) | Complete |
| h5p | H5P (Property Lists) | Complete |
| h5l | H5L (Links) | Complete |
| h5o | H5O (Objects) | Complete |
| h5r | H5R (References) | Complete |
| h5z | H5Z (Filters) | Complete |
| h5e | H5E (Errors) | Complete |
| h5i | H5I (Identifiers) | Complete |
High-Level Bindings (ddn.lib.hdf5.hl)
| Module | HDF5 HL API | Coverage |
|---|---|---|
| h5lt | H5LT (Lite) | Complete |
| h5tb | H5TB (Tables) | Complete |
| h5im | H5IM (Images) | Complete |
| h5pt | H5PT (Packet Tables) | Complete |
| h5ds | H5DS (Dimension Scales) | Complete |
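For a taste of the HL bindings, the H5LT ("Lite") API collapses dataspace creation, dataset creation, and the write into one call. A minimal sketch, assuming the bindings expose the standard C symbol H5LTmake_dataset_int:

import ddn.lib.hdf5;
import ddn.lib.hdf5.hl;

void main() {
    auto fid = H5Fcreate("lite.h5", H5F_ACC.TRUNC, H5P_DEFAULT, H5P_DEFAULT);
    scope(exit) H5Fclose(fid);

    hsize_t[1] dims = [5];
    int[5] data = [1, 2, 3, 4, 5];
    // Creates the dataspace and the dataset, then writes the buffer
    H5LTmake_dataset_int(fid, "/lite_data", 1, dims.ptr, data.ptr);
}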
Wrapper Layer (ddn.data.hdf5)
| Wrapper | Features |
|---|---|
| File | Create, open, close, flush, mount/unmount, SWMR, file info |
| Group | Create, open, close |
| Dataset | Create, open, read/write, chunking, hyperslab I/O, chunk iteration |
| Dataspace | Create (simple, scalar, null), dimensions, selections |
| Datatype | Native types, compound types |
| Attribute | Create, open, read/write |
| Filter | GZIP, SZIP, shuffle, Fletcher32, N-bit, scale-offset |
| Visitor | Iterate attributes, links; visit objects recursively |
| Property | Property list management |
| Link | Hard, soft, external links |
| PacketTable | Append, read packets |
| DimensionScale | Attach/detach scales, labels |
Configurations
The library supports two build configurations:
- default: Links against the dynamic libhdf5 and libhdf5_hl libraries
- static: Links against the static libhdf5.a and libhdf5_hl.a libraries
To build with static linking:
dub build --config=static
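A dependent project can also pin the configuration in its own dub.sdl via dub's subConfiguration directive (a sketch; match the version constraint to your dependency):

dependency "ddn-lib-hdf5" version="~>1.14.6"
subConfiguration "ddn-lib-hdf5" "static"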
Building and Testing
Build the library:
dub build
Run unit tests:
dub test
Notes:
- Static linking requires the static HDF5 libraries (see the static configuration in dub.sdl).
- When statically linking, symbol order matters; the project's dub.sdl already sets the appropriate lflags order.
Notes for Contributors
Some modules include a small test helper import guarded by version(unittest). This is only compiled during tests and requires no action from users; the guard pattern is sketched below. To run all tests with verbose output:
dub test -v
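The guard pattern looks like this (a sketch; the helper module name here is hypothetical):

// Only compiled when building with dub test
version(unittest) {
    import ddn.tests.support; // hypothetical module name, for illustration only
}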
Advanced: Symbol Helper Script
The bin/write-hdf5-symbols.sh script can be used by advanced users troubleshooting symbol resolution (e.g., static link scenarios) to dump referenced HDF5 symbols. This is optional and not required for normal library use.
License
This project is licensed under the BSD 3-Clause License.
Contributing
Contributions are welcome! Please ensure your code follows the style guidelines in CODE_STYLE.md.