Attributes
Creating "Large" HDF5 Attributes
- Problem
- You would like to use HDF5 attributes whose values are larger than a few dozen kilobytes.
- Solution
- A file format change in the HDF5 1.8.x family of library releases made it possible to have attributes larger than about 64 KiB. Ensure that the lower library version bound for new HDF5 item creation is at least 1.8.x, and create larger attributes as usual.
In the example below, we create an attribute whose value occupies 4 MiB.
- Note
- This feature is supported only in HDF5 1.8.0 and later.
#include "hdf5.h"

#include <stdlib.h>

int main(void)
{
    /* Local-label declaration; a GCC/Clang extension used for goto-based cleanup. */
    __label__ fail_attr, fail_aspace, fail_fapl, fail_file;
    hid_t fapl, file, aspace, attr;
    int   ret_val = EXIT_SUCCESS;

    if ((fapl = H5Pcreate(H5P_FILE_ACCESS)) == H5I_INVALID_HID) {
        ret_val = EXIT_FAILURE;
        goto fail_fapl;
    }
    /* Set the lower library version bound to at least 1.8 so that new
       items are created with a file format that supports large attributes. */
#if H5_VERSION_GE(1, 10, 0)
    if (H5Pset_libver_bounds(fapl, H5F_LIBVER_V18, H5F_LIBVER_LATEST) < 0) {
#elif H5_VERSION_GE(1, 8, 0)
    if (H5Pset_libver_bounds(fapl, H5F_LIBVER_LATEST, H5F_LIBVER_LATEST) < 0) {
#else
#error Only HDF5 1.8.x and later supported.
#endif
        ret_val = EXIT_FAILURE;
        goto fail_file;
    }
    /* File and attribute names below are illustrative. */
    if ((file = H5Fcreate("attr.h5", H5F_ACC_TRUNC, H5P_DEFAULT, fapl)) == H5I_INVALID_HID) {
        ret_val = EXIT_FAILURE;
        goto fail_file;
    }

    /* A 1D dataspace of 1,048,576 elements: with a 4-byte integer
       element type, the attribute value occupies 4 MiB. */
    if ((aspace = H5Screate_simple(1, (hsize_t[]){1024 * 1024}, NULL)) == H5I_INVALID_HID) {
        ret_val = EXIT_FAILURE;
        goto fail_aspace;
    }
    if ((attr = H5Acreate(file, "4MiB", H5T_STD_I32LE, aspace, H5P_DEFAULT, H5P_DEFAULT)) ==
        H5I_INVALID_HID) {
        ret_val = EXIT_FAILURE;
        goto fail_attr;
    }

    H5Aclose(attr);
fail_attr:
    H5Sclose(aspace);
fail_aspace:
    H5Fclose(file);
fail_file:
    H5Pclose(fapl);
fail_fapl:;

    return ret_val;
}
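The example above creates the attribute but never writes its value. Writing it works exactly as for a small attribute; the following is a minimal sketch, assuming the attr handle created above (the helper name write_value is ours, and error handling is kept deliberately simple):

#include "hdf5.h"
#include <stdlib.h>

/* Fill and write the full 4 MiB value; H5Awrite transfers the
   complete attribute value in a single call. */
static int write_value(hid_t attr)
{
    int *buf = malloc(1024 * 1024 * sizeof(*buf));
    if (buf == NULL)
        return -1;
    for (size_t i = 0; i < 1024 * 1024; ++i)
        buf[i] = (int)i; /* arbitrary sample data */
    herr_t status = H5Awrite(attr, H5T_NATIVE_INT, buf);
    free(buf);
    return status < 0 ? -1 : 0;
}

In the example, such a call would slot in just before H5Aclose(attr). The library converts the in-memory H5T_NATIVE_INT values to the attribute's H5T_STD_I32LE datatype as needed.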
- Discussion
- Large attributes are supported only in HDF5 versions 1.8.0 and later. Applications linked against earlier library versions cannot read files that contain them, so whether that compatibility trade-off is acceptable is your call.
Since large attributes have no size limit, it might be tempting to treat them as stand-ins for datasets. They are not a substitute, for at least two reasons:
- Attributes decorate HDF5 objects, have their own local namespace, and can't be link targets.
- Attribute I/O treats the attribute value as atomic, i.e., there is no support for partial I/O. A large attribute is always read or written in its entirety, as the sketch below illustrates.
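To make the second point concrete, here is a minimal sketch contrasting the two kinds of I/O. It assumes a file handle whose file contains a hypothetical 1D, 1,048,576-element integer dataset named "data" alongside the "4MiB" attribute from the recipe; error handling is elided:

#include "hdf5.h"
#include <stdlib.h>

void contrast(hid_t file)
{
    /* Datasets support partial I/O: select a 1024-element hyperslab
       in the file dataspace and read only that piece. */
    int   chunk[1024];
    hid_t dset   = H5Dopen(file, "data", H5P_DEFAULT);
    hid_t fspace = H5Dget_space(dset);
    hid_t mspace = H5Screate_simple(1, (hsize_t[]){1024}, NULL);
    H5Sselect_hyperslab(fspace, H5S_SELECT_SET, (hsize_t[]){0}, NULL,
                        (hsize_t[]){1024}, NULL);
    H5Dread(dset, H5T_NATIVE_INT, mspace, fspace, H5P_DEFAULT, chunk);
    H5Sclose(mspace);
    H5Sclose(fspace);
    H5Dclose(dset);

    /* Attributes are all-or-nothing: H5Aread accepts only a memory
       datatype and a buffer, so the entire 4 MiB value is transferred. */
    hid_t attr  = H5Aopen(file, "4MiB", H5P_DEFAULT);
    int  *whole = malloc(1024 * 1024 * sizeof(*whole));
    H5Aread(attr, H5T_NATIVE_INT, whole);
    free(whole);
    H5Aclose(attr);
}

Note that H5Aread has no memory- or file-dataspace parameters through which a selection could be passed; that absence is what rules out partial attribute I/O.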
- See Also
- See Maintaining Compatibility with other HDF5 Library Versions for HDF5 compatibility implications.