
Add dedicated zlib codec option #578

Merged
MarkRivers merged 6 commits into areaDetector:master from jwlodek:zlib-codec on Apr 20, 2026

Conversation

Member

@jwlodek jwlodek commented Apr 16, 2026

I'm working with a detector that produces zlib-compressed arrays and would like to be able to mark the NDArray as ZLIB compressed. Still a WIP that needs testing. Since tested with arrays directly from the detector and with NDPluginCodec, with decompression in ImageJ.

Member Author

jwlodek commented Apr 20, 2026

I've added unit tests for NDPluginCodec which test the new zlib compress/decompress option (as well as all the other compression algorithms). Will keep the PR marked as a draft until I have a chance to test with my detector.

@MarkRivers
Member

@jwlodek this is great.

For consistency with other plugins I think it also needs:

  • Adding ZlibCLevel to NDPluginCodec.adl.
  • Adding zlib code to ADSupport so it can be built on Windows and other platforms without packages.
  • If the zlib library does not already provide a single function that can decompress an array, one needs to be added and built as a shared library that can be called from Java and Python.
  • ADViewers needs to add ImageJ support for decompressing NTNDArrays that are compressed with zlib.
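On the last two points above, zlib itself already exposes one-call decompression (`uncompress()` in C, `zlib.decompress()` in Python, `java.util.zip.Inflater` in Java), so viewer-side support may not need a new shared library. A minimal sketch of the round trip a viewer would perform on an NTNDArray payload, using Python's standard-library zlib binding:

```python
import zlib

# Stand-in for a detector frame: a compressible byte pattern.
raw = bytes(range(256)) * 64

# What the driver or NDPluginCodec does before publishing over PVA.
compressed = zlib.compress(raw, 1)

# Decompression is a single call -- the operation an ImageJ plugin can
# do with java.util.zip.Inflater on the compressed NTNDArray payload.
restored = zlib.decompress(compressed)
assert restored == raw
```

The frame contents here are illustrative; the real payload would arrive as the NTNDArray value field.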

Member Author

jwlodek commented Apr 20, 2026

I've updated the ADL screen with the zlib compression level; I'd appreciate a look at it, since I haven't done much editing of MEDM screens. For the autoconversion, would you be able to run that by any chance? I only have the environment for Phoebus autoconversion set up at the moment, not all of the other formats.

@MarkRivers
Member

That looks fine. I will be happy to do the autoconversion as soon as this is merged.

Member Author

jwlodek commented Apr 20, 2026

ADSupport already seems to build zlib - so should be OK on that front, and I made a PR adding the support for this in the viewers: areaDetector/ADViewers#29. Need to make an end-to-end test, will try to get to that by the end of this week.

Member Author

jwlodek commented Apr 20, 2026


Was able to test with my detector outputting zlib compressed arrays over PVA to ImageJ.

I also tested NDPluginCodec decoding the zlib-compressed frame, a second instance compressing it back with zlib at a higher compression level, and outputting to PVA.


It all seems to work, but I'd appreciate a second live test.
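The re-compression test above trades a higher ZlibCLevel for smaller output; either level decompresses to an identical frame. A small sketch of that trade-off with Python's standard-library zlib (the frame contents are illustrative):

```python
import zlib

# Stand-in for a repetitive detector frame.
frame = bytes(range(256)) * 256

fast = zlib.compress(frame, 1)   # ZlibCLevel = 1: fastest, usually larger
best = zlib.compress(frame, 9)   # ZlibCLevel = 9: slowest, usually smallest

# Both decompress back to the identical frame regardless of level.
assert zlib.decompress(fast) == frame
assert zlib.decompress(best) == frame
print(len(frame), len(fast), len(best))
```

This is why a second NDPluginCodec instance can re-compress at a different level without losing data.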

@jwlodek jwlodek marked this pull request as ready for review April 20, 2026 20:10
Member Author

jwlodek commented Apr 20, 2026

Ok, so everything is working except writing zlib pre-compressed arrays with the HDF5 plugin. Need to double-check that.

Member Author

jwlodek commented Apr 20, 2026


It seems that if I create the compressed array in NDPluginCodec and pass it to the HDF5 plugin, it works without issue. If I feed it the compressed array from my driver, I get:

INFO | ADXSPD::acquisitionThread: Received frame number 1, trigger number 0, status code 0, size 205, 18369 bytes
2026/04/20 16:42:46.558 NDFileHDF5Dataset::writeFile ERROR Unable to write pre-compressed data - mismatched chunk definition
2026/04/20 16:42:46.558 NDFileHDF5::writeFile ERROR: could not write to dataset. Aborting
2026/04/20 16:42:46.558 NDPluginFile::writeFileBase Error writing file, status=3
./st.cmd: line 6: 3326845 Segmentation fault      (core dumped) ../../bin/linux-x86_64/xspdApp st_base.cmd

Will investigate further tomorrow.

Member

MarkRivers commented Apr 20, 2026

Ok, so everything is working except writing Zlib pre-compressed arrays with the HDF5 plugin. Need to double check that.

That could be due to a similar issue with lz4 and bslz4 where one needs to add a header to the compressed data block:

else if (pArray->codec.name == codecName[NDCODEC_LZ4]) {

else if (pArray->codec.name == codecName[NDCODEC_BSLZ4]) {

Member Author

jwlodek commented Apr 20, 2026

Ok, I was able to resolve it. The issue was that I was setting pArray->codec.compressor = NDCODEC_ZLIB in my driver when I shouldn't have been. If I remove that, it works without issue. I think this is ready for review and a third-party test.

Essentially, the == operator compares all of the codec's fields, and since on the driver side compressor was set to 3 (zlib) while on the HDF5 side it was expected to be -1 (unset), it concluded that the codecs didn't match.
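The field-wise comparison can be illustrated with a hypothetical Python analogue of the codec descriptor (dataclass equality, like a default C++ operator==, compares every field; the `level` and `shuffle` fields here are assumed for illustration, while `name`, `compressor`, and NDCODEC_ZLIB = 3 come from the discussion above):

```python
from dataclasses import dataclass

@dataclass
class Codec:
    """Hypothetical analogue of the NDArray codec descriptor."""
    name: str = ""
    level: int = -1      # assumed field, for illustration
    shuffle: int = -1    # assumed field, for illustration
    compressor: int = -1  # -1 means "unset"

NDCODEC_ZLIB = 3  # illustrative value from the discussion

driver_codec = Codec(name="zlib", compressor=NDCODEC_ZLIB)  # the bug
plugin_codec = Codec(name="zlib")                           # compressor unset

# Dataclass equality compares every field, so these do NOT match.
assert driver_codec != plugin_codec

driver_codec.compressor = -1   # the fix: leave compressor unset in the driver
assert driver_codec == plugin_codec
```

This is an analogy only; the actual areaDetector comparison lives in the C++ codec code.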

@MarkRivers
Member

This looks good to me. I will merge it.

@MarkRivers MarkRivers merged commit a75009b into areaDetector:master Apr 20, 2026
8 checks passed
@MarkRivers
Member

I tested with ADSimDetector and was able to display zlib-compressed data in ImageJ and stream it to an HDF5 file. h5dump reported the correct information:

         DATASET "data" {
            DATATYPE  H5T_STD_U8LE
            DATASPACE  SIMPLE { ( 10, 1024, 1024 ) / ( 10, 1024, 1024 ) }
            STORAGE_LAYOUT {
               CHUNKED ( 1, 1024, 1024 )
               SIZE 3513953 (2.984:1 COMPRESSION)
            }
            FILTERS {
               COMPRESSION DEFLATE { LEVEL 3 }
            }
            FILLVALUE {
               FILL_TIME H5D_FILL_TIME_IFSET
               VALUE  0
            }
            ALLOCATION_TIME {
               H5D_ALLOC_TIME_INCR
            }
            DATA {
            (0,0,0): 238, 240, 241, 242, 242, 243, 245, 245, 247, 247, 249,
            (0,0,11): 250, 250, 251, 252, 253, 254, 255, 1, 1, 2, 4, 4, 5, 6,
            (0,0,25): 7, 8, 10, 11, 11, 12, 13, 14, 16, 16, 18, 18, 20, 20,

@MarkRivers
Member

I tweaked NDPCodec.adl and did the autoconversions to the other OPI formats. Pushed to master.
