Avoid crash when writing HEALPix maps with an HDF5 chunk size larger than the dataset
Fixes #862 (closed).
When nside is small, the lightcone map HDF5 dataset written by one MPI rank can be smaller than the requested chunk size. In that case we need to reduce the chunk size to the dataset size, or else H5Dcreate() fails.
This merge request also disables chunking of the map datasets when no filters are in use.
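The fix amounts to clamping the chunk dimensions to the local dataset extent before calling H5Dcreate(), and skipping H5Pset_chunk() entirely when no filters are requested. A minimal sketch of that logic, assuming hypothetical names (`create_map_dataset`, `num_pixels_local`, `use_filters`) rather than the actual SWIFT code:

```c
/*
 * Minimal sketch (not the actual SWIFT implementation) of clamping the HDF5
 * chunk size to the local dataset extent. Names such as create_map_dataset,
 * num_pixels_local and use_filters are illustrative assumptions.
 */
#include <hdf5.h>

hid_t create_map_dataset(hid_t file_id, const char *name,
                         hsize_t num_pixels_local, hsize_t chunk_size,
                         int use_filters) {

  /* Dataspace describing this rank's slice of the map (assumed non-empty) */
  hsize_t dims[1] = {num_pixels_local};
  hid_t space_id = H5Screate_simple(1, dims, NULL);

  /* Dataset creation property list */
  hid_t prop_id = H5Pcreate(H5P_DATASET_CREATE);

  if (use_filters) {
    /* Filters require a chunked layout, but the chunk must not exceed the
     * dataset size for fixed-size dimensions or H5Dcreate() fails. */
    hsize_t chunk_dims[1] = {chunk_size};
    if (chunk_dims[0] > dims[0]) chunk_dims[0] = dims[0];
    H5Pset_chunk(prop_id, 1, chunk_dims);
  }
  /* With no filters we keep the default contiguous layout, i.e. no
   * call to H5Pset_chunk() at all. */

  hid_t dset_id = H5Dcreate(file_id, name, H5T_NATIVE_DOUBLE, space_id,
                            H5P_DEFAULT, prop_id, H5P_DEFAULT);

  H5Pclose(prop_id);
  H5Sclose(space_id);
  return dset_id;
}
```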
Activity
assigned to @jch
This should fix #862 (closed) but I still need to check that I can reproduce the issue and that this branch resolves it.
requested review from @rttw52
If I run the SmallCosmovolume_lightcone example on master with nside=4 and hdf5_chunk_size=16384 it crashes as expected:
  #012: H5Doh.c line 273 in H5O__dset_create(): unable to create dataset
    major: Dataset
    minor: Unable to initialize object
  #013: H5Dint.c line 1296 in H5D__create(): unable to construct layout information
    major: Dataset
    minor: Unable to initialize object
  #014: H5Dchunk.c line 855 in H5D__chunk_construct(): chunk size must be <= maximum dimension size for fixed-sized dimensions
    major: Dataset
    minor: Unable to initialize object
[0001] [01355.9] ../../src/lightcone/lightcone_map.c:lightcone_map_write():253: Unable to create dataset TotalMass
Abort(-1) on node 1 (rank 1 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, -1) - process 1
The changes in this branch prevent the crash.
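For context, a back-of-the-envelope check (rank count assumed, not taken from the MR) of why this configuration has to hit the HDF5 limit: a HEALPix map has 12*nside^2 pixels, so with nside=4 the entire map is only 192 pixels, far below a 16384-element chunk.

```c
/* Illustration only: why nside = 4 with hdf5_chunk_size = 16384 violates
 * HDF5's "chunk size <= dataset size" rule. The rank count is assumed. */
#include <stdio.h>

int main(void) {
  const long long nside = 4;
  const long long npix = 12 * nside * nside; /* HEALPix: 12 * nside^2 = 192 */
  const long long chunk_size = 16384;
  const int n_ranks = 2;                     /* assumed MPI size */

  /* Each rank writes roughly npix / n_ranks elements of the map dataset. */
  const long long local_size = npix / n_ranks;

  printf("pixels per rank: %lld, requested chunk: %lld\n", local_size, chunk_size);
  if (chunk_size > local_size)
    printf("chunk exceeds dataset size -> H5Dcreate() rejects the layout\n");
  return 0;
}
```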
@rttw52 if that works for you, I'll merge.
@rttw52 ?
@rttw52 seems to confirm this works.
I guess we could also stop users from demanding very small nside.
mentioned in commit 2125f327