SWIFT / SWIFTsim / Commits

Commit de3cc54d, authored 7 years ago by Matthieu Schaller

Make the code crash if a compressed field is read with parallel-hdf5 version 1.10.2.
Parent: d6e66ab9
1 merge request: !559 "Hdf5 1.10.2 parallel read workaround"
Showing 1 changed file: src/parallel_io.c (+48, -0)
...
@@ -206,6 +206,54 @@ void readArray(hid_t grp, struct io_props props, size_t N, long long N_total,
```c
  const hid_t h_data = H5Dopen2(grp, props.name, H5P_DEFAULT);
  if (h_data < 0) error("Error while opening data space '%s'.", props.name);

  /* Parallel-HDF5 1.10.2 incorrectly reads data that was compressed */
  /* We detect this here and crash with an error message instead of */
  /* continuing with garbage data. */
#if H5_VERSION_LE(1, 10, 2) && H5_VERSION_GE(1, 10, 2)
  if (mpi_rank == 0) {

    /* Recover the list of filters that were applied to the data */
    const hid_t h_plist = H5Dget_create_plist(h_data);
    if (h_plist < 0)
      error("Error getting property list for data set '%s'", props.name);

    /* Recover the number of filters in the list */
    const int n_filters = H5Pget_nfilters(h_plist);

    for (int n = 0; n < n_filters; ++n) {

      unsigned int flag;
      size_t cd_nelmts = 10;
      unsigned int *cd_values = malloc(cd_nelmts * sizeof(unsigned int));
      size_t namelen = 256;
      char *name = calloc(namelen, sizeof(char));
      unsigned int filter_config;

      /* Recover the n^th filter in the list */
      const H5Z_filter_t filter =
          H5Pget_filter(h_plist, n, &flag, &cd_nelmts, cd_values, namelen,
                        name, &filter_config);
      if (filter < 0)
        error("Error retrieving %d^th (%d) filter for data set '%s'", n,
              n_filters, props.name);

      /* Now check whether the deflate filter had been applied */
      if (filter == H5Z_FILTER_DEFLATE)
        error(
            "HDF5 1.10.2 cannot correctly read data that was compressed with "
            "the 'deflate' filter.\nThe field '%s' has had this filter applied "
            "and the code would silently read garbage into the particle arrays "
            "so we'd rather stop here. You can:\n - Recompile the code with an "
            "earlier or newer version of HDF5.\n - Use the 'h5repack' tool to "
            "remove the filter from the ICs.",
            props.name);

      free(name);
      free(cd_values);
    }
  }
#endif

  /* Create property list for collective dataset read. */
  const hid_t h_plist_id = H5Pcreate(H5P_DATASET_XFER);
  H5Pset_dxpl_mpio(h_plist_id, H5FD_MPIO_COLLECTIVE);
```
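The preprocessor guard in the added block pins exactly one release by combining `H5_VERSION_LE(1, 10, 2)` with `H5_VERSION_GE(1, 10, 2)`: a value that is both less-or-equal and greater-or-equal to 1.10.2 can only be 1.10.2 itself. A minimal sketch of that version-pinning logic using Python tuples (the version triples below are illustrative, not from the commit):

```python
# Version triples compare lexicographically, which matches how the
# H5_VERSION_LE / H5_VERSION_GE macros compare (major, minor, release).
def version_le(v, ref):
    return v <= ref

def version_ge(v, ref):
    return v >= ref

def is_affected_release(v, bad=(1, 10, 2)):
    # LE(bad) && GE(bad) is true only for the single release `bad`,
    # mirroring `#if H5_VERSION_LE(1,10,2) && H5_VERSION_GE(1,10,2)`.
    return version_le(v, bad) and version_ge(v, bad)

print(is_affected_release((1, 10, 2)))  # True: the buggy release
print(is_affected_release((1, 10, 1)))  # False: an earlier release
print(is_affected_release((1, 10, 3)))  # False: a later release
```

This is why the guard reads oddly at first glance: it is an equality test written with the only two comparison macros HDF5 provides.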
...
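The error message points affected users at `h5repack`. A hedged example of the second suggested fix (the file names are placeholders, not from the commit; `-f NONE` asks h5repack to strip all filters, including 'deflate'):

```shell
# Rewrite the initial-conditions file without any filters so that
# parallel-HDF5 1.10.2 can read it; ICs.hdf5 and ICs_uncompressed.hdf5
# are placeholder names for your own files.
h5repack -f NONE ICs.hdf5 ICs_uncompressed.hdf5
```

The output file is larger but byte-for-byte equivalent in content, so no re-generation of the ICs is needed.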