  • #116
Closed
Issue created Mar 07, 2016 by Bert Vandenbroucke (@bvandenbroucke), Maintainer

Default chunk size in single_io too large for small particle numbers

One of my GIZMO tests with only 50,000 particles crashes when writing a snapshot. Relevant part of the output:

  #010: ../../../src/H5Dchunk.c line 540 in H5D__chunk_construct(): chunk size must be <= maximum dimension size for fixed-sized dimensions
    major: Dataset
    minor: Unable to initialize object
single_io.c:writeArrayBackEnd():228: Error while creating dataspace 'Coordinates'.

The problem is solved by changing chunk_shape[0] = 1 << 16 in single_io.c to a smaller value. I suggest adding a check that resets chunk_shape[0] to N whenever N is smaller than 1 << 16; see the sketch below.
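A minimal sketch of such a check, assuming it sits next to the existing chunk_shape in single_io.c:writeArrayBackEnd(). Only chunk_shape, N and the 1 << 16 default come from the issue; the function name, the second (per-particle) dimension dim and the surrounding HDF5 property-list calls are illustrative.

  #include <hdf5.h>

  /* Illustrative sketch: build the dataset-creation property list with a
   * chunk size clamped to the particle count N. */
  static hid_t make_chunked_plist(hsize_t N, hsize_t dim) {

    /* Current default: chunks of 2^16 particles along the first dimension. */
    hsize_t chunk_shape[2] = {1 << 16, dim};

    /* Proposed check: an HDF5 chunk may not exceed a fixed dataset dimension,
     * so reset chunk_shape[0] to N whenever N < 1 << 16. */
    if (chunk_shape[0] > N) chunk_shape[0] = N;

    hid_t h_prop = H5Pcreate(H5P_DATASET_CREATE);
    if (h_prop < 0 || H5Pset_chunk(h_prop, 2, chunk_shape) < 0) return -1;
    return h_prop;
  }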
