
WIP: Concurrent hydro interactions via particle-carried spin locks

Closed Matthieu Schaller requested to merge locked_hydro into master
1 unresolved thread

Here is a first-draft implementation of the atomic tasks for the hydro scheme.

To simplify things, each particle carries its own lock, and we try to acquire it inside the tasks once we identify that a given particle will be updated. This avoids unnecessarily locking an entire array of particles when, for instance, writing back from a cache.
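
To make the idea concrete, here is a minimal, self-contained sketch of a cache write-back guarded by per-particle spin locks. The struct, the GCC `__sync` builtins and all names here are stand-ins for illustration only; the branch itself uses SWIFT's `struct part`, `atomic_cas()` and the `swift_particle_lock_*` macros from `src/part_lock.h`.

/* Illustrative stand-in for a particle carrying its own spin lock. */
struct demo_part {
  volatile int lock; /* 0 == unlocked */
  float rho;         /* example field accumulated by the tasks */
};

/* Spin until the particle's lock can be taken (compare-and-swap 0 -> 1). */
static void demo_lock(struct demo_part *p) {
  while (__sync_val_compare_and_swap(&p->lock, 0, 1) != 0) {
    /* busy-wait; another thread is currently updating this particle */
  }
}

/* Release the particle so other threads can update it. */
static void demo_unlock(struct demo_part *p) { p->lock = 0; }

/* Write back buffered contributions from a task-local cache, locking only
 * the particles that are actually touched rather than a whole array. */
void demo_cache_write_back(struct demo_part **targets, const float *drho,
                           int count) {
  for (int i = 0; i < count; i++) {
    demo_lock(targets[i]);
    targets[i]->rho += drho[i];
    demo_unlock(targets[i]);
  }
}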

Thoughts on this are welcome! It does yield the correct answer and is a bit faster; only a bit, as the hydro was already pretty well optimised. @pdraper any thoughts from you as well on this?

Note to self: need to deal with MPI, probably by re-initialising the lock upon receiving the data.
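
As a rough sketch of that MPI point, the receive side could simply reset each incoming particle's lock before the data is used locally, since the lock word arriving over the wire holds whatever value the sending rank had. The function name below is an assumption for illustration (reusing the demo_part type from the sketch above), not the actual SWIFT recv task.

/* After unpacking a received buffer of particles, reset every lock to
 * "unlocked"; this is what swift_particle_lock_init() would be used for. */
void demo_reset_locks_on_recv(struct demo_part *parts, int count) {
  for (int k = 0; k < count; k++)
    parts[k].lock = 0;
}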

Edited by Matthieu Schaller

Activity

src/part_lock.h 0 → 100644
/**
 * @brief Initialise the lock of a particle.
 *
 * @param p Pointer to the particle (part, spart, gpart, bpart,...)
 */
#define swift_particle_lock_init(p) ((p)->lock = 0)

/**
 * @brief Lock a particle for access within this thread.
 *
 * The thread will spin until the lock can be acquired.
 *
 * @param p Pointer to the particle (part, spart, gpart, bpart,...)
 */
#define swift_particle_lock_lock(p) \
  ({ \
    while (atomic_cas(&((p)->lock), 0, 1) != 0) { \
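
For context, the intended call pattern in a task would presumably look like the sketch below; `swift_particle_lock_unlock()` and the field being updated are assumptions on my part, as the excerpt above only shows the init and lock macros.

/* Sketch: guard an update to a single particle inside a task.
 * swift_particle_lock_unlock() is assumed to exist alongside the macros above. */
static void demo_accumulate(struct part *pj, float drho) {
  swift_particle_lock_lock(pj);   /* spin until pj is free */
  pj->rho += drho;                /* illustrative update of the held particle */
  swift_particle_lock_unlock(pj); /* release pj for other threads */
}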
  • added 2 commits

    • 3a7b5714 - Use the thread ID as the value of the particle-carried lock when locked
    • df745601 - Update the SPHENIX scheme to use the new checks

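The first commit above stores the locking thread's ID in the lock word instead of a plain 0/1 flag, which presumably enables the checks mentioned in the second commit (e.g. catching a thread that tries to re-lock a particle it already holds). A rough, self-contained sketch of that idea, not the actual implementation:

#include <assert.h>

/* 0 means unlocked; any non-zero value is (locking thread's ID + 1). */
struct demo_part_tid {
  volatile int lock;
};

static void demo_lock_tid(struct demo_part_tid *p, int tid) {
  const int me = tid + 1;      /* keep 0 reserved for "unlocked" */
  assert(p->lock != me);       /* this thread must not already hold it */
  while (__sync_val_compare_and_swap(&p->lock, 0, me) != 0) {
    /* spin until the current owner releases the particle */
  }
}

static void demo_unlock_tid(struct demo_part_tid *p, int tid) {
  assert(p->lock == tid + 1);  /* only the owner may release the lock */
  p->lock = 0;
}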

  • Matthieu Schaller changed the description

  • Matthieu Schaller marked as a Work In Progress
