MPI periodic gravity

Matthieu Schaller requested to merge mpi_periodic_gravity into master

Here is where I am with splitting the periodic gravity calculation over MPI.

The run survives for some steps but then gets stuck in reproducible ways. For instance, running `mpirun -np 4 swift_mpi -s -S -c -G -t 4 eagle_12.yml -v 1` always gets stuck on step 43. We end up with an unbalanced number of send/recv tasks on that step, with one node holding an extra unmatched recv that blocks the calculation. Note that this does not happen directly after a rebuild, but it may involve cells that have had no action performed on them since a rebuild.
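For debugging, here is a minimal sketch of the kind of cross-check that would pinpoint the unmatched pair. It is not SWIFT code: in practice the per-peer counts would come from walking the activated send/recv tasks, while here they are faked, with one extra recv planted on rank 0 to reproduce the signature of the hang. Every rank tells every peer how many messages it will send via `MPI_Alltoall`, and each rank compares that against the recvs it has active.

```c
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv) {
  MPI_Init(&argc, &argv);
  int nr_ranks, my_rank;
  MPI_Comm_size(MPI_COMM_WORLD, &nr_ranks);
  MPI_Comm_rank(MPI_COMM_WORLD, &my_rank);

  /* In SWIFT these counts would be filled by walking the activated
   * send/recv tasks; here we fake symmetric counts and break one pair. */
  int *sends_to = calloc(nr_ranks, sizeof(int));
  int *recvs_from = calloc(nr_ranks, sizeof(int));
  for (int peer = 0; peer < nr_ranks; ++peer)
    if (peer != my_rank) sends_to[peer] = recvs_from[peer] = 2;
  if (my_rank == 0 && nr_ranks > 1) recvs_from[1] += 1; /* extra unmatched recv */

  /* Every rank learns how many messages each peer intends to send it. */
  int *incoming = malloc(nr_ranks * sizeof(int));
  MPI_Alltoall(sends_to, 1, MPI_INT, incoming, 1, MPI_INT, MPI_COMM_WORLD);

  /* A mismatch here is exactly the hang described above: a recv that no
   * send will ever satisfy, or a send that nobody is waiting for. */
  for (int peer = 0; peer < nr_ranks; ++peer)
    if (incoming[peer] != recvs_from[peer])
      fprintf(stderr,
              "[rank %d] peer %d will send %d message(s) but %d recv(s) are active\n",
              my_rank, peer, incoming[peer], recvs_from[peer]);

  free(sends_to);
  free(recvs_from);
  free(incoming);
  MPI_Finalize();
  return 0;
}
```

Running this at the start of the offending step (before the tasks are launched) would name the rank and peer whose pair is unbalanced, rather than waiting for the deadlock.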

I have looked at the obvious things that could prevent the task activation mechanism from making a correct, symmetric decision, but without success so far. I will come back to this in a few days once other commitments have passed. As always, any comments or suggestions are welcome.
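For reference, a toy illustration of the invariant the activation mechanism has to satisfy (hypothetical names, not SWIFT's actual structures): the predicate deciding whether a send/recv pair is needed must only read data that is identical in the local and foreign copies of a cell. If a foreign copy goes stale, e.g. for a cell untouched since the last rebuild, the two ranks can disagree:

```c
#include <stdio.h>

/* Toy cell carrying only the field the predicate reads. */
struct cell {
  long long ti_end_min; /* earliest time-step end among the cell's particles */
};

/* Activation predicate: must depend only on data kept identical in the
 * local and foreign copies of the cell. */
static int gravity_exchange_needed(const struct cell *c, long long ti_current) {
  return c->ti_end_min == ti_current;
}

int main(void) {
  const long long ti_current = 100;
  struct cell local = {.ti_end_min = 100};  /* up-to-date copy on the owner */
  struct cell foreign = {.ti_end_min = 99}; /* stale copy on the other rank */

  /* If the two copies drift apart, one rank activates its half of the
   * send/recv pair while the other does not; that is the hang above. */
  printf("owner activates send: %d\n", gravity_exchange_needed(&local, ti_current));
  printf("peer activates recv:  %d\n", gravity_exchange_needed(&foreign, ti_current));
  return 0;
}
```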
