Sanitize smoothing lengths
Here is a proposal to address problems such as #211 (closed).
At start-up, after having read in the particles and constructed the top-level cells and all their progenitors, we do a quick pass to sanitize the whole thing. If a top-level cell has more than 100000 particles (say) but has an h_max
that prevents the tasks associated with this cell from being split, we do the following:
- Compute the geometric mean and standard deviation of the smoothing lengths in that cell.
- Limit the smoothing lengths of all particles in this cell to mean + 4 sigma (say).
Recall that the cells themselves are always split, so the scheduler will be able to push the tasks down at a later point. This procedure is applied to the top-level cells only; a minimal sketch of the clipping step is given below.
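As an illustration only, here is a minimal sketch of the sanitizing pass, not the actual implementation. The `struct part` and `struct cell` layouts and the function name `cell_sanitize_h` are hypothetical stand-ins, and "mean + 4 sigma" is interpreted in log-space (i.e. geometric mean times geometric sigma to the fourth power), which is one natural reading when geometric statistics are used:

```c
#include <math.h>

/* Hypothetical stand-ins for the particle and cell structures;
 * the field names are illustrative only. */
struct part {
  float h; /* smoothing length */
};

struct cell {
  struct part *parts; /* particles contained in this top-level cell */
  int count;          /* number of particles in the cell */
};

/* Clip outlier smoothing lengths in a cell: compute the geometric mean
 * and geometric standard deviation of h, then limit every h to the
 * "mean + n_sigma sigma" level, taken in log-space. */
void cell_sanitize_h(struct cell *c, const float n_sigma) {

  if (c->count == 0) return;

  /* Accumulate the mean and variance of log(h). */
  double sum = 0., sum2 = 0.;
  for (int i = 0; i < c->count; ++i) {
    const double logh = log((double)c->parts[i].h);
    sum += logh;
    sum2 += logh * logh;
  }
  const double mu = sum / c->count;
  const double var = sum2 / c->count - mu * mu;
  const double sigma = sqrt(var > 0. ? var : 0.);

  /* Geometric mean + n_sigma geometric deviations, back in linear space. */
  const float h_limit = (float)exp(mu + n_sigma * sigma);

  /* Clip the outliers. */
  for (int i = 0; i < c->count; ++i)
    if (c->parts[i].h > h_limit) c->parts[i].h = h_limit;
}
```

This would be called once, right after reading the particles and building the cell hierarchy, and only for the top-level cells that exceed the particle-count threshold and whose h_max blocks the task splitting.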
Does that make sense to you?