Commit 83bedfc4 authored by Peter W. Draper

Merge remote-tracking branch 'origin/master' into long-cmdline-arguments

parents 906e1324 f1b82ec2
......@@ -66,7 +66,7 @@ TODO
How to Implement a New Cooling
------------------------------
The developper should provide at least one function for:
The developer should provide at least one function for:
* writing the cooling name in HDF5
* cooling a particle
* the maximal time step possible
......
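For readers implementing a new cooling module, a minimal sketch of what those three entry points could look like is given below. The function names, argument lists and types here are purely illustrative (they are not SWIFT's actual cooling interface), and the bodies are left as comments.

.. code-block:: c

   /* Opaque stand-ins for the SWIFT structures a cooling module would touch;
    * only pointers to them are used in this sketch. */
   struct part;          /* hydro particle */
   struct xpart;         /* extended particle data */
   struct cooling_data;  /* the module's own parameters */

   /* 1. Write the cooling flavour name into the HDF5 snapshot output. */
   void cooling_write_flavour_sketch(void *h_grp) {
     /* e.g. attach a string attribute "my-cooling" to the group h_grp */
   }

   /* 2. Cool a single particle over the time-step dt. */
   void cooling_cool_part_sketch(const struct cooling_data *cooling,
                                 struct part *p, struct xpart *xp, double dt) {
     /* e.g. u_new = u_old - lambda(rho, u_old) * dt, floored at some u_min */
   }

   /* 3. Return the maximal time-step the cooling allows for this particle. */
   float cooling_timestep_sketch(const struct cooling_data *cooling,
                                 const struct part *p) {
     return 1.e10f; /* no restriction in this trivial sketch */
   }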
......@@ -17,7 +17,7 @@ give a short overview of the potentials that are implemented in the code:
1. No potential (none)
2. Point mass potential (point-mass): classical point mass, can be placed at
a position with a mass.
3. Plummer potential (point-mass-softened): in the code a softended point mass
3. Plummer potential (point-mass-softened): in the code a softened point mass
corresponds to a Plummer potential, can be placed at a position with a mass.
4. Isothermal potential (isothermal): An isothermal potential which corresponds
to a density profile which is :math:`\propto r^{-2}` and a potential which is
......@@ -28,7 +28,7 @@ give a short overview of the potentials that are implemented in the code:
:math:`\Phi(r) = - \frac{GM}{r+a}.`
The free paramters of Hernquist potential are mass, scale length,
The free parameters of Hernquist potential are mass, scale length,
and softening. The potential can be set at any position in the box.
6. NFW potential (nfw): The most used potential to describe dark matter halos, the
potential is given by:
......@@ -36,7 +36,7 @@ give a short overview of the potentials that are implemented in the code:
:math:`\Phi(r) = - \frac{4\pi G \rho_0 R_s^3}{r} \ln \left( 1+
\frac{r}{R_s} \right).`
This potential has as free paramters the concentration of the DM halo, the
This potential has as free parameters the concentration of the DM halo, the
virial mass (:math:`M_{200}`) and the critical density.
7. Sine wave (sine-wave)
8. Point mass ring (point-mass-ring)
......
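As a worked illustration of the Hernquist and NFW formulas quoted in the hunks above, the small program below evaluates both potentials. The numerical values (G in kpc (km/s)^2 / Msun and the halo parameters) are illustrative only and are not SWIFT defaults or parameter names.

.. code-block:: c

   #include <math.h>
   #include <stdio.h>

   /* Hernquist: Phi(r) = -G M / (r + a) */
   static double phi_hernquist(double G, double M, double a, double r) {
     return -G * M / (r + a);
   }

   /* NFW: Phi(r) = -(4 pi G rho_0 R_s^3 / r) * ln(1 + r / R_s) */
   static double phi_nfw(double G, double rho_0, double R_s, double r) {
     const double pi = 3.14159265358979323846;
     return -4.0 * pi * G * rho_0 * R_s * R_s * R_s * log(1.0 + r / R_s) / r;
   }

   int main(void) {
     const double G = 4.30092e-6; /* kpc (km/s)^2 Msun^-1, illustrative units */
     const double r = 10.0;       /* kpc */
     printf("Hernquist Phi = %g (km/s)^2\n", phi_hernquist(G, 1.0e12, 20.0, r));
     printf("NFW       Phi = %g (km/s)^2\n", phi_nfw(G, 9.0e6, 20.0, r));
     return 0;
   }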
......@@ -45,6 +45,6 @@ Several cooling implementations (including GRACKLE) are available.
Many external potentials are available for use with SWIFT. You can choose
between them at compile time. Some examples include a central potential, a
softened central potential, and a sinusoidal potential. You will need to
configure, for example, the mass in your parameterfile at runtime.
configure, for example, the mass in your parameter file at runtime.
......@@ -9,4 +9,4 @@ and the other ``swift_mpi``. Current wisdom is to run ``swift`` if you are only
using one node (i.e. without any interconnect), and one MPI rank per NUMA
region using ``swift_mpi`` for anything larger. You will need some GADGET-2
HDF5 initial conditions to run SWIFT, as well as a compatible yaml
parameterfile.
parameter file.
......@@ -13,7 +13,7 @@ Adding Hydro Schemes
SWIFT is engineered to enable you to add your own hydrodynamics schemes easily.
We enable this through the use of header files to encapsulate each scheme.
Note that it's unlikely you will ever have to consider paralellism or 'loops over
Note that it's unlikely you will ever have to consider parallelism or 'loops over
neighbours' for SWIFT; all of this is handled by the tasking system. All we ask
for is the interaction functions that tell us how to a) compute the density
and b) compute forces.
......@@ -69,7 +69,7 @@ will need to 'fill out' the following:
+ ``hydro_compute_timestep(p, xp, hydro_props, cosmo)`` returns the timestep for
the hydrodynamics particles.
+ ``hydro_timestep_extra(p, dt)`` does some extra hydro operations once the
physical timestel for the particle is known.
physical timestep for the particle is known.
+ ``hydro_init_part(p, hydro_space)`` initialises the particle in preparation for
the density calculation. This essentially sets properties, such as the density,
to zero.
......
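To make the 'fill out' step above concrete, here is a minimal sketch of stubs for the three functions just listed. The names and argument shapes follow the text, but the concrete types and return values are schematic; the real signatures live in SWIFT's hydro headers and may differ.

.. code-block:: c

   /* Opaque stand-ins for the SWIFT structures named in the text above. */
   struct part;
   struct xpart;
   struct hydro_props;
   struct cosmology;
   struct hydro_space;

   /* Return the hydro time-step for a particle (schematic). */
   float hydro_compute_timestep(const struct part *p, const struct xpart *xp,
                                const struct hydro_props *hydro_props,
                                const struct cosmology *cosmo) {
     /* e.g. a CFL-like condition: dt = C_CFL * h / v_sig */
     return 1.e10f;
   }

   /* Extra hydro operations once the physical time-step is known. */
   void hydro_timestep_extra(struct part *p, float dt) {
     /* often empty for simple schemes */
   }

   /* Reset the particle before the density loop (densities to zero, etc.). */
   void hydro_init_part(struct part *p, const struct hydro_space *hs) {
     /* e.g. set the density and neighbour-count accumulators to zero */
   }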
......@@ -10,7 +10,7 @@ GIZMO-Like Scheme
:caption: Contents:
There is a meshless finite volume (MFV) GIZMO-like scheme implemented in SWIFT
There is a mesh-less finite volume (MFV) GIZMO-like scheme implemented in SWIFT
(see Hopkins 2015 for more information). You will need a Riemann solver to run
this, and configure as follows:
......@@ -19,7 +19,7 @@ this, and configure as follows:
./configure --with-hydro="gizmo-mfv" --with-riemann-solver="hllc"
We also have the meshless finite mass (MFM) GIZMO-like scheme. You can select
We also have the mesh-less finite mass (MFM) GIZMO-like scheme. You can select
this at compile-time with the following configuration flags:
.. code-block:: bash
......
......@@ -22,7 +22,7 @@ place in SWIFT. A single file can contain any number of particles (well... up to
compute node.
The original GADGET-2 file format only contains 2 types of particles: gas
particles and 5 sorts of collisionless particles that allow users to run with 5
particles and 5 sorts of collision-less particles that allow users to run with 5
separate particle masses and softenings. In SWIFT, we expand on this by using
two of these types for stars and black holes.
......@@ -39,7 +39,7 @@ You can find out more about the HDF5 format on their `webpages
Structure of the File
---------------------
There are several groups that contain 'auxilliary' information, such as
There are several groups that contain 'auxiliary' information, such as
``Header``. Particle data is placed in separate groups depending of the type of
the particles. Some types are currently ignored by SWIFT but are kept in the
file format for compatibility reasons.
......@@ -102,7 +102,7 @@ In the ``/Header/`` group, the following attributes are required:
``NumPart_Total`` to be >2^31, the use of ``NumPart_Total_HighWord`` is only
here for compatibility reasons.
+ ``Flag_Entropy_ICs``, a historical value that tells the code if you have
included entropy or internal energy values in your intial conditions files.
included entropy or internal energy values in your initial conditions files.
Acceptable values are 0 or 1. We recommend using internal energies over
entropy in the ICs and hence have this flag set to 0.
......@@ -147,7 +147,7 @@ individual particle type (e.g. ``/PartType0/``) that have the following *dataset
+ ``Masses``, an array of length N that gives the masses of the particles.
For ``PartType0`` (i.e. particles that interact through hydro-dynamics), you will
need the following auxilliary items:
need the following auxiliary items:
+ ``SmoothingLength``, the smoothing lengths of the particles. These will be
tidied up a bit, but it is best if you provide accurate numbers. In
......@@ -169,7 +169,7 @@ h-free quantities. Switching this parameter on will also affect the box size
read from the ``/Header/`` group (see above).
Similarly, GADGET cosmological ICs have traditionally used velocities expressed
as peculiar velocities divided by ``sqrt(a)``. This can be undone by swicthing
as peculiar velocities divided by ``sqrt(a)``. This can be undone by switching
on the parameter ``InitialConditions:cleanup_velocity_factors`` in the
:ref:`Parameter_File_label`.
......
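As a rough sketch of what writing such a file looks like with the HDF5 C API, the fragment below creates a ``/Header`` group carrying the ``Flag_Entropy_ICs`` attribute (set to 0, as recommended above) and a ``/PartType0/Masses`` dataset for a handful of particles. A real SWIFT initial-conditions file needs all of the attributes and datasets listed above; the values here are placeholders.

.. code-block:: c

   #include <hdf5.h>

   int main(void) {
     int flag_entropy = 0;                    /* ICs carry internal energies */
     double masses[4] = {1.0, 1.0, 1.0, 1.0}; /* placeholder values          */

     hid_t file = H5Fcreate("tiny_ics.hdf5", H5F_ACC_TRUNC, H5P_DEFAULT,
                            H5P_DEFAULT);

     /* /Header group with a single attribute (a real file needs them all). */
     hid_t header = H5Gcreate(file, "/Header", H5P_DEFAULT, H5P_DEFAULT,
                              H5P_DEFAULT);
     hid_t ascalar = H5Screate(H5S_SCALAR);
     hid_t attr = H5Acreate(header, "Flag_Entropy_ICs", H5T_NATIVE_INT,
                            ascalar, H5P_DEFAULT, H5P_DEFAULT);
     H5Awrite(attr, H5T_NATIVE_INT, &flag_entropy);
     H5Aclose(attr);
     H5Sclose(ascalar);
     H5Gclose(header);

     /* /PartType0/Masses dataset of length N. */
     hid_t gas = H5Gcreate(file, "/PartType0", H5P_DEFAULT, H5P_DEFAULT,
                           H5P_DEFAULT);
     hsize_t n = 4;
     hid_t dspace = H5Screate_simple(1, &n, NULL);
     hid_t dset = H5Dcreate(gas, "Masses", H5T_NATIVE_DOUBLE, dspace,
                            H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
     H5Dwrite(dset, H5T_NATIVE_DOUBLE, H5S_ALL, H5S_ALL, H5P_DEFAULT, masses);
     H5Dclose(dset);
     H5Sclose(dspace);
     H5Gclose(gas);

     H5Fclose(file);
     return 0;
   }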
......@@ -24,7 +24,7 @@ Comments can be inserted anywhere and start with a hash:
.. code:: YAML
# Descrption of the physics
# Description of the physics
viscosity_alpha: 2.0
dt_max: 1.5 # seconds
......@@ -113,7 +113,7 @@ schemes that make use of the unit of electric current. There is also
no incentive to use anything else than Kelvin but that makes the whole
system consistent with any possible unit system.
If one is interested in using the more humourous `FFF unit
If one is interested in using the more humorous `FFF unit
system <https://en.wikipedia.org/wiki/FFF_system>`_ one would use
.. code:: YAML
......@@ -133,8 +133,8 @@ Cosmology
---------
When running a cosmological simulation, the section ``Cosmology`` sets the values of the
cosmological model. The epanded :math:`\Lambda\rm{CDM}` parameters governing the
background evolution of the Univese need to be specified here. These are:
cosmological model. The expanded :math:`\Lambda\rm{CDM}` parameters governing the
background evolution of the Universe need to be specified here. These are:
* The reduced Hubble constant: :math:`h`: ``h``,
* The matter density parameter :math:`\Omega_m`: ``Omega_m``,
......@@ -146,7 +146,7 @@ The last parameter can be omitted and will default to :math:`\Omega_r = 0`. Note
that SWIFT will verify on start-up that the matter content of the initial conditions
matches the cosmology specified in this section.
This section als specifies the start and end of the simulation expressed in
This section also specifies the start and end of the simulation expressed in
terms of scale-factors. The two parameters are:
* Initial scale-factor: ``a_begin``,
......@@ -157,7 +157,7 @@ state of dark energy :math:`w(a)`. We use the evolution law :math:`w(a) =
w_0 + w_a (1 - a)`. The two parameters in the YAML file are:
* The :math:`z=0` dark energy equation of state parameter :math:`w_0`: ``w_0``
* The dark energy equation of state evolutio parameter :math:`w_a`: ``w_a``
* The dark energy equation of state evolution parameter :math:`w_a`: ``w_a``
If unspecified these parameters default to the default
:math:`\Lambda\rm{CDM}` values of :math:`w_0 = -1` and :math:`w_a = 0`.
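A one-line worked example of the evolution law quoted above, w(a) = w_0 + w_a (1 - a), evaluated with the default values:

.. code-block:: c

   #include <stdio.h>

   /* Dark-energy equation of state, w(a) = w_0 + w_a * (1 - a). */
   static double w_of_a(double w_0, double w_a, double a) {
     return w_0 + w_a * (1.0 - a);
   }

   int main(void) {
     /* With the LambdaCDM defaults w_0 = -1 and w_a = 0, w(a) = -1 at all a. */
     printf("w(a=0.5) = %g\n", w_of_a(-1.0, 0.0, 0.5));
     return 0;
   }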
......@@ -179,13 +179,13 @@ use the following parameters:
w_0: -1.0 # (Optional)
w_a: 0. # (Optional)
When running a non-cosmological simulation (i.e. without the ``-c`` runtime
When running a non-cosmological simulation (i.e. without the ``-c`` run-time
flag) this section of the YAML file is entirely ignored.
Gravity
-------
The behaviour of the self-gravity solver can be modifed by the parameters
The behaviour of the self-gravity solver can be modified by the parameters
provided in the ``Gravity`` section. The theory document puts these parameters into the
context of the equations being solved. We give a brief overview here.
......@@ -206,7 +206,7 @@ The last tree-related parameter is
* The tree rebuild frequency: ``rebuild_frequency``.
Thqe tree rebuild frequency is an optional parameter defaulting to
The tree rebuild frequency is an optional parameter defaulting to
:math:`0.01`. It is used to trigger the re-construction of the tree every time a
fraction of the particles have been integrated (kicked) forward in time.
......@@ -219,12 +219,12 @@ Particle-Mesh part of the calculation. The last three are optional:
* The scale above which the short-range forces are assumed to be 0 (in units of
the mesh cell-size multiplied by :math:`a_{\rm smooth}`) :math:`r_{\rm
cut,max}`: ``r_cut_max`` (default: ``4.5``),
* The scale bewlo which the short-range forces are assumed to be exactly Newtonian (in units of
* The scale below which the short-range forces are assumed to be exactly Newtonian (in units of
the mesh cell-size multiplied by :math:`a_{\rm smooth}`) :math:`r_{\rm
cut,min}`: ``r_cut_min`` (default: ``0.1``),
For most runs, the default values can be used. Only the number of cells along
each axis needs to be sepcified. The remaining three values are best described
each axis needs to be specified. The remaining three values are best described
in the context of the full set of equations in the theory documents.
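As a small arithmetic illustration of how the two cut-off parameters translate into physical scales (both are expressed in units of the mesh cell-size multiplied by a_smooth), the sketch below uses a box size, mesh size and a_smooth value chosen purely for illustration; only the r_cut defaults come from the text above.

.. code-block:: c

   #include <stdio.h>

   int main(void) {
     const double box_size  = 100.0;  /* Mpc, assumed                     */
     const int    n_mesh    = 256;    /* cells per axis, assumed          */
     const double a_smooth  = 1.25;   /* mesh smoothing scale, assumed    */
     const double r_cut_max = 4.5;    /* default quoted in the text       */
     const double r_cut_min = 0.1;    /* default quoted in the text       */

     const double cell = box_size / n_mesh;
     /* Beyond this distance the short-range force is taken to be 0. */
     printf("short-range cut-off: %g Mpc\n", r_cut_max * a_smooth * cell);
     /* Below this distance the short-range force is exactly Newtonian. */
     printf("Newtonian below:     %g Mpc\n", r_cut_min * a_smooth * cell);
     return 0;
   }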
As a summary, here are the values used for the EAGLE :math:`100^3~{\rm Mpc}^3`
......@@ -308,7 +308,7 @@ Whilst for a cosmological run, one would need:
Initial Conditions
------------------
This ``IntialConditions`` section of the parameter file contains all the options related to
This ``InitialConditions`` section of the parameter file contains all the options related to
the initial conditions. The main two parameters are
* The name of the initial conditions file: ``file_name``,
......@@ -410,15 +410,15 @@ this mechanism is driven by the options in the ``Restarts`` section of the YAML
parameter file. All the parameters are optional but default to values that
ensure a reasonable behaviour.
* Wether or not to enable the dump of restart files: ``enable`` (default:
* Whether or not to enable the dump of restart files: ``enable`` (default:
``1``).
This parameter acts a master-switch for the check-pointing capabilities. All the
other options require the ``enable`` parameter to be set to ``1``.
* Wether or not to save a copy of the previous set of check-pointing files:
* Whether or not to save a copy of the previous set of check-pointing files:
``save`` (default: ``1``),
* Wether or not to dump a set of restart file on regular exit: ``onexit``
* Whether or not to dump a set of restart file on regular exit: ``onexit``
(default: ``0``),
* The wall-clock time in hours between two sets of restart files:
``delta_hours`` (default: ``6.0``).
......@@ -433,7 +433,7 @@ smaller value to allow for enough time to safely dump the check-point files.
If the directory does not exist, SWIFT will create it. When resuming a run,
SWIFT, will look for files with the name provided in the sub-directory specified
here. The files themselves are named ``basename_000001.rst`` where the basenme
here. The files themselves are named ``basename_000001.rst`` where the basename
is replaced by the user-specified name and the 6-digits number corresponds to
the MPI-rank. SWIFT writes one file per MPI rank. If the ``save`` option has
been activated, the previous set of restart files will be named
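A tiny sketch of the naming convention described above (user-specified basename, a 6-digit MPI rank, and the ``.rst`` extension); the helper below is illustrative, not SWIFT's own code.

.. code-block:: c

   #include <stdio.h>

   /* Build "basename_NNNNNN.rst" for a given MPI rank, as described above. */
   static void restart_file_name(char *buf, size_t len, const char *basename,
                                 int mpi_rank) {
     snprintf(buf, len, "%s_%06d.rst", basename, mpi_rank);
   }

   int main(void) {
     char name[256];
     restart_file_name(name, sizeof(name), "swift", 1);
     printf("%s\n", name); /* prints: swift_000001.rst */
     return 0;
   }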
......@@ -490,7 +490,7 @@ Scheduler
Domain Decomposition
--------------------
.. [#f1] The thorough reader (or overly keen SWIFT tester) would find that the speed of light is :math:`c=1.8026\times10^{12}\,\rm{fur}\,\rm{ftn}^{-1}`, Newton's contant becomes :math:`G_N=4.896735\times10^{-4}~\rm{fur}^3\,\rm{fir}^{-1}\,\rm{ftn}^{-2}` and Planck's constant turns into :math:`h=4.851453\times 10^{-34}~\rm{fur}^2\,\rm{fir}\,\rm{ftn}^{-1}`.
.. [#f1] The thorough reader (or overly keen SWIFT tester) would find that the speed of light is :math:`c=1.8026\times10^{12}\,\rm{fur}\,\rm{ftn}^{-1}`, Newton's constant becomes :math:`G_N=4.896735\times10^{-4}~\rm{fur}^3\,\rm{fir}^{-1}\,\rm{ftn}^{-2}` and Planck's constant turns into :math:`h=4.851453\times 10^{-34}~\rm{fur}^2\,\rm{fir}\,\rm{ftn}^{-1}`.
.. [#f2] which would translate into a constant :math:`G_N=1.5517771\times10^{-9}~cm^{3}\,g^{-1}\,s^{-2}` if expressed in the CGS system.
.. Parameter File
Loic Hausammann, 1 june 2018
Loic Hausammann, 1 June 2018
.. _Output_list_label:
......@@ -53,7 +53,7 @@ default all fields are written.
This field is mostly used to remove unnecessary output by listing them
with 0's. A classic use-case for this feature is a DM-only simulation
(pure n-body) where all particles have the same mass. Outputing the
(pure n-body) where all particles have the same mass. Outputting the
mass field in the snapshots results in extra i/o time and unnecessary
waste of disk space. The corresponding section of the ``yaml``
parameter file would look like this::
......
......@@ -67,7 +67,7 @@ wish to extract. The python snippet below should give you an idea of how to
go about doing this for the bound particles.
First, we need to extract the offset from the ``.catalog_group`` file, and
work out how many _bound_ partices are in our halo. We can do this by
work out how many _bound_ particles are in our halo. We can do this by
looking at the next offset. Then, we can ID match those with the snapshot
file, and get the mask for the _positions_ in the file that correspond
to our bound particles. (Note this requires ``numpy > 1.15.0``).
......@@ -97,7 +97,7 @@ to our bound particles. (Note this requires ``numpy > 1.15.0``).
# Again, we're done with that file.
particles_file.close()
# Now, the tricky bit. We need to create the correspondance between the
# Now, the tricky bit. We need to create the correspondence between the
# positions in the snapshot file, and the ids.
# Let's look for the dark matter particles in that halo.
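Independently of the Python snippet this page refers to, the offset logic itself is simple: the number of bound particles in halo i is the difference between consecutive offsets, and the IDs in that range are then matched back to entries of the snapshot. The toy fragment below (with made-up arrays standing in for the catalogue offsets, the bound-particle ID list and the snapshot IDs) is only meant to illustrate that bookkeeping.

.. code-block:: c

   #include <stdio.h>

   int main(void) {
     /* Toy stand-ins for the datasets read from the catalogue files. */
     const long long offsets[]  = {0, 3, 7};                  /* group offsets */
     const long long halo_ids[] = {11, 5, 42, 7, 8, 23, 3};   /* bound IDs     */
     const long long snap_ids[] = {3, 5, 7, 8, 11, 23, 42, 99}; /* snapshot IDs */
     const int n_snap = 8;

     const int halo = 0;                            /* look at the first halo */
     const long long start   = offsets[halo];
     const long long n_bound = offsets[halo + 1] - offsets[halo]; /* = 3 */
     printf("halo %d has %lld bound particles\n", halo, n_bound);

     /* Match each bound ID to its entry (index) in the snapshot. A linear
      * scan is enough for a toy example; sorted arrays / numpy are used in
      * practice for large files. */
     for (long long i = start; i < start + n_bound; ++i) {
       for (int j = 0; j < n_snap; ++j) {
         if (snap_ids[j] == halo_ids[i]) {
           printf("particle ID %lld is entry %d in the snapshot\n",
                  halo_ids[i], j);
           break;
         }
       }
     }
     return 0;
   }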
......@@ -166,7 +166,7 @@ Mean Density related:
:math:`\Delta=200` based on the mean density of the Universe
(:math:`M_{200}`).
+ ``R_200mean``: The :math:`R_{200}` radius of the halo based on the
mean density ofthe Universe.
mean density of the Universe.
Virial properties:
""""""""""""""""""
......@@ -190,7 +190,7 @@ properties.
+ ``M_gas``: The gas mass in the halo.
+ ``Mass_tot``: The total mass of the halo
+ ``M_gas_30kpc``: The gas mass within 30 kpc of the halo centre.
+ ``M_gas_500c``: The gas mass of the overdensity of 500 times the critical
+ ``M_gas_500c``: The gas mass of the over-density of 500 times the critical
density
+ ``M_gas_Rvmax``: The gas mass within the maximum rotation velocity.
......@@ -232,7 +232,7 @@ NFW profile properties:
+ ``VXc_gas``, ``VYc_gas`` and ``VZc_gas`` are the velocities of the gas in
the centre of the halo [#check]_.
Intertia Tensor properties:
Inertia Tensor properties:
"""""""""""""""""""""""""""
+ ``eig_ij``: Are the normalized eigenvectors of the inertia tensor.
......
......@@ -45,7 +45,7 @@ version of the code, so change ``SWIFTINTERFACE="on"`` to
Compiling VELOCIraptor
----------------------
Compoling goes completely different as compared to the on the fly halo finder
Compiling goes completely different as compared to the on the fly halo finder
configuration with SWIFT. In this case we can compile the code as::
make
......
......@@ -55,7 +55,7 @@ The VELOCIraptor algorithm consists basically of the following steps [#ref]_:
.. 1. The algorithm is mostly sensitive to substructures that are on the tail
of the Gaussian velocity density function, this means that VELOCIraptor
is most sensitive for subhalos which are cold (slow ratating) but have
is most sensitive for subhalos which are cold (slow rotating) but have
a large bulk velocity
......
......@@ -23,9 +23,9 @@ copyright = '2018, SWIFT Collaboration'
author = 'SWIFT Team'
# The short X.Y version
version = '0.7'
version = '0.8'
# The full version, including alpha/beta/rc tags
release = '0.7.0'
release = '0.8.0'
# -- General configuration ---------------------------------------------------
......
#!/bin/bash
# Run SWIFT
../swift -c -s -r -G -t 28 santa_barbara.yml
../swift -c -s -G -t 28 santa_barbara.yml
......@@ -3610,36 +3610,53 @@ void engine_makeproxies(struct engine *e) {
/* In the gravity case, check distances using the MAC. */
if (with_gravity) {
/* We don't have multipoles yet (or there CoMs) so we will have
to cook up something based on cell locations only. We hence
need an upper limit on the distance that the CoMs in those
cells could have. We then can decide whether we are too close
for an M2L interaction and hence require a proxy as this pair
of cells cannot rely on just an M2L calculation. */
/* Minimal distance between any two points in the cells */
const double min_dist_centres2 = cell_min_dist2_same_size(
&cells[cid], &cells[cjd], periodic, dim);
/* Let's now assume the CoMs will shift a bit */
const double min_dist_CoM =
sqrt(min_dist_centres2) - 2. * delta_CoM;
const double min_dist_CoM2 = min_dist_CoM * min_dist_CoM;
/* Are we beyond the distance where the truncated forces are 0
* but not too far such that M2L can be used? */
if (periodic) {
if ((min_dist_CoM2 < max_mesh_dist2) &&
(!gravity_M2L_accept(r_max, r_max, theta_crit2,
min_dist_CoM2)))
proxy_type |= (int)proxy_cell_type_gravity;
/* First just add the direct neighbours. Then look for
some further out if the opening angle demands it */
/* This is super-ugly but checks for direct neighbours */
/* with periodic BC */
if (((abs(i - iii) <= 1 || abs(i - iii - cdim[0]) <= 1 ||
abs(i - iii + cdim[0]) <= 1) &&
(abs(j - jjj) <= 1 || abs(j - jjj - cdim[1]) <= 1 ||
abs(j - jjj + cdim[1]) <= 1) &&
(abs(k - kkk) <= 1 || abs(k - kkk - cdim[2]) <= 1 ||
abs(k - kkk + cdim[2]) <= 1))) {
proxy_type |= (int)proxy_cell_type_gravity;
} else {
if (!gravity_M2L_accept(r_max, r_max, theta_crit2,
min_dist_CoM2))
proxy_type |= (int)proxy_cell_type_gravity;
/* We don't have multipoles yet (or there CoMs) so we will
have to cook up something based on cell locations only. We
hence need an upper limit on the distance that the CoMs in
those cells could have. We then can decide whether we are
too close for an M2L interaction and hence require a proxy
as this pair of cells cannot rely on just an M2L
calculation. */
/* Minimal distance between any two points in the cells */
const double min_dist_centres2 = cell_min_dist2_same_size(
&cells[cid], &cells[cjd], periodic, dim);
/* Let's now assume the CoMs will shift a bit */
const double min_dist_CoM =
sqrt(min_dist_centres2) - 2. * delta_CoM;
const double min_dist_CoM2 = min_dist_CoM * min_dist_CoM;
/* Are we beyond the distance where the truncated forces are 0
* but not too far such that M2L can be used? */
if (periodic) {
if ((min_dist_CoM2 < max_mesh_dist2) &&
(!gravity_M2L_accept(r_max, r_max, theta_crit2,
min_dist_CoM2)))
proxy_type |= (int)proxy_cell_type_gravity;
} else {
if (!gravity_M2L_accept(r_max, r_max, theta_crit2,
min_dist_CoM2))
proxy_type |= (int)proxy_cell_type_gravity;
}
}
}
......
......@@ -23,6 +23,7 @@
/* Includes. */
#include "common_io.h"
#include "dump.h"
#include "inline.h"
#include "timeline.h"
#include "units.h"
......@@ -73,53 +74,32 @@ struct engine;
*/
/* Some constants. */
enum logger_masks {
logger_mask_x = (1 << 0),
logger_mask_v = (1 << 1),
logger_mask_a = (1 << 2),
logger_mask_u = (1 << 3),
logger_mask_h = (1 << 4),
logger_mask_rho = (1 << 5),
logger_mask_consts = (1 << 6),
logger_mask_timestamp = (1 << 7),
enum logger_masks_number {
logger_x = 0,
logger_v = 1,
logger_a = 2,
logger_u = 3,
logger_h = 4,
logger_rho = 5,
logger_consts = 6,
logger_timestamp = 7, /* expect it to be before count */
logger_count_mask = 8, /* Need to be the last */
} __attribute__((packed));
struct mask_data {
/* Number of bytes for a mask */
int size;
/* Mask value */
unsigned int mask;
/* name of the mask */
char name[100];
};
extern const struct mask_data logger_mask_data[logger_count_mask];
/* Size of the strings. */
#define logger_string_length 200
/* parameters of the logger */
struct logger_parameters {
/* size of a label in bytes */
size_t label_size;
/* size of an offset in bytes */
size_t offset_size;
/* size of a mask in bytes */
size_t mask_size;
/* size of a number in bytes */
size_t number_size;
/* size of a data type in bytes */
size_t data_type_size;
/* number of different mask */
size_t number_mask;
/* value of each masks */
size_t *masks;
/* data size of each mask */
size_t *masks_data_size;
/* label of each mask */
char *masks_name;
/* Size of a chunk if every mask are activated */
size_t total_size;
};
/* structure containing global data */
struct logger {
/* Number of particle steps between dumping a chunk of data */
......@@ -128,8 +108,8 @@ struct logger {
/* Logger basename */
char base_name[logger_string_length];
/* File name of the dump file */
struct dump *dump;
/* Dump file */
struct dump dump;
/* timestamp offset for logger*/
size_t timestamp_offset;
......@@ -137,8 +117,8 @@ struct logger {
/* scaling factor when buffer is too small */
float buffer_scale;
/* logger parameters */
struct logger_parameters *params;
/* Size of a chunk if every mask are activated */
int max_chunk_size;
} SWIFT_STRUCT_ALIGN;
......@@ -151,18 +131,6 @@ struct logger_part_data {
size_t last_offset;
};
enum logger_datatype {
logger_data_int,
logger_data_float,
logger_data_double,
logger_data_char,
logger_data_longlong,
logger_data_bool,
logger_data_count /* should be last */
};
extern const unsigned int logger_datatype_size[];
/* Function prototypes. */
int logger_compute_chunk_size(unsigned int mask);
void logger_log_all(struct logger *log, const struct engine *e);
......@@ -183,9 +151,6 @@ int logger_read_gpart(struct gpart *p, size_t *offset, const char *buff);
int logger_read_timestamp(unsigned long long int *t, double *time,
size_t *offset, const char *buff);
void logger_parameters_init(struct logger_parameters *log_params);
void logger_parameters_clean(struct logger_parameters *log_params);
/**
* @brief Initialize the logger data for a particle.
*
......@@ -193,7 +158,7 @@ void logger_parameters_clean(struct logger_parameters *log_params);
*/
INLINE static void logger_part_data_init(struct logger_part_data *logger) {
logger->last_offset = 0;
logger->steps_since_last_output = SHRT_MAX;
logger->steps_since_last_output = INT_MAX;
}
/**
......
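The header change above replaces the old bit-valued ``logger_mask_*`` constants with index-valued enum entries plus a ``logger_mask_data`` table. A compressed, self-contained sketch of how the two styles relate is given below; the table contents here are invented for illustration, the real one lives in the logger source.

.. code-block:: c

   #include <stdio.h>

   /* Trimmed copies of the declarations from the diff above; only two
    * entries are kept and the mask values are invented. */
   enum logger_masks_number { logger_x = 0, logger_v = 1 };

   struct mask_data {
     int size;          /* number of bytes written for this field */
     unsigned int mask; /* bit used in the per-record mask        */
     char name[100];    /* human-readable label                   */
   };

   static const struct mask_data logger_mask_data[] = {
     {3 * sizeof(double), 1u << logger_x, "positions"},
     {3 * sizeof(float),  1u << logger_v, "velocities"},
   };

   int main(void) {
     /* Fields are now selected by OR-ing table entries, as runner_do_logger
      * does in the next hunk, instead of OR-ing the old enum bit values. */
     const unsigned int mask =
         logger_mask_data[logger_x].mask | logger_mask_data[logger_v].mask;
     printf("combined mask = 0x%x\n", mask);
     return 0;
   }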
......@@ -2985,9 +2985,13 @@ void runner_do_logger(struct runner *r, struct cell *c, int timer) {
/* Write particle */
/* Currently writing everything, should adapt it through time */
logger_log_part(e->logger, p,
logger_mask_x | logger_mask_v | logger_mask_a |
logger_mask_u | logger_mask_h | logger_mask_rho |
logger_mask_consts,
logger_mask_data[logger_x].mask |
logger_mask_data[logger_v].mask |
logger_mask_data[logger_a].mask |
logger_mask_data[logger_u].mask |
logger_mask_data[logger_h].mask |
logger_mask_data[logger_rho].mask |
logger_mask_data[logger_consts].mask,
&xp->logger_data.last_offset);
/* Set counter back to zero */
......
......@@ -1067,13 +1067,13 @@ void DOPAIR1_BRANCH_STARS(struct runner *r, struct cell *ci, struct cell *cj) {
#ifdef SWIFT_DEBUG_CHECKS
if (do_ci) {
RUNNER_CHECK_SORT_STARS(hydro, part, cj, ci, sid);
RUNNER_CHECK_SORT_STARS(stars, spart, ci, cj, sid);
RUNNER_CHECK_SORT(hydro, part, cj, ci, sid);
RUNNER_CHECK_SORT(stars, spart, ci, cj, sid);
}
if (do_cj) {
RUNNER_CHECK_SORT_STARS(hydro, part, ci, cj, sid);
RUNNER_CHECK_SORT_STARS(stars, spart, cj, ci, sid);
RUNNER_CHECK_SORT(hydro, part, ci, cj, sid);
RUNNER_CHECK_SORT(stars, spart, cj, ci, sid);
}
#endif /* SWIFT_DEBUG_CHECKS */
......