# Configuring SWIFT
SWIFT uses GNU [`autotools`][1] to configure compile-time options. To generate the configure script, you will need to run the `autogen` script, as follows:
```bash
./autogen.sh
```
There is a huge list of configuration options, which are documented online [here][2] as well as in the configuration script itself, with `./configure --help`. To configure the most basic version of SWIFT, which uses all of the defaults, you can simply write:
```bash
./configure
```
which, as of December 2018, will give you:
+ Gadget-2 compatible SPH (i.e. Density-Entropy with a Balsara switch and a constant artificial viscosity)
+ A cubic spline kernel
+ 4th order FMM gravity solver
+ No external potential
+ No star formation
+ No cooling model
+ No feedback model
This configuration script should find all of the libraries, compilers, and flags that you require to build the code.
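If `configure` does not pick up the compiler or libraries that you want, or if you would like to change one of the defaults listed above, you can pass the relevant options explicitly. The options below are a minimal sketch with placeholder paths; check `./configure --help` for the exact option names available in your version:
```bash
# Illustrative only -- see ./configure --help for the exact option names.
# Use a specific compiler and point configure at a particular HDF5 install:
CC=icc ./configure --with-hdf5=/path/to/hdf5/bin/h5cc
# Swap out one of the defaults, for example the SPH kernel:
./configure --with-kernel=wendland-C2
```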
# Building SWIFT
Now that we have configured the code, we will have to build it. We can do this with a call to `make`, and build in parallel with the `-j` option,
```bash
make -j 8
```
which will build SWIFT with 8 threads in parallel (this should be quite fast).
Building SWIFT will create _two_ binaries in the usual case, `examples/swift` (the non-MPI version, use this if you are only running on one node), and `examples/swift_mpi` (required to use MPI). In the following, we will only consider the non-MPI version, as the examples that you will be running are quite small.
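A quick sanity check that the build succeeded is to confirm that the binaries exist and ask for the help text (paths are relative to the top of the `swiftsim` directory; `swift_mpi` will only be present if you built with MPI):
```bash
# Check that the build produced the binaries:
ls -l examples/swift examples/swift_mpi
# Print the list of command-line options (also useful later on):
./examples/swift -h
```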
[1]: https://www.gnu.org/software/automake/manual/html_node/Autotools-Introduction.html
[2]: http://swift.dur.ac.uk/docs/GettingStarted/configuration_options.html
# Introduction
SWIFT (**S**PH **W**ith **I**nter-dependent **F**ine-grained **T**asking) is a modern, extensible particle-based cosmological simulation code. SWIFT solves the classic combination of hydrodynamics, gravity, and cosmology to help us study the formation and evolution of the universe.
SWIFT is a hybrid MPI + threads code (note that it does _not_ use OpenMP), meaning that you will only need one MPI rank _per node_ rather than the traditional MPI rank _per core_. We’ll come back to this later, but it is important to keep in mind. Also, if you are compiling on a laptop, you can disable MPI (it is not actually required to build SWIFT; use `--disable-mpi` at configure time), meaning that you don’t need an MPI library on your laptop.
There is a complete set of documentation available on our website, [swiftsim.com][1], where you can also download the code. This introductory guide will just take you through the basics of configuring, building, and running the code.
Let’s start by cloning the SWIFT repository, and moving into it:
```bash
git clone https://gitlab.cosma.dur.ac.uk/swift/swiftsim.git
cd swiftsim
```
[1]: http://www.swiftsim.com
# Running a More Interesting Example
The shock tube is fun, but not a particularly revealing test - it only uses hydrodynamics. Let’s look at a more involved example, a small cosmological volume.
## A Small Cosmological Volume
The small cosmological volume is a cosmological simulation of a box 142 Mpc on a side containing 262144 dark matter particles. To get this example running, first go back to the `examples` folder:
```bash
cd ..
```
Now go to the folder of the small cosmological volume example, which only uses dark matter:
```bash
cd SmallCosmoVolume_DM
```
To run the example, we first have to fetch the initial conditions. To do this, run the `getIC.sh` script:
```bash
./getIC.sh
```
To run the simulation with both cosmology and self-gravity invoke SWIFT as follows:
```bash
../swift -c -G -t 8 small_cosmo_volume.yml
```
* `-c` means "run a cosmological simulation"
* `-G` means "run with self-gravity"
And the same as the previous example:
* `-t 8` means "run with 8 threads"
* `small_cosmo_volume.yml` is the parameter file that contains units, ICs, etc.
Running this example might take somewhere between 20 and 40 minutes on a laptop. If you would like to run this through a batch system, please see the next chapter.
Unlike Gadget, SWIFT doesn't include ‘little h’ in the input and output units. In this example we use ICs that in fact use ‘little h’; in the parameter file (through `InitialConditions:cleanup_h_factors`) it is possible to specify that the ICs were originally generated for Gadget. This way the ICs are converted at startup and the output files will use the SWIFT convention.
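If you want to see how this is set for this example, one quick way (a minimal sketch; the exact keys are documented in `examples/parameter_example.yml`) is to print the `InitialConditions` block of the parameter file:
```bash
# Show the InitialConditions block, which contains the h-factor clean-up flag:
grep -A 6 "InitialConditions" small_cosmo_volume.yml
```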
Once the simulation has completed, you can view the snapshots with a number of packages, such as `yt` or `py-sphviewer` in `python`, or Gadget Viewer.
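If you just want a quick look at what a snapshot contains without installing a visualisation package, the HDF5 command-line tools (if available on your system) can list the file structure. The snapshot name below is only an example; the actual name is set in the parameter file:
```bash
# List the groups and datasets in a snapshot (example file name):
h5ls -r snap_0003.hdf5 | head -n 20
```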
## A Small Cosmological Volume with SPH
Our previous small cosmological volume simulation only contained dark matter. To make this simulation more interesting, let’s add baryons! For this we can go to the folder of the SPH version of the small cosmological volume:
```bash
cd ../SmallCosmoVolume
```
Again we have to download the ICs:
```bash
./getIC.sh
```
If this is too slow, you can also copy the ICs from the previous example into the current folder. Similarly to the dark-matter-only case, the small cosmological volume with gas can be run as:
```bash
../swift -c -s -G -t 8 small_cosmo_volume.yml
```
* `-s` means "run with SPH"
Because the initial conditions are dark-matter-only Gadget initial conditions, the code will first clean up the h-factors, and after that it will generate gas and dark matter particles from the IC file, keeping the centre of mass at the original position in the IC file. This is specified in the parameter file with `InitialConditions:generate_gas_in_ics`.
Running the example will take slightly longer than the dark matter only simulation.
With more physics comes more parameters. Inspect for yourself which additional parameters are added to the parameter file for a hydro run to get a feeling of how it works.
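One quick way to do this, assuming you still have the dark-matter-only example checked out alongside this one, is to compare the two parameter files directly:
```bash
# Compare the hydro parameter file with the dark-matter-only one from the
# previous example to see which parameters a hydro run adds:
diff ../SmallCosmoVolume_DM/small_cosmo_volume.yml small_cosmo_volume.yml
```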
Now that we have baryons, we can look at the temperature evolution of the volume. There is a plotting script in the directory that can be invoked as follows:
```bash
python plotTempEvolution.py
```
or
```bash
python3 plotTempEvolution.py
```
depending on how your Python installation is configured.
## A Small Cosmological Volume with SPH and cooling
So far the temperature of the gas in our small cosmological simulation has only been allowed to increase through shock heating. The next logical step is to add cooling to the simulation. To do this we need to reconfigure and recompile the code, using the simplest cooling model, constant-lambda cooling. Go back to the `swiftsim` directory and type:
```bash
./configure --with-cooling=const-lambda
```
After this, we again need to build the code:
```bash
make -j 8
```
To run the small cosmological simulation with cooling, go to the following example folder:
```bash
cd examples/SmallCosmoVolume_cooling
```
To run the example, first download the initial conditions again:
```bash
./getIC.sh
```
Or copy them from the previous small cosmological volume example. To run with cooling:
```bash
../swift -c -s -G -C -t 8 small_cosmo_volume.yml
```
* `-C` means "run with cooling"
Compared to the previous example, the parameter file is almost identical, with the addition of a field that specifies the strength of the cooling.
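To see what that field looks like, you can print the cooling-related entries of the parameter file (the exact section name depends on the cooling model; `examples/parameter_example.yml` lists them all):
```bash
# Show the cooling-related entries in the parameter file:
grep -i -A 4 "cooling" small_cosmo_volume.yml
```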
Because we have added more physics, and we now get colder, denser gas, running the code might take slightly longer.
Take another look at the temperature evolution of the box:
```bash
python plotTempEvolution.py
```
Compare this with the model without cooling.
## A Small Cosmological Volume with SPH, Cooling and VELOCIraptor
So far we have run the code with SPH and cooling. We can now include an on-the-fly halo finder (not just a 3D FoF, but a full halo catalogue) with VELOCIraptor. VELOCIraptor can be run on the fly between snapshots, which saves reading time and allows for faster analysis of the simulation outputs. This section of the hack session is only available if you are running on COSMA, where we have a precompiled VELOCIraptor library stored.
Using on-the-fly halo finding means we need to add additional configuration options, so to configure SWIFT in the `swiftsim` directory we need to use:
```bash
./configure --with-cooling=const-lambda --with-velociraptor=PATH
```
This will set up SWIFT such that after each snapshot it passes the data to VELOCIraptor.
Build the code by using:
```bash
make -j 8
```
Go into the directory of the previous example; to run, use:
```bash
export OMP_NUM_THREADS=16
../swift -c -s -G -x -t 16 small_cosmo_volume.yml
```
+ The first line sets VELOCIraptor to use 16 threads in parallel
+ `-x` means "run with the structure finder", i.e. activate VELOCIraptor
After the simulation has produced a snapshot, you will have the first halo-finder results, which are given in 6 files. Documentation describing the contents of these files can be found on our website.
# Running in a Batch System
In this section we’ll look at running SWIFT in a batch system. This is useful when you have a long-running job, or one that you need MPI for. Batch systems also allow resources to be shared fairly between many users, and stop everyone from attempting to run on the same node at once, which would cause a crash.
We’ll only consider the SLURM batch system here, but the ideas should be transferable elsewhere. Let’s use the `SmallCosmoVolume` example from above. Change to that directory:
```bash
cd examples/SmallCosmoVolume
```
To submit to a batch system, you will need a ‘batch script’. This is a little shell script with some comments at the top that tell the batch system some information about your job:
```bash
#!/bin/bash
#SBATCH -J JobName
#SBATCH -N NumberOfNodes
#SBATCH -o out_%j.out
#SBATCH -e err_%j.err
#SBATCH -p Partition
#SBATCH -A Group
#SBATCH --exclusive
#SBATCH -t hh:mm:ss
# Now we can run the actual program
../swift -c -s -G -t 16 small_cosmo_volume.yml
```
Lines that start with `#SBATCH` are picked up by the batch system. The reason these are comments is that, if you were to run the shell script directly (e.g. `bash my_submission.slurm`), it would still work without errors arising from the extra batch-system syntax.
For the above batch script, the parameters define the following:
+ `-J`: the job name; this can be anything. This is useful for differentiating your job.
+ `-N`: the number of nodes that you wish to request from the batch system.
+ `-o` and `-e`: the names of your output (stdout) and error (stderr) files respectively. `%j` returns the unique job ID.
+ `-p`: the partition you wish to run on. To see the available partitions you can type `sinfo` or contact your system manager.
+ `-A`: the group you would like to submit the job on behalf of. Contact your system manager for more information.
+ `--exclusive`: this requests exclusive access to the nodes that you are running on.
+ `-t`: the estimated run time of your job. It is important to set this as low as possible (but not too low, as your job will be killed when the limit is reached) to ensure that your job is picked up by the queue system as quickly as possible.
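For concreteness, here is what a filled-in version of the script above might look like. The job name is arbitrary, and the partition, account, and runtime are placeholders that you should replace with values appropriate for your system:
```bash
#!/bin/bash
#SBATCH -J SmallCosmoVolume
#SBATCH -N 1
#SBATCH -o out_%j.out
#SBATCH -e err_%j.err
#SBATCH -p my_partition
#SBATCH -A my_account
#SBATCH --exclusive
#SBATCH -t 01:00:00

# Run the non-MPI binary with 16 threads on the single node requested above
../swift -c -s -G -t 16 small_cosmo_volume.yml
```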
## Submitting your jobs
To submit your job, save your batch script to a file (for example `submit.slurm`). Then, to submit it, use the `sbatch` command:
```bash
sbatch submit.slurm
```
Then to view your currently running/pending jobs, you can use the `squeue` command,
```bash
squeue -u YourUsername
```
There are many more parameters that can be passed to `squeue`; use `man squeue` to find out more.
### Getting an estimated start time
You can find out much more information about your job and the compute nodes that it is running on by using the `scontrol` command. In the output of `squeue`, you will have seen the unique ID of your job (`JOBID`). To view information about that job, use
```bash
scontrol show job JOBID
```
This will give you a large list of information including, should your job not yet be running, an estimated start time (your job will always start on or before this time, depending on how well other users have estimated the runtimes of their jobs).
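The output of `scontrol` is quite verbose; if you only care about the start time, you can filter it, for example:
```bash
# Pull out just the expected start time from the scontrol output
scontrol show job JOBID | grep StartTime
```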
### Cancelling a job
To cancel a job you will need to use the `scancel` command and the `JOBID` from before,
```bash
scancel JOBID
```
## Running over MPI
To run over MPI you will need to request more than one node, and run the MPI version of the SWIFT binary. The batch script below shows how to run the `EAGLE_50` test over 4 nodes, with one MPI rank per node on a system with 28 physical cores per node (the COSMA-7 system in Durham).
```bash
#!/bin/bash
#SBATCH -N 4
#SBATCH -o outfile-%j.o
#SBATCH -e errfile-%j.e
#SBATCH --exclusive
#SBATCH --tasks-per-node=1
#SBATCH --cpus-per-task=28
#SBATCH -J SWIFT_EAGLE_50
#SBATCH -p partition
#SBATCH -A group
#SBATCH -t 1:00:00
mpirun -np 4 ../swift_mpi -c -s -S -G -n 4096 -t 28 eagle_50.yml
```
+ `#SBATCH --tasks-per-node` gives the number of MPI ranks per node
+ `#SBATCH --cpus-per-task` gives the number of threads to activate per MPI rank
+ `-np 4` states to use 4 MPI ranks
+ `-S` means to run SWIFT with stars
+ `-n 4096` means to run SWIFT for a fixed number of (4096) steps
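The important thing is that the rank and node counts stay consistent: one rank per node, with the thread count matching the number of physical cores on a node. As a sketch, the same job reduced to 2 nodes on the same 28-core machines would only change these lines:
```bash
#SBATCH -N 2
# ...the other #SBATCH lines stay the same...
mpirun -np 2 ../swift_mpi -c -s -S -G -n 4096 -t 28 eagle_50.yml
```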
# Running your First Example
For your first example, it’s worth running a simple, hydro-only case. We’ll run the classic Sod Shock Tube, in 3D, using the binary we just made. SWIFT comes with a huge number of examples in the repository, so it’s worth exploring them when you get the chance. For now,
```bash
cd examples/SodShock_3D
```
We’ll need to generate the initial conditions, which require a glass:
```bash
./getGlass.sh
python makeIC.py
```
To run the example (which will take a minute or so, on a laptop) you can use the shell script that’s included,
```bash
./run.sh
```
Let’s take a closer look at the line in there that actually runs SWIFT. SWIFT has a lot of command-line arguments that you will need to understand if you wish to run production simulations.
Here we have
```bash
../swift -s -t 4 sodShock.yml
```
+ `-s` means “run with SPH”
+ `-t 4` means “run with 4 threads”
+ `sodShock.yml` is the parameter file that contains things like units, the CFL parameter for the hydrodynamics, where the initial conditions file is, etc.
You can view all of the command-line arguments by invoking the `swift` binary with `swift -h`. There is also extra documentation available [online][1]. For more information about the content of the parameter files, which are standard [`yaml`][2] files, you can either look at `examples/parameter_example.yml`, which lists all available parameters, or see the [online documentation][3].
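For example, still sitting in the `SodShock_3D` directory, you could run (paths relative to that directory):
```bash
# Print the full list of command-line options:
../swift -h
# Browse the example parameter file that documents every available parameter:
less ../parameter_example.yml
```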
Now that we have the output of the simulation, we can use the included script to plot it. Run
```bash
python plotSolution.py 1
```
which will plot the solution and output `sodshock.png` which gives an overview of the hydrodynamical properties in the tube.
[1]: http://swift.dur.ac.uk/docs/CommandLineOptions/index.html
[2]: http://yaml.org/spec/
[3]: http://swift.dur.ac.uk/docs/ParameterFiles/index.html
# An Introduction To SWIFT
## Josh Borrow
## Folkert Nobels
### SWIFT v0.8.0
#### 13th November 2018