Commit 9f870f96 authored by Josh Borrow's avatar Josh Borrow

Merge branch 'planetary_bits' into 'master'

Planetary bits

See merge request !16
parents a20d6ae2 5992f127
@@ -56,6 +56,13 @@ people: [
role: PhD Student (astronomy),
affil: "ICC, Durham University",
},
+{
+name: Mr. Jacob Kegerreis,
+role: PhD Student (astronomy),
+affil: "ICC, Durham University",
+expertise: "Planetary giant impacts",
+href: "https://www.dur.ac.uk/physics/staff/profiles/?username=cklv53"
+},
{
name: Mr. Joshua Borrow,
role: PhD Student (astronomy),
@@ -14,7 +14,7 @@ required MPI communications. The core calculations in SWIFT use
hand-written SIMD intrinsics to process multiple particles in parallel
and achieve maximal performance.
-## Strong- and weak-scaling
+## Strong and weak scaling
Cosmological simulations are typically very hard to scale to large
numbers of cores, due to the fact that information is needed from each
@@ -23,7 +23,7 @@ decomposition, vectorisation, and asynchronous communication to
provide a 36.7x speedup over the de facto standard (the publicly
available GADGET-2 code) and near-perfect weak scaling even on
problems larger than presented in the published astrophysics
-literature
+literature.
![SWIFT Scaling Plot](scalingplot.png)

The left panel ("Weak Scaling")
shows how the run-time of a problem changes when the number of threads
@@ -31,7 +31,7 @@ is increased proportionally to the number of particles in the system
(i.e. a fixed 'load per thread'). The right panel ("Strong Scaling")
shows how the run-time changes for a fixed load as it is spread over
more threads. The right panel shows the 36.7x speedup that SWIFT
-offers over GADGET-2. This uses a representative problem (a snapshot
+offers over GADGET-2. This uses a representative problem -- a snapshot
of the [EAGLE](http://adsabs.harvard.edu/abs/2014ApJS..210...14K)
simulation at late time where the hierarchy of time-steps is very deep
and where most other codes struggle to harvest any scaling or performance.
@@ -42,4 +42,4 @@ and where most other codes struggle to harvest any scaling or performance.
SWIFT uses the parallel-hdf5 library to read and write snapshots
efficiently on distributed file systems. Through careful tuning of
Lustre parameters, SWIFT can write snapshots at the maximal disk
writing speed of a given system.
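The Lustre tuning mentioned above typically amounts to striping the snapshot directory across many storage targets so that all ranks can write concurrently. The stripe count, stripe size, and path below are illustrative only, not values recommended by the SWIFT team:

```shell
# Stripe new files in the snapshot directory across 16 OSTs,
# with a 4 MiB stripe size, so many writers can proceed in parallel.
lfs setstripe -c 16 -S 4m /path/to/snapshots

# Inspect the resulting layout.
lfs getstripe /path/to/snapshots
```

The right numbers depend on the file system: the stripe count is usually matched to the number of writers, and the stripe size to the I/O chunk size.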
@@ -12,43 +12,49 @@ their surroundings. This turns out to be quite a complicated problem
as we can't build computers large enough to simulate everything down
to the level of individual atoms. This implies that we need to
re-think the equations that describe the matter components and how
-they interact with each-others. In practice, we must solve the
+they interact with each other. In practice, we must solve the
equations that describe these problems numerically, which requires a
lot of computing power and fast computer code.
-We use SWIFT to run simulations of Astrophysical objects, such as
-galaxies or even the whole Universe. We do this to test theories
-about what the Universe is made of and evolved from the Big Bang up to
+We use SWIFT to run simulations of astrophysical objects, such as
+planets, galaxies, or even the whole universe. We do this to test theories
+about what the universe is made of and how it evolved from the Big Bang up to
the present day!
<div class="videowrapper"><iframe width="100%" height="100%" src="https://www.youtube.com/embed/UJYV6f8SwdQ" frameborder="0" allowfullscreen></iframe>
</div>
Dwarf galaxies orbiting a galaxy similar to our own Milky Way.
+<div class="videowrapper"><iframe width="100%" height="100%" src="http://icc.dur.ac.uk/giant_impacts/uranus_1e8_anim.mp4" frameborder="0" allowfullscreen></iframe>
+</div>
+A giant impact onto the young planet Uranus.
## Why create SWIFT?
-We created SWIFT for a number of reasons. The primary reason beings
-that we want to be able to simulate a whole Universe! This has been
+We created SWIFT for a number of reasons. The primary reason being
+that we want to be able to simulate a whole universe! This has been
done before successfully (see [the EAGLE
-Project](http://icc.dur.ac.uk/Eagle) for more details), but this
+Project](http://icc.dur.ac.uk/Eagle) for more details), but that
simulation used software which is not tailored for the newest
-super-computers and took almost 50 days on a very large computer to
+supercomputers and took almost 50 days on a very large computer to
complete. SWIFT aims to remedy that by choosing to parallelise the
problem in a different way, by using better algorithms and by having a
more modular structure than other codes, making it easier for users to
pick and choose what physical models they want to include in their
-simulations.
+simulations. This also lets us study very different topics, such as
+the giant impacts of planets colliding in the early solar system.
-The way that super-computers are built is not by having one huge
+The way that supercomputers are built is not by having one huge
super-fast 'computer', but rather by having lots of regular computers
-(only a tiny bit better than what is available at home!) that are
-connected together by high-speed networks. Therefore the way to speed
+(only a tiny bit better than what is available at home!) that are
+connected together by high-speed networks. Therefore, the way to speed
up your code might not necessarily be to make it 'run faster' on a
single machine, but rather to enable those machines to talk to each other
in a more efficient way. This is how SWIFT is different from other
-codes that are used in Astrophysics for a similar purpose: the focus
+codes that are used in astrophysics for a similar purpose: the focus
is on distributing the work to be done (the equations to be solved) in
the best possible way across all the small computers that are part of
-a super-computer.
+a supercomputer.
Traditionally, you have each 'node' (computer) in the 'cluster'
(supercomputer) running the exact same code at the exact same time,
@@ -58,11 +64,11 @@ each node working on different tasks than others as and when those
tasks need to be completed. SWIFT also makes the nodes communicate
with each other all the time and not only at fixed points, allowing
for much more flexibility. This cuts down on the time when a node is
-sitting and waiting for work, which is just wasted time, electricity
+sitting and waiting for work, which is just wasted time, electricity,
and ultimately money!
-One other computer technology that occured in the last decade is the
-appearance of so-called vector-instructions. These allow one given
+One other computer technology to appear in the last decade is
+so-called vector instructions. These allow one given
computing core to process not just one number at a time (as in the
past) but up to 16 (or even more on some machines!) in parallel. This
means that a given compute core can solve the equations for 16 stars
@@ -76,7 +82,7 @@ solve the same equations as other software in significantly less time!
Smoothed Particle Hydrodynamics (SPH) is a numerical method for
approximating the forces between fluid elements (gas or
liquids). Let's say that we want to simulate some water and a wave
within it. Even a single liter of water has
100000000000000000000000000 particles in it. To store that much data
we would require a computer that has 100 trillion times as much storage
space as *all of the data on the internet*. It's clear that we need a
@@ -87,9 +93,9 @@ It turns out that we can represent the water by many fewer particles
if we can smooth over the gaps between them efficiently. Smoothed
Particle Hydrodynamics is the technique that we use to do that.
-SPH was originally developped to solve problems in astrophysics but is
+SPH was originally developed to solve problems in astrophysics but is
now a popular tool in industry with applications that affect our
everyday life. Turbines are modelled with this technique to understand
how to harvest as much energy as possible from the wind. The method is
also used to understand how waves and tsunamis affect the shores,
-allowing scientists to design effective defence for the population.
+allowing scientists to design effective defences for the population.