The digital universe that lives inside a supercomputer
When you’re trying to model the evolution of the universe over billions of years, there’s only so much you can do with brainpower and a pencil.

Luckily, the researchers at the Institute for Computational Cosmology at Durham University in the UK have a 25 teraflops supercomputer at hand.

Professor Carlos Frenk, director of the Institute for Computational Cosmology at Durham University, said: “For the first time in the history of humanity we have the ability to build virtual universes in a computer.”

To build a virtual universe, the institute programs the computer with the fundamental laws of physics, feeds it data representing the matter and energy that made up the early universe, and examines the digital galaxies the supercomputer spits out.
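The recipe described above – encode the laws of physics, feed in initial conditions, and let the machine evolve them forward – can be illustrated with a toy N-body gravity sketch. This is a minimal illustration only, not the institute's code; the function name, integration scheme (leapfrog with Plummer softening) and all parameter values are illustrative assumptions:

```python
import numpy as np

def evolve(pos, vel, mass, dt, steps, G=1.0, soft=0.05):
    """Toy N-body integrator: advance particles under Newtonian
    gravity using a leapfrog scheme with Plummer softening.
    (Illustrative sketch; real cosmological codes are far more
    sophisticated.)"""
    def accel(p):
        # Pairwise separations: diff[i, j] = p[j] - p[i]
        diff = p[None, :, :] - p[:, None, :]
        dist2 = (diff ** 2).sum(-1) + soft ** 2   # softened distance^2
        inv_d3 = dist2 ** -1.5
        np.fill_diagonal(inv_d3, 0.0)             # no self-force
        # a_i = G * sum_j m_j * (p_j - p_i) / |p_j - p_i|^3
        return G * (diff * inv_d3[..., None] * mass[None, :, None]).sum(1)

    vel = vel + 0.5 * dt * accel(pos)             # initial half kick
    for _ in range(steps):
        pos = pos + dt * vel                      # drift
        vel = vel + dt * accel(pos)               # kick
    return pos, vel
```

Because the pairwise forces are equal and opposite, the sketch conserves total momentum to floating-point precision, which is a quick sanity check on any integrator of this kind.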
The evolution of the universe is simulated from 300,000 years after the Big Bang – when the universe was practically a newborn in cosmological terms – through to today.

The models help the institute learn about the processes that shape galaxies, such as the influence of heating and cooling, and how supernova explosions influence star formation. They also simulate the influence of dark matter – matter that is invisible save for its pull on the visible matter around it.

This is one of the institute’s models of the swathe of stars, tens of thousands of light years across, enveloping a Milky Way-like galaxy.

Photo: A.P. Cooper and J. Helly / Virgo Consortium
The institute has been building digital universes inside supercomputers for a decade.

Its current supercomputer, known as the Cosmology Machine or COSMA4, has about 3,000 processor cores and more than 13TB of RAM. It is seven times faster than the institute’s previous supercomputer, COSMA3, and was built by high-performance computing specialist OCF and IBM.

Storage presents a headache for the organisation – unsurprising, perhaps, given that a computer model of the effect of dark matter on galaxy formation can produce 100TB of data in a single run. Having filled up two thirds of the 612TB of storage that came with COSMA4, the institute recently upgraded to 1.1PB. This shot shows installed racks of IBM System x iDataPlex servers.

Overall, COSMA4 relies on 220 IBM iDataPlex dx360 M3 servers, with a total of 2,640 2.67GHz Intel X5650 cores and 13.2TB of RAM. It also uses two IBM x3850 machines, each with 32 2GHz X7550 cores and 512GB of RAM.

Photo: Durham University
When studying something as vast as the universe, computer models provide one of the few ways that researchers can test their theories about the origins of the cosmos.

Frenk said: “Unlike other sciences it is very difficult to ‘test’ theories on the Universe. Brain power alone is not enough to calculate the complex algorithms.

“Computing power today is changing the way in which we do science. In my own discipline, which is cosmology and astrophysics, we are completely reliant on computing, which has opened up completely new methods of looking at the universe.”

This is a shot of IBM System x iDataPlex server cabling.

Photo: Durham University
“The level of detail in the models we use is continually increasing, so there’s a perpetual need for more processor power and data storage capacity,” said Dr Lydia Heck, senior computer manager at Durham University.

“Just one of our new research projects will consume the whole processing power of COSMA4 for half of a year. 288 extra cores are very welcome because we have other smaller projects, which can be serviced alongside the major research.”

The size and complexity of the galaxies that can be simulated by the institute are limited by the memory of the current generation of 64-bit supercomputers, and the time it takes them to process data.
To work within these limitations, computer models are simplified by sacrificing some of the detail of what is being modelled – for instance, by modelling the formation of millions of stars at a time rather than trying to model each of the hundreds of billions of stars within a galaxy.

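That kind of coarse-graining can be sketched in a few lines: rather than tracking individual stars, convert a fraction of each sufficiently massive gas cell into a single “star particle” standing in for millions of stars. The function name, threshold and efficiency values below are hypothetical placeholders, not parameters from the institute’s models:

```python
def form_star_particles(gas_cells, threshold=1e5, efficiency=0.1):
    """Coarse-grained star formation (illustrative sketch).

    gas_cells: list of gas masses (in solar masses), one per cell.
    Any cell at or above `threshold` converts a fraction `efficiency`
    of its gas into one star particle representing many stars.
    Returns (star_particles, remaining_gas)."""
    star_particles = []
    remaining_gas = []
    for mass in gas_cells:
        if mass >= threshold:              # dense enough to form stars
            formed = efficiency * mass     # one particle, many stars
            star_particles.append(formed)
            remaining_gas.append(mass - formed)
        else:
            remaining_gas.append(mass)
    return star_particles, remaining_gas
```

Note that total mass is conserved: whatever leaves the gas reservoir reappears as star particles, which is the bookkeeping such simplified schemes must respect.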
To refine the models, researchers use a process of elimination, comparing the characteristics of the galaxies produced by the computer model with observations of galaxies taken by astronomers, to see whether the model is a good fit for real life.

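That process-of-elimination loop can be sketched as scoring each candidate model against an observed distribution – say, galaxy counts per luminosity bin – and keeping the best fit. A chi-squared score is used here purely for illustration; the function names and numbers are assumptions, not the institute’s actual comparison pipeline:

```python
def chi_squared(model, observed, sigma):
    """Weighted squared mismatch between a simulated distribution of
    galaxy properties (e.g. counts per luminosity bin) and the
    distribution astronomers actually observe."""
    return sum((m - o) ** 2 / s ** 2
               for m, o, s in zip(model, observed, sigma))

def best_fit_model(candidates, observed, sigma):
    """Process of elimination: score every candidate model against the
    observations and keep the one that fits best.

    candidates: list of (name, predicted_counts) pairs."""
    return min(candidates,
               key=lambda item: chi_squared(item[1], observed, sigma))[0]
```

For example, given observed bin counts and two candidate models, the one with the lower score survives the elimination round.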
This shot shows COSMA4’s storage in a rack.

Photo: Durham University
A computer model showing the structure of the universe one billion years after the Big Bang: the green swirls represent dark matter and the circles represent growing galaxies.

The institute is focusing on two main projects: one to model galaxy formation from a time soon after the Big Bang until the present day, and the other to model the gas outside of galaxies, particularly in clusters and groups of galaxies.

The ultimate goal of this modelling is to produce galaxies whose characteristics match measurements of real-life galaxies taken by astronomers, and then to use those models to help understand the processes that are important in producing these galaxies.

The simulations produced by the institute have modelled certain structures in the universe with a high degree of accuracy – such as the strings of diffuse matter that exist between galaxies – which researchers say indicates that their models of galaxy formation are on the right track.

Photo: Sarah Noble and Vicky Greener, department of physics, Durham University