In the largest calculation of its type, Max Planck astronomers and their colleagues within an international consortium have used a T3E supercomputer to study how the matter in a large fraction of the whole observable Universe evolved into the complex web of walls and filaments discovered in recent galaxy maps.
As the computational speed and memory capacity of computers have increased, it has become possible to simulate the growth of cosmic structure in more and more detail and over larger and larger regions of space. In the last few weeks an international consortium of astronomers has used the 688-processor CRAY T3E parallel supercomputer at the Garching Computing Centre of the Max-Planck-Society to follow the formation of a cosmic web of clusters, filaments and voids in a region comparable in size to the entire observable Universe, the so-called Hubble Volume. This is the largest such computer simulation ever carried out. The simulated volume is a hundred times bigger than even the most ambitious of planned surveys of the real Universe. It provides a preview of the largest structures that may be discovered as these mapping projects are carried out.
Our Universe was very smooth at the early times that we observe directly in the Cosmic Microwave Background Radiation. The weak pattern of ripples seen in this radiation is consistent with predictions of "inflationary" theories for the evolution of the Universe in the first instants after the Big Bang. Astrophysicists believe that the growth of structure from this smooth state was regulated by gravity, and that most of the matter responsible for the gravitational forces is in some as yet unidentified "dark" form. Currently, the most popular idea is that this unseen material consists of free elementary particles of a type never yet detected on Earth. Through their gravitational effects these so-called "Cold Dark Matter" (CDM) particles are responsible for the formation of galaxies, and so, indirectly, of stars, planets and people.
Because Cold Dark Matter is influenced by gravity alone, it is possible to programme a computer to follow its motions as the universe expands, ages, and grows lumpier. In practice it is difficult to simulate the growth of clumps and voids in a very large volume; the machine must then keep track of a very large number of "pieces" of dark matter in order to represent all the structures faithfully. The Hubble Volume calculation followed a billion pieces of CDM from the epoch observed in the background radiation until the present day. It started from the kind of near-uniform state predicted by the inflationary theories, and its goal was to see whether these theories lead to a distribution of material in the present Universe which is consistent with the patterns found in the largest existing (and planned) maps of the galaxy distribution.
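The basic idea of such a gravity-only simulation can be sketched in a few lines. The toy code below is a minimal direct-summation N-body integrator, not the method actually used for the Hubble Volume run (production cosmological codes use tree or particle-mesh algorithms in comoving coordinates with periodic boundaries); all names, units, and parameter values here are illustrative assumptions.

```python
import numpy as np

G = 1.0          # gravitational constant in arbitrary code units (assumption)
SOFTENING = 0.05 # softening length, avoids force singularities at close approach

def accelerations(pos, mass):
    """Pairwise gravitational accelerations for N particles.

    pos: (N, 3) array of positions, mass: (N,) array of masses.
    Each "piece" of matter feels the pull of every other piece.
    """
    diff = pos[None, :, :] - pos[:, None, :]           # (N, N, 3) separations
    dist2 = (diff ** 2).sum(axis=-1) + SOFTENING ** 2  # softened squared distances
    inv_d3 = dist2 ** -1.5
    np.fill_diagonal(inv_d3, 0.0)                      # no self-force
    return G * (diff * (mass[None, :, None] * inv_d3[:, :, None])).sum(axis=1)

def leapfrog_step(pos, vel, mass, dt):
    """One kick-drift-kick leapfrog step, the standard N-body integrator."""
    vel = vel + 0.5 * dt * accelerations(pos, mass)
    pos = pos + dt * vel
    vel = vel + 0.5 * dt * accelerations(pos, mass)
    return pos, vel

# Tiny demo: 100 particles, where the real calculation followed a billion
rng = np.random.default_rng(0)
pos = rng.uniform(0.0, 1.0, size=(100, 3))
vel = np.zeros((100, 3))
mass = np.full(100, 1.0 / 100)
for _ in range(10):
    pos, vel = leapfrog_step(pos, vel, mass, dt=0.01)
```

The cost of direct summation grows as the square of the particle number, which is exactly why a billion-particle run needs cleverer algorithms and a parallel supercomputer.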
This calculation used the full memory capacity of the Garching T3E, one of the top 10 most powerful supercomputers in the world. It required a year of preparation to adapt the computer programmes to a parallel computer and months of effort to maximise the efficiency with which the machine stored and manipulated data. In just 72 hours of execution the calculation produced almost a Terabyte of output data -- that is about a hundred numbers for every person on Earth and enough to fill 800 CD-ROMs. Handling such huge datasets is in itself a difficult task, and processing and making pictures of the results of the Hubble Volume simulation has turned out to be a major part of the computational challenge.
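A rough back-of-envelope calculation shows how a billion particles add up to a Terabyte. The storage layout below (3 position and 3 velocity components per particle, stored as 4-byte floats, a few dozen snapshots) is an illustrative assumption, not the simulation's actual data format.

```python
# Back-of-envelope estimate of the output volume (assumed layout).
N_PARTICLES = 1_000_000_000   # a billion pieces of dark matter
FLOATS_PER_PARTICLE = 6       # 3 position + 3 velocity components
BYTES_PER_FLOAT = 4           # single precision

snapshot_bytes = N_PARTICLES * FLOATS_PER_PARTICLE * BYTES_PER_FLOAT
print(f"one snapshot: {snapshot_bytes / 1e9:.0f} GB")  # one snapshot: 24 GB

# A few dozen snapshots over cosmic history approach a Terabyte:
n_snapshots = 40
total_bytes = n_snapshots * snapshot_bytes
print(f"{n_snapshots} snapshots: {total_bytes / 1e12:.2f} TB")  # 40 snapshots: 0.96 TB
```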
Setting up, carrying out and analysing such a simulation requires effort by many people, and the Virgo Consortium, the group responsible for the Garching calculation, is based in the UK and involves scientists from four countries. The original computer programmes were written by Canadian and British scientists. They were completely restructured for the massively parallel CRAY T3E and adapted for the Hubble Volume problem at the Garching Computer Centre. Scientists at the Max-Planck-Institut fuer Astrophysik took responsibility for managing the calculation and the output data. The overall design of the experiment was coordinated by an American. The results are being studied in all four countries and also in France.
Analysis of the Hubble Volume simulation is still at an early stage but already a number of striking results have emerged. Pictures of slices through the matter distribution show patterns rather like those on the bottom of a sunlit swimming-pool. Walls and filaments of dark matter are seen which are so big that they would barely fit within the largest existing three-dimensional galaxy surveys. The largest low-density regions are so big that huge voids in the galaxy distribution seem certain to show up in the next generation of mapping projects. The most massive clumps of dark matter contain several times more matter than the biggest known galaxy cluster, suggesting that truly giant clusters remain to be discovered.
Computers are now so powerful that calculations like the Hubble Volume simulation can provide pictures of artificial universes with as much structural detail as would be found in a galaxy survey of our entire observable Universe. Of course, only comparison with a real survey will tell us if these pictures, and so the physical assumptions on which they are based, are a true representation of reality.