Free Republic
General/Chat

Biggest ever cosmos simulation ~~ supercomputers used to re-create how the Universe evolved
BBC ^ | Wednesday, 1 June, 2005, 18:44 GMT 19:44 UK | staff

Posted on 06/01/2005 9:55:09 PM PDT by Ernest_at_the_Beach

Astronomers have used supercomputers to re-create how the Universe evolved into the shape it is today.

The simulation by an international team is the biggest ever attempted and shows how structures in the Universe changed and grew over billions of years.

The Millennium Run, as it is dubbed, could help explain observations made by astronomers and shed more light on the Universe's elusive dark energy field.

Details of the study appear in the latest issue of Nature magazine.

"We have learned more about the Universe in the last 10 or 20 years than in the whole of human civilisation," said Professor Carlos Frenk, Ogden professor of fundamental physics at the University of Durham and co-author on the Nature report.

"We are now able, using the biggest, fastest supercomputers in the world, to recreate the whole of cosmic history," he told the BBC.

The researchers looked at how the Universe evolved under the influence of the mysterious material called dark matter.

Dark matter model

According to cosmological theory, soon after the Big Bang, cold dark matter formed the first large structures in the Universe, which then collapsed under their own weight to form vast halos.

The gravitational pull of these halos sucked in normal matter, providing a focus for the formation of galaxies.

[Image: a supercomputer (BBC/Wildcat Films). Caption: Powerful supercomputers were enlisted to create the simulation.]

The simulation tracked some 10 billion dark matter particles over roughly 13 billion years of cosmic evolution. It incorporated data from satellite observations of the heat left over from the Big Bang, information on the make-up of the Universe and current understanding of the laws of physics on Earth.

"What's unique about the simulation is its scope and the level of detail with which we can re-create the cosmic structures we see around us," Professor Frenk commented.

The English Astronomer Royal, Sir Martin Rees, told the BBC: "Now we have the Millennium Run simulations, we have the predictions of the theory in enough detail that we can see if there is a meshing together of how the world looks on the larger scale and the way we expect it should look according to our theories. It's a way to check our theories."

Energy problem

Comparisons between the results of the simulation and astronomical observations are already helping shed light on some unsolved cosmic mysteries.

Some astronomers have previously questioned how radio sources in distant galaxies called quasars could have formed so quickly after the Big Bang under the cold dark matter model.

The Millennium Run simulation demonstrates that such structures form naturally under the model in numbers consistent with data from the Sloan Digital Sky Survey.

The virtual universe may also shed light on the nature of dark energy, which makes up about 73% of the known Universe, and which, Frenk says, is the "number one unsolved problem in physics today - if not science itself".

"Our simulations tell us where to go looking for clues to learn about dark energy. If we want to learn about this we need to look at galaxy clusters, which encode information about the identity of dark energy," Professor Frenk explained.



TOPICS: Astronomy; Computers/Internet

1 posted on 06/01/2005 9:55:10 PM PDT by Ernest_at_the_Beach
[ Post Reply | Private Reply | View Replies]


2 posted on 06/01/2005 9:57:28 PM PDT by Ernest_at_the_Beach (This tagline no longer operative....floated away in the flood of 2005 ,)
[ Post Reply | Private Reply | To 1 | View Replies]

To: All

Video is available at the BBC Website.


3 posted on 06/01/2005 10:01:55 PM PDT by Ernest_at_the_Beach (This tagline no longer operative....floated away in the flood of 2005 ,)
[ Post Reply | Private Reply | To 2 | View Replies]

To: KevinDavis

fyi


4 posted on 06/01/2005 10:03:12 PM PDT by Ernest_at_the_Beach (This tagline no longer operative....floated away in the flood of 2005 ,)
[ Post Reply | Private Reply | To 1 | View Replies]

To: Ernest_at_the_Beach

Looks just a smidge like "Colossus: The Forbin Project." bump, bttt.


5 posted on 06/01/2005 10:03:12 PM PDT by Not now, Not ever! (This tagline is temporarily closed for re-modeling)
[ Post Reply | Private Reply | To 1 | View Replies]

To: Ernest_at_the_Beach; All
Cool (Free!) Astronomy-related Software:
Please FReepmail other suggestions
  • Celestia: (GET THIS ONE! -- m_f) A real-time space simulation that lets you experience our universe in three dimensions. Unlike most planetarium software, Celestia doesn't confine you to the surface of the Earth. You can travel throughout the solar system, to any of over 100,000 stars, or even beyond the galaxy. All travel in Celestia is seamless; the exponential zoom feature lets you explore space across a huge range of scales, from galaxy clusters down to spacecraft only a few meters across. A 'point-and-goto' interface makes it simple to navigate through the universe to the object you want to visit.
  • Sky Screen Saver: Shows the sky above any location on Earth, including stars (from the Yale Bright Star Catalogue of more than 9000 stars to the 7th magnitude), the Moon in its correct phase and position in the sky, and the position of the Sun and all the planets in the sky.
    Outlines, boundaries, and names of constellations can be displayed, as well as names and Bayer/Flamsteed designations of stars brighter than a given threshold. A database of more than 500 deep-sky objects, including all the Messier objects and bright NGC objects can be plotted to a given magnitude. The ecliptic and celestial equator can be plotted, complete with co-ordinates.
  • Home Planet: A comprehensive astronomy / space / satellite-tracking package for Microsoft Windows 95/98/Me and Windows NT 4.0/2000/XP and above. Selected features:
    • An earth map, showing day and night regions, location of the Moon and current phase, and position of a selected earth satellite. Earth maps can be customised and extended.
    • Position and phase data for the Sun and Moon.
    • Panel showing positions of planets and a selected asteroid or comet, both geocentric and from the observer's location.
    • A sky map, based on either the Yale Bright Star Catalogue or the 256,000 star SAO catalogue, including rendering of spectral types, planets, earth satellites, asteroids and comets.
    • Databases of the orbital elements of 5632 asteroids and principal periodic comets are included, allowing selection of any for tracking.
    • A telescope window which can be aimed by clicking in the sky map or telescope itself, by entering coordinates, or by selecting an object in the Object Catalogue.
    • A horizon window which shows the view toward the horizon at any given azimuth.
    • Object Catalogue allows archiving images, sounds, and tabular data about celestial objects.
    • Orrery allows viewing the solar system, including a selected asteroid or comet, from any vantage point in space, in a variety of projections.
    • Satellite tracking panel. Select an Earth satellite from a database of two-line elements, and see its current position and altitude.
    • View Earth From panel allows you to view a texture-mapped image of the Earth as seen from the Sun, Moon, a selected Earth satellite, above the observing location, or the antisolar point.
    • Satellite database selection allows maintenance of multiple lists of satellites, for example TV broadcast, ham radio, low orbit, etc.
  • Cartes du Ciel Sky Charts: Enables you to draw sky charts, making use of the data in 16 catalogs of stars and nebulae. In addition, the positions of planets, asteroids and comets are shown.
  • SETI@Home: A scientific experiment that uses Internet-connected computers in the Search for Extraterrestrial Intelligence (SETI). You can participate by running a free program that downloads and analyzes radio telescope data.

6 posted on 06/01/2005 10:10:05 PM PDT by martin_fierro (Chat is my milieu)
[ Post Reply | Private Reply | To 1 | View Replies]

Comment #7 Removed by Moderator

To: Ernest_at_the_Beach
I can't be sure, but from the URL on that supercomputer picture, which ends in the string "superc_wildcat_203.jpg", I suspect it is a BBC stock photo of a supercomputer that happens to be a Sun Wildcat, and not the computer used for this research.

The system used for this research is the principal supercomputer at the Max Planck Society's Supercomputing Centre in Garching, Germany, the Regatta, as described at http://www.rzg.mpg.de/computing/IBM_P/hardware.html:


The IBM pSeries Supercomputer

The pSeries "Regatta" system is based on Power 4 processor technology.
Node characteristics: 32-way "Regatta" compute nodes (eServer p690), equipped with 1.3 GHz Power 4 processors, with a peak performance of 166 GFlop/s and 64 GB of main memory per node.

An 8-processor test system was installed in October 2001.

From January 2002, a six-node system with an aggregate performance of 1 TFlop/s and 0.5 TB of main memory was in operation.

In mid-2002, the system was extended to 22 compute nodes and 2 I/O nodes, with an aggregate peak performance of 3.8 TFlop/s and 1.8 TB of main memory.

In Jan/Feb 2003, the system was moved to the new computer building, with significantly increased disk space.

After initial tests of the new IBM High Performance Switch (HPS, the "Federation Switch") beginning in early September 2003, the system was upgraded to this fast node-interconnect technology in December 2003, together with an expansion to 4.2 TFlop/s peak performance.

There are now 25 compute nodes and 2 I/O nodes, connected by the High Performance ("Federation") Switch, with four links per Regatta node.
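
(A quick consistency check on those figures -- my own arithmetic, not from the MPG page. If you assume the usual 4 floating-point operations per clock for a Power 4 processor, i.e. two FPUs each doing a fused multiply-add, the per-node and system peaks quoted above line up:)

# Sanity check of the quoted peak figures (my arithmetic; the 4 FLOPs/cycle
# for Power 4 -- two FPUs, each doing a fused multiply-add -- is an assumption).
clock_hz = 1.3e9         # 1.3 GHz Power 4 processors
flops_per_cycle = 4      # assumed: 2 FPUs x fused multiply-add
cpus_per_node = 32       # 32-way p690 "Regatta" node
compute_nodes = 25       # compute nodes after the December 2003 expansion

node_peak = clock_hz * flops_per_cycle * cpus_per_node
system_peak = node_peak * compute_nodes
print(f"per-node peak: {node_peak / 1e9:.1f} GFlop/s")     # ~166.4 GFlop/s
print(f"system peak:   {system_peak / 1e12:.2f} TFlop/s")  # ~4.16 TFlop/s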


The Max Planck Institute press release for this Nature article, "Supercomputer Simulations explain the Formation of Galaxies and Quasars in the Universe", explains quite a bit more about this work.

Here's the cover of the Nature magazine with the article:


Nature - 2 June 2005.
And here is one of the several impressive images linked off the above press release:

8 posted on 06/02/2005 12:33:41 AM PDT by ThePythonicCow (To err is human; to moo is bovine.)
[ Post Reply | Private Reply | To 1 | View Replies]

To: ThePythonicCow
An article on a blog, posted last year under the title "Simulating the Whole Universe", explains some of the computing technology behind this work in more detail:

Saturday, 4 September 2004

 

An international group of cosmologists, the Virgo Consortium, has carried out the first simulation of the entire universe, starting 380,000 years after the Big Bang and going up to now. In "Computing the Cosmos," IEEE Spectrum writes that the scientists used a 4.2-teraflops system at the Max Planck Society's Computing Center in Garching, Germany, to do the computations. The whole universe was simulated by ten billion particles, each having a mass a billion times that of our sun. As it was necessary to compute the gravitational interactions between each of the ten billion mass points and all the others, a task that would have needed about 60,000 years of computer time, the computer scientists devised a couple of tricks to reduce the amount of computation. And in June 2004, the first simulation of our universe was completed. The resulting data, which represents about 20 terabytes, will be available to everyone in the months to come, at least to people with a high-bandwidth connection. Read more...

Here is a general overview of the project.

The group, dubbed the Virgo Consortium -- a name borrowed from the galaxy cluster closest to our own -- is creating the largest and most detailed computer model of the universe ever made. While other groups have simulated chunks of the cosmos, the Virgo simulation is going for the whole thing. The cosmologists' best theories about the universe's matter distribution and galaxy formation will become equations, numbers, variables, and other parameters in simulations running on one of Germany's most powerful supercomputers, an IBM Unix cluster at the Max Planck Society's Computing Center in Garching, near Munich.

Now, here are some details about this cluster -- and its limitations.

The machine, a cluster of powerful IBM Unix computers, has a total of 812 processors and 2 terabytes of memory, for a peak performance of 4.2 teraflops, or trillions of calculations per second. It took 31st place late last year in the Top500 list, a ranking of the world's most powerful computers by Jack Dongarra, a professor of computer science at the University of Tennessee in Knoxville, and other supercomputer experts.
But as it turns out, even the most powerful machine on Earth couldn't possibly replicate exactly the matter distribution conditions of the 380,000-year-old universe the Virgo group chose as the simulation's starting point. The number of particles is simply too large, and no computer now or in the foreseeable future could simulate the interaction of so many elements.

To understand why such a powerful system cannot handle this simulation in a reasonable amount of time, we need to look at the parameters of this simulation.

The fundamental challenge for the Virgo team is to approximate that reality in a way that is both feasible to compute and fine-grained enough to yield useful insights. The Virgo astrophysicists have tackled it by coming up with a representation of that epoch's distribution of matter using 10 billion mass points, many more than any other simulation has ever attempted to use.
These dimensionless points have no real physical meaning; they are just simulation elements, a way of modeling the universe's matter content. Each point is made up of normal and dark matter in proportion to the best current estimates, having a mass a billion times that of our sun, or 2000 trillion trillion trillion (2 x 10^39) kilograms. (The 10 billion particles together account for only 0.003 percent of the observable universe's total mass, but since the universe is homogeneous on the largest scales, the model is more than enough to be representative of the full extent of the cosmos.)
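
(A quick check of that per-particle mass figure, using my own value for the solar mass rather than anything from the article:)

# Check of the per-particle mass quoted above (my arithmetic, not the article's).
solar_mass_kg = 1.989e30                 # one solar mass in kilograms
particle_mass_kg = 1e9 * solar_mass_kg   # each mass point ~ a billion solar masses
print(f"{particle_mass_kg:.1e} kg")      # ~2.0e+39 kg, i.e. 2 x 10^39 kg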

With these ten billion points, the Virgo team faced a serious challenge.

The software [astrophysicist Volker Springel] and his colleagues developed calculates the gravitational interactions among the simulation's 10 billion mass points and keeps track of the points' displacements in space. It repeats these calculations over and over, for thousands of simulation time steps.
The simulation, therefore, has to calculate the gravitational pull between each pair of mass points. That is, it has to choose one of the 10 billion points and calculate its gravitational interaction with each of the other 9,999,999,999 points, even those at the farthest corners of the universe. Next, the simulation picks another point and does the same thing again, with this process repeated for all points. In the end, the number of gravitational interactions to be calculated reaches 100 million trillion (1 followed by 20 zeros), and that's just for one time step of the simulation. If it simply chugged through all of the thousands of time steps of the Millennium Run, the Virgo group's supercomputer would have to run continuously for about 60,000 years.
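
(A rough reconstruction of that estimate in a few lines. The cost per pair interaction and the number of time steps below are illustrative assumptions, not figures from the article, but with plausible values the brute-force approach does indeed land in the tens-of-thousands-of-years range quoted:)

# Rough consistency check of the brute-force cost quoted above.
# flops_per_pair and time_steps are illustrative assumptions, not article figures.
n_particles = 10e9            # 10 billion mass points
peak_flops = 4.2e12           # 4.2 TFlop/s machine
flops_per_pair = 20           # assumed cost of one gravitational pair force
time_steps = 5_000            # "thousands of simulation time steps"

pairs_per_step = n_particles ** 2                       # ~1e20, "1 followed by 20 zeros"
seconds = pairs_per_step * flops_per_pair * time_steps / peak_flops
years = seconds / (3600 * 24 * 365)
print(f"pair interactions per step: {pairs_per_step:.1e}")
print(f"brute-force runtime: ~{years:,.0f} years")      # ~75,000 years with these assumptions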

Because that was obviously unacceptable, Springel and his colleagues used a couple of tricks to reduce the amount of computation.

First, the researchers divided the simulated cube into several billion smaller volumes. During the gravitational calculations, points within one of these volumes are lumped together -- their masses are summed. So instead of calculating, say, a thousand gravitational interactions between a given particle and a thousand others, the simulation uses an algorithm to perform a single calculation if those thousand points happen to fall within the same volume. For points that are far apart, this approximation doesn't introduce notable errors, while it does speed up the calculations significantly.
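
(Here is a minimal toy sketch of that lumping idea: bucket the points into cells, sum each cell's mass at its centre of mass, and evaluate far-field forces against the cell aggregates. This is only an illustration of the approximation, not the Virgo group's production code; the grid size, softening and particle counts are arbitrary.)

import numpy as np

# Toy sketch of the cell-lumping approximation described above (NOT the Virgo
# code): distant particles sharing a cell are replaced by their total mass
# placed at their centre of mass.
rng = np.random.default_rng(0)
n, n_cells, G = 10_000, 16, 1.0                 # tiny problem, unit gravity
pos = rng.random((n, 3))                        # particle positions in a unit box
mass = np.full(n, 1.0 / n)                      # equal-mass particles

# Bucket particles into an n_cells^3 grid; accumulate mass and mass-weighted position.
idx = np.minimum((pos * n_cells).astype(int), n_cells - 1)
flat = np.ravel_multi_index((idx[:, 0], idx[:, 1], idx[:, 2]), (n_cells,) * 3)
cell_mass = np.bincount(flat, weights=mass, minlength=n_cells**3)
cell_mx = np.stack([np.bincount(flat, weights=mass * pos[:, k], minlength=n_cells**3)
                    for k in range(3)], axis=1)
occ = cell_mass > 0
com = cell_mx[occ] / cell_mass[occ][:, None]    # centre of mass of each occupied cell
M = cell_mass[occ]                              # total mass of each occupied cell

def lumped_accel(p, eps2=1e-4):
    """Gravitational acceleration on point p from the cell aggregates (softened)."""
    d = com - p
    r2 = (d * d).sum(axis=1) + eps2             # softening avoids divergences
    return G * (M[:, None] * d / r2[:, None] ** 1.5).sum(axis=0)

print(lumped_accel(np.array([0.5, 0.5, 0.5])))  # one force evaluation

(In the real code the nearby points are of course not lumped this crudely; that is what the short-distance method described next is for.)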

They used another method for short distance interactions.

Springel developed new software with what is called a tree algorithm to simplify and speed up the calculations for this realm of short-distance interactions. Think of all 10 billion points as the leaves of a tree. Eight of these leaves attach to a stem, eight stems attach to a branch, and so on, until all the points are connected to the trunk. To evaluate the force on a given point, the program climbs up the tree from the root, adding the contributions from branches and stems found along the way until it encounters individual leaves. This trick reduces the number of required calculations from an incomputable n² to a much more manageable n log₁₀ n, says Springel.
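
(For concreteness, here is a minimal Barnes-Hut-style sketch of that kind of tree walk -- a toy octree in Python, not GADGET or the Virgo group's code; the opening angle THETA, the softening EPS2 and the test data are all illustrative assumptions. Each node stores the total mass and centre of mass of its subtree, and a sufficiently distant subtree is treated as a single lump.)

import numpy as np

# Toy Barnes-Hut-style octree (a sketch of the "tree algorithm" idea above,
# NOT the Virgo/GADGET code). THETA, EPS2 and the data are illustrative.
THETA, EPS2, G = 0.5, 1e-6, 1.0

class Node:
    def __init__(self, center, size):
        self.center, self.size = center, size     # cube centre and edge length
        self.mass, self.com = 0.0, np.zeros(3)    # subtree mass and centre of mass
        self.children, self.body = None, None     # 8 children, or one stored body

    def insert(self, p, m):
        if self.children is None and self.mass == 0.0:    # empty leaf: keep the body
            self.body, self.mass, self.com = (p, m), m, p.copy()
            return
        if self.children is None:                          # occupied leaf: split it
            half = self.size / 4.0
            self.children = [Node(self.center + half *
                                  np.array([(i >> k & 1) * 2 - 1 for k in range(3)]),
                                  self.size / 2.0) for i in range(8)]
            old_p, old_m = self.body
            self.body = None
            self._push(old_p, old_m)
        self._push(p, m)
        self.com = (self.com * self.mass + p * m) / (self.mass + m)
        self.mass += m

    def _push(self, p, m):
        i = sum(1 << k for k in range(3) if p[k] > self.center[k])
        self.children[i].insert(p, m)

    def accel(self, p):
        d = self.com - p
        r = np.sqrt(d @ d + EPS2)
        if self.children is None or self.size / r < THETA:  # far enough: one lump
            return G * self.mass * d / r**3
        return sum(c.accel(p) for c in self.children if c.mass > 0.0)

rng = np.random.default_rng(1)
pts = rng.random((2_000, 3))                     # toy particle set in a unit box
root = Node(np.array([0.5, 0.5, 0.5]), 1.0)
for q in pts:
    root.insert(q, 1.0 / len(pts))
print(root.accel(np.array([0.25, 0.25, 0.25])))  # force on one test point

(The opening-angle test, cell size divided by distance smaller than THETA, is what lets whole distant subtrees be evaluated as single lumps, so each force evaluation touches on the order of log n nodes rather than n.)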

After these two tricks were introduced into the software, the simulation started. And it was completed in June 2004, generating about 20 terabytes of results. These results, which represent 64 snapshots of a virtual universe, will be available to all of us in the months to come. But who will really have access to such an amount of data outside universities and research centers? My guess is that the Virgo Consortium will find a way to reduce the size of the snapshots for regular folks. So stay tuned for the next developments.
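
(A rough breakdown of that data volume -- my arithmetic; the per-particle storage format at the end is a guess, not something stated in the article.)

# 20 TB of output spread over 64 snapshots of 10 billion particles.
total_bytes = 20e12
snapshots = 64
particles = 10e9

per_snapshot = total_bytes / snapshots      # ~312 GB per snapshot
per_particle = per_snapshot / particles     # ~31 bytes per particle
print(f"{per_snapshot / 1e9:.0f} GB per snapshot, {per_particle:.0f} bytes per particle")
# ~31 bytes/particle would fit, e.g., single-precision position and velocity
# (24 bytes) plus a particle identifier -- a guess, not the actual format.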

Source: Alexander Hellemans & Madhusree Mukerjee, "Computing the Cosmos," IEEE Spectrum, Vol. 41, No. 8, p. 28, August 2004



9 posted on 06/02/2005 12:40:01 AM PDT by ThePythonicCow (To err is human; to moo is bovine.)
[ Post Reply | Private Reply | To 8 | View Replies]

To: ThePythonicCow
Oops - first line of previous post should have been:
10 posted on 06/02/2005 12:41:38 AM PDT by ThePythonicCow (To err is human; to moo is bovine.)
[ Post Reply | Private Reply | To 9 | View Replies]

To: ThePythonicCow
Ah thanks...

The early beginnings of the next Supercomputer strain:

IBM unsheathes Cell blade server

11 posted on 06/02/2005 4:59:31 AM PDT by Ernest_at_the_Beach (This tagline no longer operative....floated away in the flood of 2005 ,)
[ Post Reply | Private Reply | To 9 | View Replies]

To: Not now, Not ever!
Looks just a smidge like "Colossus: The Forbin Project."
12 posted on 06/02/2005 5:16:04 AM PDT by Bloody Sam Roberts (If you only knew the powerrrrr of the Tagline.)
[ Post Reply | Private Reply | To 5 | View Replies]

To: ThePythonicCow
More:

Supercomputer simulation explains the formation of galaxies and quasars in the Universe (German-language press release)

13 posted on 06/02/2005 5:16:20 AM PDT by Ernest_at_the_Beach (This tagline no longer operative....floated away in the flood of 2005 ,)
[ Post Reply | Private Reply | To 10 | View Replies]

To: martin_fierro

Updates above.


14 posted on 06/02/2005 5:16:47 AM PDT by Ernest_at_the_Beach (This tagline no longer operative....floated away in the flood of 2005 ,)
[ Post Reply | Private Reply | To 6 | View Replies]

To: martin_fierro

Mark for cool links. Thanks.


15 posted on 06/02/2005 5:24:11 AM PDT by Jack of all Trades
[ Post Reply | Private Reply | To 6 | View Replies]

To: Ernest_at_the_Beach
Better Link for terrific pictures:

Supercomputer simulations explain the formation of galaxies and quasars in the universe

16 posted on 06/02/2005 5:32:48 AM PDT by Ernest_at_the_Beach (This tagline no longer operative....floated away in the flood of 2005 ,)
[ Post Reply | Private Reply | To 13 | View Replies]

To: Ernest_at_the_Beach
IBM's Cell processors, as well as the dual-core processors from AMD and Intel and the multi-core processors at the heart of the next generation of Xbox and PlayStation video game consoles, are all part of an inevitable trend that has been in progress for half a century now.

There are several natural package sizes for computers - a room, a filing-cabinet-sized box, a rack or drawer, a board, a processor package, and now a CPU core within such a processor package.

The fundamental change underlying pretty much all else for the last half century has been the shrinking of logic - the size of a bit of storage or a logic gate. As it shrank, first one of the above natural package sizes, then perhaps a decade later the next size down, became the natural size of a single central processing unit (CPU) - the thing that executes a single stream of logical instructions on a set of data.

Then, as the natural size shrank a little further, we put more than one CPU in a package. First it was multiple CPUs in a room, then in a box, then in a rack, then on a board, and now in a processor package. There have always been practical limits on how many CPUs we could cram into any particular package size, due to limits on how fast we could pump data into them and extract data and heat from them. But at each size, we have quickly pushed to get perhaps 2 or 8 or 32 CPUs into a package.

Before now, this work on parallel computing always occurred alongside the work of continuing to shrink CPUs down to fit the next natural package size.

Due to the enormous expense of original design work on CPUs at the wafer level, and due to the enormous constraints that the major players (basically Intel and IBM) can impose on anyone trying to break in at that level, I do not think there is a smaller package size.

The processor package, that square inch of branded plastic, semiconductor, copper, aluminum and ceramic, is the last natural size.

Now the use of multiple CPUs has arrived at the processor-package size, and such parts will ship in rapidly increasing quantities this year and over the next few years.

The breathtaking and rapid evolution of computer and hardware architectures over the last fifty years enters a new phase. Like the automobile, which has not changed its packaging in any fundamental way in a century, the computer has now arrived at its full range of packaging, from the large room fulling big honkin' NUMA iron shown in the pictures above to the six or seven cores in the processor package of next year's PlayStation.

This range of packaging sizes is now established, and we now enter a period of refinement and elaboration.

17 posted on 06/02/2005 5:36:47 AM PDT by ThePythonicCow (To err is human; to moo is bovine.)
[ Post Reply | Private Reply | To 11 | View Replies]

To: ThePythonicCow
"fulling" should have been "filling"
18 posted on 06/02/2005 5:40:12 AM PDT by ThePythonicCow (To err is human; to moo is bovine.)
[ Post Reply | Private Reply | To 17 | View Replies]

To: ThePythonicCow
" I do not think there is a smaller package size."

Depends. Ever hear of quantum computing? The physical size of the thingie might be limited (it should be big enough to see, I guess), but there could be millions of processors on a "chip."

--Boris

19 posted on 07/07/2005 6:19:49 PM PDT by boris (The deadliest weapon of mass destruction in history is a leftist with a word processor.)
[ Post Reply | Private Reply | To 17 | View Replies]

To: boris
I still doubt we will see a new kind of package that is the next scale down from current ICs.

Package sizes are determined by economics, thermodynamics and memory bandwidth, not just by device size.

We have separate processor ICs, circuit boards, racks, cabinets and rooms because it is practical for different companies to compete at a given size, to depend on separate suppliers at the next size down, and to sell into markets at the next size up. This depends on it being fairly cheap to integrate packages of one size into a package of the next size.

I just don't see widespread use of low-cost means of integrating pinhead-sized quantum CPUs into a single IC across corporate boundaries.

And except for specialized programs requiring very regular parallel operations, putting thousands or millions of CPUs into a single IC is useless, because you can't get the data in and out -- memory bandwidth will be limited by the cross-IC interconnect technology.

I don't know enough about quantum computers to know whether they will run dramatically cooler - if they do, then at least the thermodynamic hurdles for such a beast will be lower.

Device-size reductions have ruled the development of computers for half a century now, a trend known to many as Moore's Law.

Nothing grows (or shrinks ;) forever at the rate at which computer device sizes have been shrinking. The delightful insanity of the last half century will end.

20 posted on 07/07/2005 8:27:48 PM PDT by ThePythonicCow (To err is human; to moo is bovine.)
[ Post Reply | Private Reply | To 19 | View Replies]

