Chinese Supercomputer Runs Record-Breaking Simulation of Universe

Scientists from the Chinese Academy of Sciences have run the largest simulation of the universe to date on TaihuLight, the world’s fastest supercomputer. The record-breaking achievement was described last week in the South China Morning Post, which reported that the supercomputer was able to simulate the early expansion of the universe using 10 trillion virtual particles.

Leading the effort was Gao Liang, chair scientist of the computational cosmology group in the National Astronomical Observatories at the Chinese Academy of Sciences, who said that TaihuLight used 10 million processor cores to accomplish the simulation. The 125-petaflop (peak) machine, which is housed at the National Supercomputing Center in Wuxi, is equipped with a total of 10,649,600 cores.

With a Linpack performance mark of 93 petaflops, TaihuLight has been ranked as the fastest supercomputer in the world since June 2016, according to the TOP500 list. The system contains more computational capacity than the next four top-ranked supercomputers on the list, combined.

TaihuLight is powered by the “ShenWei” SW26010 processor, a 260-core CPU developed in China specifically for HPC work. Because the system uses this custom chip, Gao and his team had to write their own software rather than rely on existing codes developed for more conventional processors.

According to the Post report, the research was made public on July 26, in an article published in Science and Technology Daily, the official newspaper of China’s Ministry of Science and Technology. The computational run itself was performed in May.

Like most codes used to model the universe, the Chinese software is based on N-body simulation, which approximates the motion of particles driven principally by gravitational forces. As more particles are simulated, the computational effort intensifies accordingly, effectively restricting these universe-scale simulations to the very largest supercomputers. As the Post report noted:

“It was only possible to simulate over 1,000 particles with the best computers in 1970s. In recent years scientists reached the trillion-particle level on some of the world’s most powerful machines such as the Titan in the US, the K computer in Japan and Tianhe-2 in Guangzhou.”
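The N-body approach described above is conceptually simple even though it is enormously expensive at scale. A minimal sketch of the core idea, in Python with NumPy, is shown below: every particle feels the gravitational pull of every other particle, and positions are advanced with a leapfrog integrator (a common choice in cosmology codes). This is an illustrative toy, not the actual TaihuLight code, which is unpublished; the simulation units, softening length, and function names here are all assumptions. Production codes replace the O(N²) direct sum with tree or particle-mesh methods to make trillions of particles tractable.

```python
import numpy as np

G = 1.0  # gravitational constant in simulation units (assumed)

def accelerations(pos, mass, softening=1e-2):
    """Direct-summation gravitational accelerations, O(N^2).

    pos: (N, 3) positions; mass: (N,) masses.
    The softening length keeps forces finite during close encounters.
    """
    # Pairwise separation vectors: diff[i, j] = pos[j] - pos[i]
    diff = pos[None, :, :] - pos[:, None, :]            # (N, N, 3)
    dist2 = (diff ** 2).sum(axis=-1) + softening ** 2   # (N, N)
    inv_d3 = dist2 ** -1.5
    np.fill_diagonal(inv_d3, 0.0)                       # no self-force
    # a_i = G * sum_j m_j * (r_j - r_i) / |r_j - r_i|^3
    return G * (diff * (mass[None, :] * inv_d3)[..., None]).sum(axis=1)

def leapfrog_step(pos, vel, mass, dt):
    """One kick-drift-kick leapfrog step, a standard N-body integrator."""
    vel = vel + 0.5 * dt * accelerations(pos, mass)  # half kick
    pos = pos + dt * vel                             # full drift
    vel = vel + 0.5 * dt * accelerations(pos, mass)  # half kick
    return pos, vel
```

For example, two unit-mass particles at rest a unit distance apart will drift toward each other over successive steps, since gravity is purely attractive; each added particle multiplies the pairwise force count, which is why particle counts are the headline metric for these runs.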

The TaihuLight universe simulation broke the record obtained in June by the 19.6-petaflop Piz Daint supercomputer in Switzerland. The Swiss model used 2 trillion particles, which were used to catalogue about 25 billion galaxies. To accomplish this, the Swiss astrophysicists executed their code for 80 hours.

In the TaihuLight effort, the simulation ran for just over an hour, which Gao attributed to other users waiting for the machine. During that relatively short run, the model advanced to tens of millions of years after the Big Bang. The current age of the universe is around 13.7 billion years*.

“This is just a warm-up exercise,” said Gao. “We still have a long way ahead to get what we want.”

* A previous version of this article erroneously stated that the age of the universe is around 1.3 billion years. Although there is some controversy around its true age, most scientists agree it is 13.7 billion years old, plus or minus 0.2 billion years.



John Blommers 1 year, 1 month ago

The article incorrectly reports the age of the universe. According to Wikipedia it's 13.799±0.021 billion years old, not 1.3 billion years.

Suggestion 1: fact check every number in the article.

Suggestion 2: N-body simulations attempt to solve a complex, coupled, nonlinear system. Therefore the system is chaotic and extremely sensitive to initial conditions. I suggest the simulation results are more about bragging rights and less about accurate modeling.
