DOE Shifts Exascale Plans into High Gear, Asks Supercomputing Vendors to Submit Proposals

The US Department of Energy (DOE) has announced a Request for Proposals (RFP) to develop at least two new exascale supercomputers for the DOE at a cost of up to $1.8 billion. The RFP was issued under the moniker of CORAL-2, representing the second phase of procurements for exascale machinery for Oak Ridge, Argonne, and Lawrence Livermore national labs.

Where Memory Meets Storage

The last 18 months have seen NVMe drives rise to the forefront of high-performance storage technology, becoming the preferred option for anyone with sufficient budget to include them in their procurement. Only a paucity of PCIe lanes and an unfortunately timed spike in NAND prices have tempered the growth of these high-end storage sales. But both are temporary problems.

NVIDIA Refreshes V100 GPUs, Upgrades DGX Lineup

NVIDIA kicked off this year’s GPU Technology Conference (GTC 2018) with a trio of announcements about new Tesla products, including an upgraded V100, an NVLink switch, and a new DGX-2 machine learning platform.

OpenPower Gathers Momentum with Major Deployments

The prospects for OpenPower got a big boost this week with the announcement that Google has deployed Power9 servers in its datacenters. That revelation was joined by the news that Tencent, PayPal, Uber, Alibaba, and Limelight Networks are all adopting Power-based technology to support their various businesses. The announcements were made at the OpenPower Summit taking place in Las Vegas, Nevada.

TOP500 Meanderings: HPCG Gains Steam as Alternative Benchmark for Supercomputers

After just three years in the field, the High Performance Conjugate Gradients (HPCG) benchmark is emerging as the first viable new metric for the high performance computing crowd in decades. The latest HPCG list, compiled last November, shows 115 supercomputer entries spread across 16 countries.
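At its core, HPCG exercises the conjugate gradient (CG) iteration, a memory-bandwidth-bound sparse solver workload that stresses systems very differently from the dense matrix math of Linpack. The real benchmark runs a preconditioned CG over distributed sparse data structures; the toy dense version below is only a sketch of the underlying iteration, not the benchmark itself.

```python
# Minimal sketch of the conjugate gradient iteration at the heart of
# HPCG. The actual benchmark adds a multigrid preconditioner and
# distributed sparse matrices; this dense toy version just shows the
# core kernel (matrix-vector products plus vector updates).
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Solve A x = b for a symmetric positive-definite matrix A."""
    x = np.zeros_like(b)
    r = b - A @ x          # initial residual
    p = r.copy()           # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)   # step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:   # converged
            break
        p = r + (rs_new / rs_old) * p  # new conjugate direction
        rs_old = rs_new
    return x

# Tiny example system (hypothetical values, for illustration only)
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

Because each iteration is dominated by a sparse matrix-vector product, sustained HPCG performance tracks memory bandwidth rather than peak floating-point throughput, which is why HPCG scores are typically a small fraction of a machine's Linpack number.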