In November 2015, when NVIDIA CEO Jen-Hsun Huang proposed that “machine learning is high performance computing’s first killer app for consumers,” there was only sketchy evidence to back up that claim. Today, though, it looks like the NVIDIA chief was just a little ahead of his time.
This week in San Francisco, Intel held its first Manufacturing and Technology Day, an event designed to reassure investors and customers that Moore’s Law is alive and well and delivering the cost and performance benefits it has for the last 50 years. However, to make that claim viable, the chipmaker has recast the law to deal with the realities of a slowdown in transistor shrinkage.
The Engineering and Physical Sciences Research Council (EPSRC) has allocated £20 million for six new tier 2 supercomputing centers spread across the United Kingdom. The facilities are aimed at supporting both academic and industrial users and will house medium-sized supercomputers for scientific research and engineering.
As China, the US and Japan near the finish line in the exascale race, the DOE and NSA are sounding the alarm that the United States is at grave risk of losing its dominant position in high performance computing. According to the assessment of the two agencies, “absent aggressive action by the US – the US will lose leadership and not control its own future in HPC.”
The long-predicted demise of Moore’s Law appears to be playing out. Over the last couple of years, Intel and other chipmakers have struggled to keep their semiconductor technology plans on schedule, paving the way for fundamental changes in the computer industry.
Applied Micro announced it is sampling X-Gene 3, its third-generation ARM SoC for servers. According to a report by The Linley Group, the new platform will provide performance comparable to the latest Intel Xeon processors, but at a significantly lower price point.
AMD announced this week it will begin shipping its new “Naples” server CPU in the second quarter of 2017, hoping to disrupt Intel’s hegemony in the server market. The upcoming chip looks to be the first CPU from AMD to offer a credible challenge to Xeon in the datacenter in nearly a decade.
IBM has revealed its intentions to commercialize the quantum computing technology being developed under its research division. Although the company didn’t offer a definitive timeline or even a roadmap for the product set, it set down some markers on what such an endeavor would entail.
Behind the scenes of practically every weather forecast we encounter today are some of the most powerful supercomputers on the planet. An armchair analysis of the world's top systems reveals some interesting aspects about the HPC technologies and machinery being used to generate these forecasts.
The largest Internet company on the planet has made GPU computing available in its public cloud. Google announced this week that it has added the NVIDIA Tesla K80 to its cloud offering, with more graphics processor options on the way. The search giant follows Amazon, Microsoft and others into the GPU rental business.