One of the main reasons I got into computing, admittedly a long time ago, was the potential I saw in using this unbelievably powerful new tool to address the needs of the society we live in.
Often at meetings, or when speaking to students at anywhere from the elementary to the post-graduate level, and occasionally to folks who are simply interested in the topic, I work hard to impress upon them the benefits of this technology. It's necessary to start off with a bit of history and education. After all, modern-day computing, at least in its currently recognizable form, began in the late 1930s. I usually dispel right away the notion that it is something beyond the grasp of the ordinary layperson, and instead present it as a major milestone in the timeline of our civilization. I love using analogies and metaphors, so the one I start with is how our world changed when the microscope was first invented. It opened a whole new world to us, even if it did take a few thousand years to arrive. In a similar fashion, when we reversed the position of the lenses to create the telescope, we went from seeing the 'tiny unseen' to the 'distant unseen'.
In both cases, society's perspective changed completely, and knowledge and discovery inevitably followed. Drawing upon this, the world of computing is such an instrument, but one not limited in any way, shape, or form to linear dimensions of any kind. By accelerating the speed of calculation, it allowed us to model any world; dimensions were no longer a bound that could limit us. Such was its power that one could freely imagine well beyond the scope of what was physically possible, and thus use simulations to glimpse into the world of the 'boundless unseen'. If we ever draw a timeline running from the microscope to the telescope, surely there has to be a place on it for the 'compute-scope'.
When presented in such a context, the audience is usually quick to seize upon the significance and vast impact of high performance computing. That becomes the starting point for exploring the various ways in which society can begin applying HPC towards first understanding, and then solving, the problems that plague us. This was also the major reason I started a little discussion group, Innovative Uses of HPC, which I am happy to report has grown steadily, if modestly, over the last couple of years. Granted, there are many pressing problems, and of course we have to rank and prioritize them, depending upon geography, economics, long-term versus short-term impact, and cultural and intellectual needs.
The point here is that, technically, the challenges are very tractable; however, society also needs to grow at a comparable, if not faster, pace than the technology (which is evolving faster than we are). We need to grow ethically, so that we can make the right decisions along this journey of civilization. There are several success stories to share, and I will definitely address those in future musings. But for now, I would like to write about using HPC for something with far more immediate impact: predicting earthquakes.
We all live on this planet, and if there were any superior extraterrestrial life forms (we are still looking for them), they would marvel at how we manage to go about life on this little piece of rock, with its hard external shell and massive, active molten core. This internal ball of energy decides a lot about life for us, and has been doing so since day one of the planet's formation. We often see its effects in volcanoes and, of course, in the drifting and shifting plates we live on, which collide with each other and cause upheaval whenever the stresses finally give way. I am thinking, of course, of the recent earthquake in Nepal, a country where I spent a fraction of my high school days. Yes, several folks have been predicting a major quake there, but most of the population was caught off guard. The result, in loss of life and the destruction of major historical and heritage sites, is something quite terrible indeed. HPC has been steadily applied to earthquake research for a few decades now, but we definitely need to increase our focus on this and all related earth phenomena, to better understand earthquakes and ultimately make them predictable to perhaps within a few days, so we can avoid, or at least prepare for, their effects.
One country that immediately comes to mind is Japan, which perhaps has the most experience and deepest technical expertise regarding earthquakes. Related, of course, are the integrated effects of earthquakes, whether on land or beneath the vast waters. If you simply google 'HPC earthquake Japan', you will get over 110,000 results, a number of which are major HPC sites doing earthquake-related simulation, prediction research, and work on new ways to interpret data. In the USA, LLNL (Lawrence Livermore National Laboratory), along with institutions such as NCSA with Blue Waters and many others too numerous to mention, has done ground-breaking work (excuse the unintended pun) to advance the science and art of earthquake prediction. Special mention must go to the Southern California Earthquake Center for its CyberShake code. I will take this chance to also mention excellent work done in Europe: SeisSOL, recognized with the PRACE ISC Award 2014, is one instance of how seriously this is being taken.
While I am not familiar with similar work in the Asian subcontinent, Africa, or Latin America, I have no doubt readers will share with us the good work being done by scientists and practitioners there, contributing to the body of knowledge and advancing the state of the art in ways that will save lives and reduce trauma in the near future.
Opinions expressed are solely those of the author and do not necessarily represent those of anyone else.