Over the course of the last year, our writings in The Hill have addressed emerging technology issues including big data, the Internet of Things, automation/artificial intelligence and cybersecurity. Our focus has been the growing implications of these new technologies for both the public and private sectors. There is a common thread that ties all these tech innovations together: the collection, analysis and utilization of data via advanced computing capabilities, specifically supercomputing and high-performance computing (HPC).
In today’s world, computing rules almost all that we do. The exponential growth of data and its uses directly impacts the critical infrastructure of society, including healthcare, security, transportation, communications and energy. Organizing, managing and analyzing that data is more important than ever. The U.S. military and the intelligence community depend on maintaining a qualitative edge in processing power, which factors into the design, creation and operation of many technologies and programs of national security interest. Supercomputing and its corollary, high-performance computing, have become the mechanisms for those vital tasks.
Seymour Cray is commonly referred to as the “father of supercomputing,” and his company, Cray Research, is still a driving force in the industry. Supercomputers are differentiated from mainframe computers by their vast data storage capacities and expansive computational power.
The website Techtarget.com provides a strong working definition of HPC: “the use of parallel processing for running advanced application programs efficiently, reliably and quickly. The most common users of HPC systems are scientific researchers, engineers and academic institutions. Some government agencies, particularly the military, also rely on HPC for complex applications.” HPC works hand-in-hand with supercomputing, as it requires aggregating computing power to address problems and find solutions.
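For readers curious what “parallel processing” means in practice, here is a minimal sketch in Python. It is purely illustrative, not an HPC application: it splits a simple workload across several worker processes, which is, in miniature, the same divide-and-aggregate pattern HPC clusters use at vastly larger scale.

```python
from multiprocessing import Pool

def square(n):
    # Stand-in for a CPU-bound unit of work; real HPC codes run
    # far heavier kernels (simulations, cryptanalysis, modeling).
    return n * n

if __name__ == "__main__":
    # Distribute the inputs across 4 worker processes. Each worker
    # computes its share independently; the results are gathered
    # back into a single list, preserving input order.
    with Pool(processes=4) as pool:
        results = pool.map(square, range(10))
    print(results)
```

On a multi-core machine the ten computations proceed concurrently rather than one after another; HPC systems apply the same idea across thousands of nodes.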