Why the U.S. National Strategic Computing Initiative (NSCI) Matters

The post below, by Kristin Hansen with Bright Computing, highlights the IDC HPC User Forum’s panel discussions on the U.S. National Strategic Computing Initiative (NSCI). The NSCI was created by executive order on July 29, 2015. It defines a multi-agency framework for furthering U.S. “economic competitiveness and scientific discovery” through orchestrated advances in high performance computing (HPC). Pursuing the objectives outlined by the NSCI will have important consequences for HPC users, HPC vendors, and citizens at large over the next several years.

The NSCI: An “Apollo Project” For Our Time

While launched to relatively little media or public fanfare, the NSCI can reasonably be compared to other sweeping, government-led initiatives designed to accelerate economic, scientific, or other advances in a specific sector.

Various panel speakers at the IDC event cited the U.S. government’s role in expanding rural electrification, fostering space exploration, birthing atomic energy, and – more recently – shaping the Internet as apt comparisons for the NSCI’s vast, transformative potential in the years ahead.

What, exactly, is the NSCI expected to transform? In a nutshell: the world’s capacity to calculate, analyze, and ultimately address some of the most pressing challenges we face as a global citizenry. Think climate change, cancer and other serious illnesses, and economic disparity with its attendant threats to collective security and well-being. Naturally, many purpose-built supercomputers and other high-performance, clustered systems all over the world are already engaged in important efforts to tackle these and other complex problems. They perform calculations, run simulations, model outcomes, and consume massive levels of computational power (think “petaflops”) in the process. These systems, prevalent in academic institutions, government agencies, and regional or national supercomputing centers, can be thought of as “computing intensive” in their theoretical problem-solving approaches.
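To make “petaflops” concrete, here is a rough back-of-the-envelope sketch in Python (not from the original post; the workload size is purely hypothetical) showing how long a fixed simulation would take at different sustained compute rates:

    # Illustrative arithmetic only: time to finish a hypothetical simulation
    # workload at different sustained floating-point rates.
    PETAFLOPS = 1e15   # floating-point operations per second at 1 petaflop/s
    EXAFLOPS  = 1e18   # floating-point operations per second at 1 exaflop/s

    workload = 1e20    # hypothetical simulation needing 10^20 operations

    for label, rate in [("1 petaflop/s", PETAFLOPS),
                        ("100 petaflop/s", 100 * PETAFLOPS),
                        ("1 exaflop/s", EXAFLOPS)]:
        print(f"{label:>15}: {workload / rate / 3600:8.2f} hours")

At one petaflop/s the hypothetical run takes roughly 28 hours; at exascale – the performance target the NSCI names – the same workload finishes in under two minutes.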

Meanwhile, across various commercial sectors, a different kind of computational discipline is taking hold, as organizations grapple to discern meaning from reams and reams of actual market and consumer data. As more and more industries make the painful shift to “big data,” a.k.a. data-driven decision-making and analysis, they typically cobble together large data warehouses with the capacity to store, filter, and analyze enormous volumes of data (think “petabytes”) in real time or near-real time. These “data intensive” environments can look very different from their “computing intensive” counterparts, tending to be built from off-the-shelf, commodity components rather than developed as purpose-built systems.
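As a minimal, hypothetical sketch of that “data intensive” style (the file path and record format below are assumptions, not from the original post), this Python snippet streams a large event log in chunks and aggregates counts as it goes, rather than loading everything into memory – the per-node half of the map/reduce pattern these commodity clusters popularized:

    # Minimal sketch: stream a large, comma-separated event log in chunks and
    # aggregate on the fly instead of loading it all into memory.
    from collections import Counter

    def aggregate_events(path, chunk_bytes=1_000_000):
        counts = Counter()
        with open(path) as f:
            while True:
                lines = f.readlines(chunk_bytes)   # read roughly chunk_bytes at a time
                if not lines:
                    break
                counts.update(line.split(",", 1)[0] for line in lines)  # key = first field
        return counts

    # On a real cluster, each commodity node would run this over its own shard
    # and a reduce step would merge the per-node Counters.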

Combining the best of computational and data-driven approaches

The NSCI lists five key objectives (read the document here in its entirety), which collectively seem to underscore one overarching goal: to revolutionize our problem-solving capabilities by combining the best attributes of today’s “computing intensive” and “data intensive” architectures. On the one hand, systems that can perform complex modeling and simulation to derive insightful theoretical outcomes. On the other hand, architectures fast and nimble enough to process and respond to massive volumes of real – rather than theoretical – information. To cite just one example, think of how climate projections could be enhanced by routinely applying the latest modeling techniques alongside decades, or even centuries, of actual trended data.
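As a toy-scale illustration of that blend (entirely synthetic numbers, not from the original post), the sketch below fits a trend to “observed” data and averages it with a simple model projection – the combination of actual trended data and simulation that the NSCI envisions at vastly larger scale:

    # Toy illustration only: combine a statistical fit of "observed" data with
    # a crude model projection. All values are made up.
    import numpy as np

    years = np.arange(1950, 2021)
    observed = 0.015 * (years - 1950) + np.random.default_rng(0).normal(0, 0.1, years.size)

    # "Data intensive" half: fit a linear trend to the historical record.
    slope, intercept = np.polyfit(years, observed, 1)

    # "Computing intensive" half (stand-in): a simple model projecting forward.
    future = np.arange(2021, 2051)
    model_projection = 0.02 * (future - 1950)

    # Blend the two estimates for the projection period.
    blended = 0.5 * (slope * future + intercept) + 0.5 * model_projection
    print(f"Fitted trend: {slope:.3f} units/year; blended 2050 estimate: {blended[-1]:.2f}")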

Supercomputing for the masses

What’s another way of saying all this? In essence, the NSCI will accelerate the convergence of high performance computing (HPC) and big data. Initially, a handful of federal agencies – the Department of Defense, the Department of Energy, and the National Science Foundation, to name a few – will spearhead broader interagency initiatives to simplify and standardize supercomputing platforms, expand availability of technical training and skills, sweep away technical roadblocks (e.g., the practical limits of Moore’s Law), and achieve resulting leaps and bounds in computing performance (i.e., exascale).

Most of these initiatives will focus on enhancing performance at the very upper end of the supercomputing spectrum: the “top” of the Top 500.

Over time, however, as with other broad government initiatives, several “trickle down” benefits are expected to emerge across broader segments of the economy. The fifth stated objective of the NSCI, to “ensure that the benefits of the research and development advances are, to the greatest extent, shared between the United States Government and industrial and academic sectors,” speaks directly to this principle.

As supercomputing becomes more powerful, more standardized, better staffed, and more capable of embracing a blend of computational and data-driven approaches to problem solving, all sectors and tiers of the economy stand to benefit. The convergence of HPC and big data – of “petaflops” and “petabytes” – will bring supercomputing to the masses, enabling more and more of us to participate in solving the world’s biggest challenges.

Read the post here.

To learn more about Bright Computing’s management capabilities for HPC and big data clusters, click here.

To watch extended video coverage of the IDC HPC User Forum panel on the National Strategic Computing Initiative, provided by InsideHPC, click here.


More Stories By Bob Gourley

Bob Gourley writes on enterprise IT. He is a founder and partner at Cognitio Corp and publisher of CTOvision.com.