My favorite part of the article is the photo that accompanies it. Two of my scientific visualizations appear in it: the red-and-yellow picture of an Alzheimer's plaque being attacked by drugs (behind the N of TITAN) and the silver structure of a proposed ultracapacitor made from nanotubes (to the right of the N).
Please do your homework first. While the supercomputers at Lawrence Livermore, Los Alamos, and Sandia National Laboratories are primarily used for nuclear weapons work, keeping the country's huge stockpile safe and reliable is a gigantic job, especially if you don't want to actually detonate any of the warheads. That's the trick: simulate the ENTIRE weapon, from high-explosive initiation all the way to final weapon delivery, with all of the hydrodynamics, chemistry, materials science, nuclear physics, and thermodynamics modeled accurately enough to say with confidence that the entire stockpile is reliable and safe. Hard job! Someone likened it to having a fleet of thousands of cars that you can never start, but must certify as road-worthy the instant you turn the key. For 50 years.
But let's go past this. Three other Department of Energy laboratories also have major computing centers: Oak Ridge, Argonne, and Lawrence Berkeley National Laboratories. Beyond the nuclear weapons work that the first three labs do, all six labs use their massive computing power to advance the understanding of the Earth's changing climate, develop new materials, design new battery technologies, design new drugs, improve energy efficiency in vehicles and buildings, understand geology and groundwater propagation, help develop new power grid systems, design technologies for carbon sequestration, and delve into the origins of the universe. "Left over from the glory years"? Hardly.
And let's go beyond the Department of Energy. The National Science Foundation, as you suggest, has funded high-performance computing for years. The NSF funds at least five major computing centers serving an even wider range of scientific computing endeavors: the San Diego Supercomputer Center, the Pittsburgh Supercomputing Center, the National Center for Supercomputing Applications (NCSA) at the University of Illinois, the Texas Advanced Computing Center (TACC) at the University of Texas at Austin, and the National Institute for Computational Sciences (NICS) at the University of Tennessee, Knoxville. If you want to get a small sense of what the NSF funds in this area, look at the XSEDE web site (https://www.xsede.org/).
(Disclaimer: I work for Oak Ridge National Laboratory's supercomputing center, have worked at Lawrence Livermore National Laboratory's supercomputing center, and am currently helping to run the University of Tennessee NICS computing center.)
"Why waste negative entropy on comments, when you could use the same entropy to create bugs instead?" -- Steve Elias