Granted, but I would have expected that this flood of hacked information would be showing up in the black markets somewhere. As I recall, the way we first learned of the Target hack job was because the stolen information was showing up in these markets and was being used. Is there any evidence that this is the case for this treasure trove of information?
OK, so if the site is so damned vulnerable why hasn't it been cracked by a Black Hat yet? Access to this sort of information is the wet dream of most hackers-for-hire. TFA quotes a Government person saying that the site is secure. The White Hat hackers say it isn't. Unless someone is lying about there having been no break-ins yet, then I have a hard time accepting that the site is a plum waiting to be picked by the next script kiddie that comes along. I could see that there would be a desire to cover up any hack job, but I don't know that a cover-up of something that juicy could hold up for long. Some missing pieces to this story.
The summary (at least) is a bit off on the description. dtmos is close, but still no cigar. The large flare produced three outputs of concern: high levels of x-ray radiation (photons), high levels of high-energy protons, and the CME which is a blast of low-energy (for the sun, anyway) plasma into the solar wind. The x-rays arrive first (at the speed of light, natch) with the high-energy protons not far behind. The shock wave produced by the CME arrives several days later. All of these can cause problems with spacecraft in near-earth orbit, but I suspect that the concern here was those high-energy protons which can damage electronics (and people - this is radiation in the bad old sense). The flux level of those beasts is dropping, but I'm sure NASA is concerned about this sunspot region producing another large flare with another hit from high-energy protons.
The CME, if the shock it produced in the solar wind indeed reaches the earth (this one likely will), can alter the fluxes of high-energy electrons in the earth's radiation belts and will generate active auroral conditions. The ISS is well below the radiation belts, so the CME-produced effects near the earth probably aren't as much of a concern here.
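The arrival-time ordering described above (photons first, then energetic protons, then the CME shock days later) can be sketched with a back-of-the-envelope calculation. The speeds below are illustrative assumptions for a fast event, not measurements from this particular flare:

```python
# Rough arrival times at Earth (1 AU) for the three flare outputs.
# Assumed speeds: photons at c, energetic protons at ~0.3c, CME at ~1000 km/s.
AU_KM = 1.496e8   # 1 astronomical unit, km
C_KM_S = 3.0e5    # speed of light, km/s

def hours_to_earth(speed_km_s):
    """Travel time from the sun to Earth in hours at a constant speed."""
    return AU_KM / speed_km_s / 3600.0

xray_hours = hours_to_earth(C_KM_S)           # X-rays: ~8 minutes
proton_hours = hours_to_earth(0.3 * C_KM_S)   # assumed ~0.3c protons: ~28 minutes
cme_days = hours_to_earth(1000.0) / 24.0      # assumed 1000 km/s CME: ~1.7 days

print(f"X-rays:  {xray_hours * 60:.0f} min")
print(f"Protons: {proton_hours * 60:.0f} min")
print(f"CME:     {cme_days:.1f} days")
```

Even with generous error bars on the proton and CME speeds, the ordering holds: the radiation hazard shows up within minutes to hours, while the geomagnetic effects lag by a day or more.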
Just as there are nightmare PHB managers, there are useless waste-of-space techies. Assuming that we're talking here about apples and apples (competent managers and competent techies), they bring different, and damn-near equally important, skills and assets to the table. I have spent several decades in the science-for-hire business (research in a private company, not university related), and you see the other side of this problem quite often. Highly competent science types take on management roles rather than hire someone with the right skills, because "I am a PhD and can do this silly management stuff with one hand behind my back." Those are the companies that run into problems big-time. Any good science/tech business of any size needs good managers who have been trained for what they do and are competent at it. If both sides recognize the inherent worth of the other (again, assuming competence all around), stay off each other's turf, and call on the other when their expertise is what's needed, that is what makes a company run.
Seattle has a similar situation and has a bike-friendly mayor who's pushed the issue and is likely to lose his upcoming bid for re-election (not solely because of bike issues, but he's known as Mayor McSchwinn and it's one of several things that voters are unhappy about). I lived in Los Angeles in the mid-1970s and bicycled from near Culver City to the UCLA campus in Westwood the entire time. I feel very fortunate to have avoided an accident during that time, and I had many near-misses. US cities are not set up for bicycles, and making them so is an expensive proposition. (OK, so Boulder is an exception, but they have a lot of money to spend on stuff like this.) Yes, bicycling is better for you than sitting on your ass in a car, but spending a lot of scarce tax dollars catering to the biking minority is a very inefficient use of transportation money.
Assuming TFA's numbers are correct, I'd bet that much of the problem is that no agency, be it government or commercial (and particularly commercial), wants to spend its money seeing if published results are reproducible. Additionally, no one ever won a Nobel Prize for excellence in reproducing others' results. Verification of results is key to science, but this is one of several aspects of doing science right that the funding agencies either don't want to, or can't (as in Congress looking over the shoulders of managers at the NSF), pay for. Everyone wants "everything, all the time" without paying for it, and this is the sort of thing that happens when decisions are driven by the money people (who may be scientists, to be fair) and not the people who know what the hell is going on.
Yeah, yeah - code clean, test-test-test, document-document-document, have separate test/run machines that are configured the same, yada yada. This is all well and good, and any halfway-decent developer knows all this. However, software development is not done in a vacuum, and damn near everything mentioned gets run through cost/time-benefit analyses when crunch time comes (which it always does). With some exceptions, when I see a company that's saddled with horrible old legacy code that nobody can understand, a large measure of it is payback (for inadequate funding and poor schedule planning) being the bitch that it is. How to do things the best way is well known; it's just that the best way is more expensive (in the short term, which is the only term business understands these days) and takes more time than the average business will wait. If the bottom line is to get something done that sorta-kinda works as fast and cheap as possible, you get spaghetti code that even the guy/gal who developed it can't follow.
Washington State's exchange website, for which the state paid $54 million to Deloitte, hasn't been a rollicking success either. I'm trying to wrap my head around why it costs $54 million to set up a pretty straightforward website (costs evidently do not include hardware, just people/time/software). I believe that cost was over half what the state received from the feds to set up the exchange. Details here (such as they are).
Is it possible that software is not like anything else, that it is meant to be discarded: that the whole point is to always see it as a soap bubble?