You are wrong. They were not following guidelines, though it is unclear whether the appropriate guidelines were communicated well (i.e. the people handling Duncan were clearly not properly trained).
Blaming the CDC when some Dallas hospital doesn't care enough about its staff to train them properly is stupid. And the CDC has changed policy. Active cases are now being transported to appropriate facilities instead of trusting that random regional hospitals know how to train their staff properly. (Draw your own conclusions about mid-level health care from that.)
And the claim that the administration is worried about political correctness is a complete strawman. They have said quite clearly that the problem with a travel ban or quarantine is that it would make fighting the outbreak harder, not easier. The best chance here is to get the resources into West Africa and stop the outbreak there. Travel bans and quarantines on non-symptomatic people only pointlessly waste resources to make you feel good about your ignorance.
Sorry but you are wrong. Ebola is not transmissible until the patient is symptomatic. So, for example, NOBODY outside the hospital caught ebola from Eric Duncan. It has been more than 21 days since he went in. This is a done deal.
And if we could detect the virus before symptoms set in, then we wouldn't need to monitor for symptoms; we could just test them and be done with it. DUH! Duncan's family in Dallas were "quarantined" because they couldn't be bothered to make themselves available for someone to take their temperature twice a day (talk about sad). And others have been quarantined because the public is freaked out, not for any medical reason. People being monitored shouldn't travel mostly because, if they become symptomatic, they may not be somewhere they can get into quarantine quickly.
While I tend to agree, I think there is some more subtlety. In its original conception, CGI probably did consider the web inputs as essentially session-level data, which would warrant what you refer to as "semi-persistent" storage in the environment. I would say that web programming has evolved some in modern usage, and a transient-data model as you suggest is probably more appropriate.
But there is plenty of blame to go around. Bash, or anything else for that matter, should not interpret otherwise completely unused environment data in such a way that it gets executed. There are plenty of other contexts outside CGI where that is a problem. Environment variables are a well-established way of communicating data from parent to child processes. One that is, sometimes conveniently, agnostic about whether that data is intended for a direct child or for the child of a child. But if a program is performing some function based on the content of *any* environment variable rather than the content of a specific variable or variables, that is likely to cause trouble.
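To make the mechanism concrete, here is a rough sketch (in Python, with made-up function and header names) of how a CGI-style gateway copies client-supplied request headers into environment variables before spawning a handler, per RFC 3875. Any descendant process inherits those variables, which is exactly how attacker-controlled bytes reached bash in the Shellshock case:

```python
import os
import subprocess

def spawn_cgi_handler(headers, script_argv):
    """Hypothetical sketch of a CGI gateway: each request header becomes
    an HTTP_<NAME> environment variable (dashes become underscores),
    then the handler is spawned with that environment."""
    env = dict(os.environ)
    for name, value in headers.items():
        env["HTTP_" + name.upper().replace("-", "_")] = value
    # The child (and any grandchild, e.g. a shell it spawns) inherits env,
    # so client-supplied bytes reach every descendant process verbatim.
    return subprocess.run(script_argv, env=env, capture_output=True, text=True)

# Example: a User-Agent string lands, untouched, in the handler's environment.
result = spawn_cgi_handler(
    {"User-Agent": "() { :;}; echo pwned"},  # the classic Shellshock probe string
    ["python3", "-c", "import os; print(os.environ['HTTP_USER_AGENT'])"],
)
print(result.stdout.strip())  # → () { :;}; echo pwned
```

The point of the sketch: the gateway is doing nothing wrong by CGI's rules, and the handler here just prints the variable. The bug was that a vulnerable bash, on startup, would parse *any* inherited variable whose value started with `() {` as a function definition and execute trailing commands, rather than acting only on specific variables it owned.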
So does that mean a re-analysis of the article on re-analysis leads to different conclusions than the original article?! HA!
But I have the sneaking suspicion that this re-analysis won't be published, which opens a whole other can of worms around selection bias.
Um, you realize that Nature is a magazine, not a journal, right? Yes, they have peer review, but they have a heavy vested interest in publishing exciting-but-possibly-wrong stuff, which they do all the time.
And if results were simply fabricated, peer review can't always catch that, as others have said. Though sometimes it is obvious: someone is suddenly able to do something that others have tried and failed to do, yet can't show WHY it worked for them and not for anyone else. Sometimes quality professional journals, especially in the experimental sciences, will have higher peer review standards in that direction than a headline-oriented magazine like Nature.