However, there is an earlier survey that found that 96% of highly published climate scientists agree that a significant amount of global warming is caused by mankind.
Validation comes from evidence, not consensus.
BUT consensus also comes from evidence.
If you are an expert in a field, which means that you have personal experience doing research in that area, and have been following the literature for several years, you base your judgment on the evidence.
But if you are not an expert in the field, you do not have the background knowledge to correctly evaluate the evidence.
In this case, your best proxy for the evidence is the consensus of the people who are experts.
"But the consensus can be wrong!" you reply.
Yes, it can. But as the saying goes, "The race is not always to the swift, nor the battle to the strong—but that's the way to bet."
The reason that people who successfully challenge the consensus become famous is that it is so rare. Everybody always cites the same handful of examples: Galileo, plate tectonics, germ theory, relativity. They do so because there really aren't that many clear examples of a broad scientific consensus being completely overturned.
The consensus is not infallible, but most of the time, it turns out to be right, or very nearly so. This is particularly true in a mature field like climate science, where the consensus is the result of the work of many scientists over many decades.
There are plenty of non-retina LCD displays that have short-term image persistence. In my experience most LCD displays do this to some degree, although some seem to be better than others in this respect. Apple's claims about their screens concern high resolution and good color--I don't think they've made any particular claims about image persistence. Apple has made it pretty clear what they mean by a "retina" display: one that, at a normal viewing distance, has a pixel density greater than what the retina can resolve.
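For the curious, here is a rough back-of-the-envelope sketch of that definition (my own illustrative numbers, not Apple's: it assumes the eye resolves roughly one arcminute of visual angle and a phone-like viewing distance of about 12 inches):

```python
import math

def retina_ppi_threshold(viewing_distance_in, eye_resolution_arcmin=1.0):
    """Pixel density (ppi) above which individual pixels can no longer be
    resolved at the given viewing distance, assuming the eye resolves
    roughly one arcminute of visual angle."""
    # Width of one just-resolvable spot at this distance, in inches
    spot_in = viewing_distance_in * math.tan(math.radians(eye_resolution_arcmin / 60.0))
    return 1.0 / spot_in

# A phone held about 12 inches from the eye
print(round(retina_ppi_threshold(12)))   # roughly 286 ppi
```

Anything much denser than that figure is, by this definition, "retina" at that viewing distance; a display viewed from farther away qualifies at a lower pixel density.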
Many LCD panels exhibit short-term image persistence that you can see when you switch from a display with very bright objects to one that is dark. This is different from permanent "burn-in." One might prefer a display that shows less or shorter persistence, but unless Apple made some sort of claim about low image persistence, I don't think that it can reasonably be regarded as a product flaw.
Class action suits over consumer electronics are basically a scam that benefits nobody but lawyers. The lawyer offers a lowball settlement that is cheaper than the cost of going to court even if the company wins, so the company invariably settles. The consumer participants of the suit get a pittance that is not even worth the value of the time they spent filling out the paperwork. And the lawyer gets a little piece of each of those tiny settlements, which adds up to a nice payday for hardly any work.
It doesn't seem like citizen ownership of small arms has been all that much of a factor in helping insurgencies to resist soldiers with body armor and military-grade weapons. IEDs and captured/smuggled military-grade weaponry like RPGs and Stinger missiles seem to have been far more of a problem for the military than citizens taking pot shots with small arms.
It may be that the problem is not with the Wii U but with consoles in general. The availability of smartphone and tablet games may have changed the gaming landscape permanently. Game enthusiasts may turn up their noses and point out that mobile device games lack the diversity of controls and the graphics quality of consoles, and that most of them are simplistic compared to console games--which is true--but it is unclear just how much that means in terms of the market for consoles in the $200+ price range running games in the $50+ price range.
For many casual gamers, mobile games are good enough, considering that they run on a device that you already own, so you don't have to buy dedicated hardware, and the games are about the price of a candy bar. Each game may not have the depth of a console game, but if you get tired of one, there are plenty more. And at the other end, the true gaming fanatics play on computers, not consoles.
It may be that consoles will end up being neither here nor there, appealing to a diminishing market of moderately enthusiastic gamers that is too small to yield the profits required to justify the sort of massive development effort needed to create games with the graphics and play sophistication that console gamers expect.
Of course, Nintendo's systems have never been hard core--they've always been more oriented toward families, so perhaps Nintendo is more vulnerable to competition from mobile device gaming. On the other hand, so far there doesn't seem to be a huge degree of anticipation of new gaming systems from Sony and Microsoft, and both companies seem to be having difficulty articulating just what their new-generation systems will offer to convince consumers to shell out hundreds of dollars in up-front costs for new consoles.
I skimmed over the original paper. It presents an interesting hypothesis, but the evidence is correlational, the analysis is complicated and indirect, and the relationship they found is not simple (not that bell-shaped curves can't occur, but they offer a lot more freedom in fitting data than monotonic relationships). If anybody actually is basing policy recommendations on it, I'd question their motives. But the attack on it seems a bit over the top, and I get the impression that the authors of the attack don't even want these sorts of ideas discussed, so I'm suspicious of their motives as well.
In any case, it seems like a very minor tempest in a teapot over a very tentative hypothesis based on weak evidence. I don't see what it has to do with the "soul of science."
The graph you link to shows the same thing as Krugman's (a flat line on a semilog plot is still flat on a linear plot): a jump in 2009 due to the impact of the depression on automatic safety net programs such as unemployment and food stamps (which continues, since employment has not recovered), and then nearly flat thereafter. Here's another one of Krugman's FRED plots (on a linear scale, if it makes you happier) showing federal spending as a fraction of potential GDP.
Are you seriously rolling out the much-debunked myth that there was a scientific consensus in the 1970s that we were heading into an ice age? I was reading the scientific literature back then, and I can tell you that that is simply nonsense. This notion seems to date mostly from a sensationalistic article in Time magazine based on the views of a fringe scientist. All of that literature can be found in any major university library, and much of it is available online, so you can check for yourself. Even in the 1970s, scientists knew that there was the potential for CO2 from fossil fuels to cause warming. If you aren't industrious enough to read the literature for yourself, others have done it for you.
The notion that climate is "complete chaos" is also wrong. Weather is chaotic over the short term, but over the long term there are indeed rules: climate is determined by the overall solar energy balance of the globe, in which CO2 plays a major role. In fact, it is impossible to explain why the Earth (or Mars, or Venus) is as warm as it is unless you accept the warming effect of CO2, and once you accept that, global warming in response to fossil fuel releases of CO2 follows inexorably.
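To make the energy-balance point concrete, here is a minimal zero-dimensional sketch (my own round numbers for the solar constant and albedo, not figures from any particular source) showing that an Earth without greenhouse gases would be far colder than the Earth we observe:

```python
# Rough energy balance: the sunlight Earth absorbs must equal the thermal
# radiation it emits back to space.
SIGMA  = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S      = 1361.0    # solar constant, W/m^2 (approximate)
ALBEDO = 0.30      # fraction of sunlight reflected back to space (approximate)

# Absorbed: S*(1 - albedo)*pi*R^2, emitted: 4*pi*R^2*sigma*T^4
# => T = (S*(1 - albedo) / (4*sigma)) ** 0.25
T_no_greenhouse = (S * (1 - ALBEDO) / (4 * SIGMA)) ** 0.25
print(round(T_no_greenhouse))   # about 255 K, i.e. roughly -18 C

# The observed global mean surface temperature is about 288 K (~15 C).
# The ~33 K gap is the greenhouse effect of water vapor, CO2, and other gases.
```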
Actually, there have been numerous studies finding that minimum wage increases produce negligible or zero increase in unemployment. The Wikipedia entry provides a balanced discussion of the economic theory and data.
It doesn't prove anything except that on an increasing trend line the highest values are the more recent ones....
Yes, but many Republican politicians have denied the reality of that trend. The year-to-year increasing trend in average temperature is small compared to the random fluctuations of weather, so it is possible to convince people who are unfamiliar with the actual science that it is some kind of "liberal scientist" fabrication. But people do notice extreme weather events. So while the strongest evidence of the upward trend is in the measurements of day-to-day temperatures, the increase in weather disasters is something that the average person can perceive and relate to.
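To illustrate that signal-versus-noise point with made-up numbers (an assumed trend of 0.02 C per year buried in 0.15 C of year-to-year noise; nothing here is real data):

```python
import random

random.seed(0)
TREND_PER_YEAR = 0.02   # assumed warming trend, degrees C per year (illustrative)
NOISE_SD       = 0.15   # assumed year-to-year weather noise, degrees C (illustrative)

temps = [14.0 + TREND_PER_YEAR * yr + random.gauss(0, NOISE_SD) for yr in range(50)]

# Year-to-year changes are dominated by the noise...
diffs = [b - a for a, b in zip(temps, temps[1:])]
print(sum(1 for d in diffs if d < 0), "of", len(diffs), "years were cooler than the one before")

# ...but the long-term trend still shows through.
print(round(temps[-1] - temps[0], 2), "degrees C net change over 50 years")
```

Roughly half the individual years come out cooler than the previous one, which is exactly what makes the trend easy to deny from any short window, even though the 50-year change is unmistakable.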
Much of the deficit increase in 2009 was due to existing "safety net" programs such as food stamps and unemployment insurance that kicked in in response to the depression, which was already underway when Obama took office. The rest was due to the financial bailout, in which Obama followed through on the bailout devised under the Bush administration. Obama brought an end to the growth in Federal spending.
The anonymous ship sailed a long time ago for pretty much anybody who has ever done anything public under their own name. I could be easily googled well before FB came along. That doesn't particularly bother me; I don't have any mortal enemies that I'm hiding from, and I'd like any old friends to be able to find me if they want to do so. The rules haven't changed: if you really want to be private for some reason, don't do anything public (and anything on the internet is public) under your real name--for that matter, you might want to consider changing your name to something generic with 100,000 Google hits that aren't you.
So I have a couple of headless, keyboard-less Macs that I access remotely (often from my iPhone or iPad). They run the same software and do the same things as my desktop Mac with a keyboard, and they most certainly are personal. Are they really not "personal computers"?