
Comment Re:I save money! (Score 1) 439

Sure, but the public argument is that you are a dastardly fiend for not doing argon-filled glass and a Prius anyway, damn the cost, because the real cost of not doing so is the end of the world when the sky falls and the seas rise.

It's really a strange modern variant of Pascal's Wager: by making the cost of the "catastrophe" associated with AGW sufficiently great, they create an unlimited bias in the expectation value of outcomes. Even if "catastrophic" global warming is a 1% chance, by making it a million times more expensive (if it happens) than doing nothing, they can make it a 10,000 to one bet that mitigation is the economical choice, and so justify any investment or political strategy to accomplish it. Pascal pointed out the same thing regarding the "bet" that God exists: moderate costs if you bet that God exists and are wrong, but infinite punishments (costs) if you bet God doesn't exist and are wrong. (The benefits are equally skewed, forgotten in both cases: "none" for atheist and theist alike if there is no God, "infinitely good" if there is a God and you pick the right God and get into "heaven" for eternity.)
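The expectation-value arithmetic behind that wager fits in a few lines; a minimal sketch, using the illustrative round figures above rather than real estimates:

```python
# Expected-value arithmetic for the Pascal's Wager analogy above.
# All numbers are the illustrative round figures from the text, not estimates.

p_catastrophe = 0.01            # a 1% chance of the catastrophic outcome
cost_do_nothing = 1.0           # baseline cost of doing nothing (no catastrophe)
cost_catastrophe = 1_000_000.0  # assert catastrophe costs a million times more

# Expected cost of doing nothing, in units of the baseline cost:
expected_cost = (1 - p_catastrophe) * cost_do_nothing \
              + p_catastrophe * cost_catastrophe

# Any mitigation cheaper than ~10,000x the baseline now looks "economical",
# which is exactly the unlimited bias described above.
print(expected_cost)
```

Inflate cost_catastrophe further and the break-even mitigation budget grows without bound, which is the structural point of the analogy.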

Again, otherwise I mostly agree with your reply. Personally, I've got R30 in the attic, low-E double-paned windows on the house, and three uber-efficient furnace/ACs doing the three floors of my house, plus a tankless water heater. My energy expenses are indeed about a third of what they were (or would have been, extrapolating usage and prices) and yes, the amortization schedule is long enough that each of the investments will not really pay for the cost of the money in less than 15 to 20 years. That's sort of "break even". A 10 year amortization would be much better. Of the changes, I personally love the windows the most. Good windows matter: one can e.g. clean them from the inside, they lock securely, they have little flip-thingies so that you can leave them open without letting anyone raise them fully to get in, and they are very, very quiet compared to the old crappy windows we had. And I don't even "believe" in CAGW, and suspect AGW is minimal against the background GW of completely natural origin associated with coming out of the LIA.

As you say, many of the things done won't "sacrifice our economy", and the reason I introduced the extreme version of the argument is to draw attention to the parallel to Pascal's Wager, where a sufficiently large negative payoff justifies any strategy that might avoid it. Similar things apply to the risk of kilometer-scale asteroids falling, global pandemics occurring, nuclear war occurring, even terrorism -- make the negative payoff high enough even at low risk, amplify the public perception of risk, and people will cheerfully give up their civil liberties and endure enormous expense and inconvenience to mitigate what is really a tiny risk. We forever lock the barn door once the horse has flown but ignore the open window through which our chickens are about to fly.

I personally would like to see the extreme edges taken off the entire debate. Don't predict meter-high sea level rise (and hence unimaginable "disaster", with unbounded costs) while the actual measured rate of sea level rise has been 9 whole inches since 1870 and is currently around 3 mm a year, or less than 9 inches more by the end of the century (assuming one cherrypicks the least favorable interval to use to compute the rise -- a fairer estimate would be a constant extrapolation of the post 1870 rate). Don't predict the melting of the ice caps (again, disaster) with Antarctic ice on the rise and just about matching the decrease in Arctic ice -- say instead that we don't really understand what and how polar ice is modulated. Don't predict horrible storms and droughts (disaster, worth any amount of money to mitigate) when there is zero evidence of increased drought, zero evidence of increased frequency or severity of storms. All of these things are predicted, but they are not actually happening. There may well be some things that are happening, but they are less catastrophic and hence don't justify the enormous expenses and measures proposed to mitigate them.
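A back-of-the-envelope version of those two extrapolations (the start and end years are my own assumptions, chosen for a post written around 2013):

```python
# Two straight-line extrapolations of sea level rise to 2100, using the
# figures quoted above: 9 inches of rise since 1870, and ~3 mm/year currently.
MM_PER_INCH = 25.4
years_left = 2100 - 2013  # assumed "now"

# Constant extrapolation of the average post-1870 rate (~1.6 mm/year):
rate_1870 = 9 * MM_PER_INCH / (2013 - 1870)
rise_from_1870_rate = rate_1870 * years_left / MM_PER_INCH

# Extrapolation of the current ~3 mm/year rate:
rise_from_current_rate = 3 * years_left / MM_PER_INCH

# Roughly 5.5 vs 10 further inches by 2100; neither is a meter.
print(round(rise_from_1870_rate, 1), round(rise_from_current_rate, 1))
```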

IMO sensible things to do about this unknown "problem" are all ones that are beneficial whether or not the problem exists. They include continuing to support aggressive research into energy, especially solar and nuclear (both fusion and alternative fission). I'm up in the air about wind -- there are large problems with wind generation and it has the feel of a technology that is not yet mature being implemented with enormous subsidy, at a loss. I think carbon trading is silly beyond measure and should be abandoned instantly, and the world seems to largely agree as it is being abandoned. I think that it is a mistake to demonize electricity produced from coal and to artificially inflate its price; at the same time I don't think it is good to actively subsidize it. At some point things like PV solar will simply overtake coal as cheaper ways of making electricity, and the problem (if any) will then solve itself, long before the end of the 21st century. If anything, I was annoyed with Obama because he didn't invest enough in developing PV solar, not because he did too much. The problem isn't with the technology, it is guaranteeing the markets. PV solar is already break even to win in many markets, and in a decade it will be an increasing win in almost all markets, taking over a steadily growing share of total energy production worldwide. A breakthrough in battery technology could make the market explode overnight at current costs, as conversion and storage are the two remaining major barriers to existing PV (which already costs ~$1/watt full retail for large projects, more for consumer projects, just outside of a sensible amortization for me personally).

Drop the cost of solar far enough and it won't even be necessary to set a government policy or take public action to regulate carbon. Why pay money for fuel when sunshine is free? It's the amortized cost of collecting and storing it that is the problem, and that is strictly a technological and engineering problem, precisely the kind of thing humans excel at solving, when they really want to.

So in the end, it isn't whether GW is happening -- of course it is, look at the thermometric record post 1800. It isn't whether or not it is partly anthropogenic, especially when we do not know precisely what fraction of the observed warming is anthropogenic -- almost none of the warming observed before 1950, and post 1950, separating the AGW warming "signal" from the natural GW "signal" against the substantial natural "noise" clearly visible across the thermal history is "difficult" to say the least, and easily biased to agree with prior beliefs. The question is whether GW, A or not, will be C. Only CAGW justifies the "at any cost" mitigation that delivers both political and economic power into the hands of those who -- curiously -- are both the most vocal proponents of CAGW and the most visible beneficiaries of the money and power used to mitigate it.

rgb

Comment Re:Apples and Oranges... (Score 1) 608

Well, this isn't quite true. If you eat (say) an ounce of high-grade, THC-rich bud mixed into a bowl or two of chili you will learn that a) THC really is a hallucinogen (who knew?); and b) while you will not be dead (no known fatal dose), you will be pretty much incapable of moving or doing anything complex, inclined to doze off, and quite capable of getting lost walking a few blocks down the street to get home. Been there, done that. The same is true to a lesser extent doing bong hits or smoking joints -- marijuana has a dazzling array of psychoactive compounds with quite different effects and concentrations. The one thing about doing bong hits or smoking joints is that it tends to be self-limiting as one reaches the point of physical impairment because it is incremental, and the time of maximum impairment is relatively short.

So the safest thing to say is that somebody who isn't trying to get maximally stoned and who isn't smoking or eating weed at a rate or in a way that boosts THC up to the levels that seriously mess with your cognitive abilities is likely to self-regulate their high to levels that leave them quite functional and capable of driving with only a slightly elevated risk of accident, equivalent to drinking a beer or two, no more. This is, in fact, the rule far more than the exception.

However, one certainly can get truly wasted on marijuana, hashish, pure bud, and even get differently wasted on different variants of weed that have been bred for different concentrations of different cannabinoids.

You, and others reading this thread, might want to take a quick detour through:

http://en.wikipedia.org/wiki/Cannabinoid

that works through some of the psychopharmacology of pot. Sound bite facts: The human brain has more receptors that respond to cannabinoids than to the components of any other drug or substance. There are many, many such cannabinoids. The brain's reaction to different cannabinoids in different strains of pot is therefore quite different depending on the particular mix and the overall concentrations. For example (from the article), Cannabis sativa strains are known for their "cerebral high" with relatively little body involvement and leave one reasonably functionally unimpaired, at the cost of introducing a certain amount of anxiety/paranoia. Cannabis indica, OTOH, is quite sedative and not a good idea for people who have to function while high. These factors are used in the medical marijuana business to literally prescribe different strains of pot for daytime and nighttime use, and are also a major factor in the genetic crossbreeding carried out by growers legal and illegal across the country.

A big dose of high-potency Indica strains is not the best thing to take onboard right before you plan to drive through rush hour.

rgb

Comment Re:Actual Detection of Impared Drivers (Score 2) 608

Ah, silly beanie, video game tests won't work, not for somebody who has "practiced" playing video games high (which would very likely be everybody under the age of 30 and a lot of people older than that).

Back in the days when I used to get high daily I also used to play pinball and ping pong and other games involving nearly instantaneous reflexes in order to succeed. I was truly excellent at both, high. I played the best evening of ping pong in my life high one night, with a friend who was also high. We were literally smashing the ball back and forth for volleys of twenty or more exchanges at top speed before somebody would miss, looking for a moment like ping pong Olympians. Pinball ditto -- marijuana often increases your ability to concentrate, and does not interfere with your reflexes in anything like the way alcohol (a depressant that will eventually render you incapable of coordinated movement or thought) does.

As I remarked above, "doubling the risk of a fatal crash" is multiplying a number near zero by two, and roughly matches the increase in risk of drinking a single beer. Marijuana is enormously variable, and so the best thing to do is deal with visible impairment and not worry about chemical tests at all.

rgb

Comment Re:Field Sobriety Test (Score 5, Interesting) 608

Bear in mind also that the normal risk of fatal crashes is low, so doubling it is doubling a number very near zero as it is.

Contrast that with alcohol (quote from a 1991 NIH article):

"Based on driver fatalities in single-vehicle crashes, it was estimated that each 0.02 percentage increase in the BAC of a driver with non-zero BAC nearly doubles the risk of being in a fatal crash."

That is probably not quite a beer's worth of alcohol for most body weights. So to put it another way, somebody who smokes pot while driving -- not "before", but during (a thing that in my youth I did with remarkable frequency) -- is roughly as impaired as if they had just consumed a single beer. At those levels one does have to wonder about the error bars in the study -- statistically resolving one near-zero from another near-zero is actually remarkably difficult and requires ever so many samples and a totally unbiased sampling scheme with a complete lack of confounding variables -- so your assertion that the actual risk might even go down in those that aren't smoking pot and drinking a beer (where the latter is also difficult to detect and also doubles your risk all by itself) is not without possible merit.

Again from the article here: http://www.ncbi.nlm.nih.gov/pubmed/1875701:

"At BACs in the 0.05-0.09 percent range, the likelihood of a crash was at least nine times greater than at zero BAC for all age groups. Younger drivers with BACs in the 0.05-0.09 range had higher relative risks than older drivers, and females had higher relative risks than males. At very high BACs (at or above 0.15 percent), the risk of crashing was 300 to 600 times the risk at zero or near-zero BACs."

Note that at BACs that are still in the legal range in most states, single car fatalities are nearly an order of magnitude greater than the single "doubling" of risk for immediate use of marijuana. That strongly suggests that the best thing to do about "impairment" from marijuana is -- ignore it, or as suggested above, use a field sobriety test, not a blood or saliva test. It is more or less irrelevant to driving skill. I would say (again, based on extensive experience) that this is not entirely true -- one can eat or smoke enough, potent enough, marijuana that driving is ill-advised, but in those cases field sobriety tests would be nearly impossible to pass as well. But it is actually somewhat difficult to get that stoned, and most pot smokers that I knew didn't want to drive when they were -- too scary.
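The doubling rule in the quoted study can be turned into a toy relative-risk model; this is my own extrapolation of the quote, not anything the study itself publishes:

```python
# Toy model: each 0.02 BAC doubles the relative risk of a fatal
# single-vehicle crash, per the NIH quote above.
def relative_risk(bac: float) -> float:
    return 2.0 ** (bac / 0.02)

# The 0.05-0.09 range comes out at roughly 6x to 23x baseline, bracketing
# the "at least nine times" figure from the second quote:
print(relative_risk(0.05), relative_risk(0.09))

# At 0.15 simple doubling gives only ~180x, versus the reported 300-600x,
# so risk evidently climbs even faster than doubling at high BAC.
print(relative_risk(0.15))
```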

But the simplest proof is this. Whether or not it is legal, smoking pot and driving has been nearly universal forever among those that smoke pot. Most states are utterly unable to test for it, yet estimates of prevalence of usage (almost certainly low) suggest that anywhere up to 1/3 or 1/2 of people in certain age ranges at least occasionally smoke. Yet there is no positive association with this same group being a high risk on the road, outside of its tendency to drink. Alcohol is indeed a dangerous substance when it comes to driving, for obvious reasons, even for relatively small amounts. Pot is not, not until consumption is at extreme levels.

The last thing that confounds this is age. The distribution of fatal and non-fatal accidents with age is quite scary. A stoned 40 year old -- I mean a seriously wasted 40 year old stoner -- with a risk of accident 3 times his age-linked norm -- is a safer driver than a stone cold sober 19 year old. "Silverbacks" -- drivers on the high side of 75, where one's eyesight, hearing, and brain are all breaking down -- are safer still. Why? Because they drive (sober or not) carefully, and in particular far more conservatively than younger risk taking overconfident drivers. I'm living through my own sons' driving experience -- one at age 17 has his first car, now multiply scarred from driving it a whole month. One now 22, who at 18 took his eyes off of the road for a second on a curve stone cold sober and totalled the car he was driving (and fortunately walked away with nothing more than a couple of scratches). The one who hasn't done any serious damage to a car is the one who is now 25 but who waited until he was 18 to get his license at all and who is by his nature someone who doubts their own abilities and hence drives just a bit scared, even more so now that he's a parent and has given up drinking altogether. It is this youth-related rashness that is by far the greatest risk factor on the road -- greater than alcohol (and linked to alcohol!).

The sane thing for states to do would be to lower the drinking age to 16, legalize pot and encourage its use instead of alcohol among the younger set (ideally in metered, legally regulated concentrations since marijuana is wildly variable in THC type and content), and raise the driving age to 19. Before age 19, issue only provisional driver's licenses -- daylight hours (6 am to 6 pm) plus driving directly to or from school or work, otherwise mandating an adult licensed driver such as a parent in the car, with heavy penalties (a year with no driving privileges at all) for driving with a nonzero BAC. Raising the driving age over 20 would be even better -- 20 is something of a cusp for driving risk, the point where it starts to come down. As it is, it is far more dangerous to be 17, male, sober and straight, driving a car than it is to be a stoned 60 year old who has had a couple of beers, especially if there are other 17 year old males in the car to show off for.

rgb

Comment Re:Easy (Score 5, Insightful) 608

A mere 80 or so years too late, of course. But better late than never.

Now if any state had the testicular fortitude to challenge them over their utterly unconstitutional use of the threat of withholding federal highway funds from states that failed to raise the drinking age to 21, we might see a restoration of sanity in that direction as well. Otherwise we might as well just ditch the constitution and abolish state and local government and get it all over with.

But getting the US government out of the marijuana game as the first step to getting it largely out of the drug game altogether might be good first steps to dismantling the current police state, and in the process saving perhaps 100 billion dollars (in all costs) nationwide. Maybe more -- drugs are roughly a half-trillion dollar business globally, and laundering drug money is a major mainstay of our banking system and creates a veritable shadow government with a steady stream of untaxed, illegal income that produces compounded wealth and disproportionate power for those that are involved.

It also opens up the states that legalize it to entirely new (taxable, now legitimate) industries -- not just recreational pot but an entire spectrum of hemp-derived products that are difficult to impossible to produce at this time. The hemp plant was enormously useful before it was made illegal, and to some extent was made illegal because it was so useful. I wish NC would follow in CO and WA's footsteps, because hemp would make an ideal cash crop to replace tobacco (the real "killer drug" of the US).

rgb

Comment Re:I save money! (Score 1) 439

Goodness, do you know, I put exactly the same curve on top of the UAH data and it looks remarkably different. Although again how well or poorly it does depends a lot on how you cherrypick the insertion point, no doubt. Bottom line is the UAH 33 year anomaly is a whopping 0.33 C -- which works out remarkably accurately to 0.1 C/decade, 1 degree C per century, with a lot of that increase coming from a single discrete event -- the El Nino warming of 1997-1998. It was nearly flat before (within noise). It has been nearly flat afterwards (within noise).
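That step-plus-flat structure is easy to demonstrate with a toy fit. The series below is synthetic (my own construction, not the UAH record): dead flat except for a single 0.3 C step at the 1998 position, yet the whole-record fit still reports a "warming trend":

```python
# Ordinary least-squares trend in C/decade, the way per-decade
# trend figures are computed from an anomaly series.
def trend_per_decade(years, anomalies):
    n = len(years)
    my, ma = sum(years) / n, sum(anomalies) / n
    slope = sum((y - my) * (a - ma) for y, a in zip(years, anomalies)) \
          / sum((y - my) ** 2 for y in years)
    return slope * 10

# Synthetic 33-year record: flat, except one 0.3 C step at 1998.
years = list(range(1979, 2012))
anoms = [0.3 if y >= 1998 else 0.0 for y in years]

print(round(trend_per_decade(years, anoms), 2))            # whole record: ~0.13
print(round(trend_per_decade(years[:19], anoms[:19]), 2))  # pre-step: ~0.0
print(round(trend_per_decade(years[19:], anoms[19:]), 2))  # post-step: ~0.0
```

Both segments fit flat, but the single step is enough to give the full record a trend of roughly 0.13 C/decade.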

Could UAH suddenly go straight on back up? Sure. Has it? No. Could it go down? Absolutely (it has gone down into negative territory several times in the last couple of years, and there are reasons to think that the world is in a weakly cooling phase now that the PDO has reversed). The point again isn't that AGW isn't a possibility, or that CAGW isn't a possibility -- it is that it is by no means a certainty. The George Mason survey of members of the AGU and AMS found that it isn't even true that all climate scientists think that AGW, let alone CAGW, is a certainty, and the majority believe in AGW but not the C.

So when a top article cites warming consistent with the C when temperatures have been basically flat for 15 years and rising at only 0.1C per decade over the last 30 -- during which the bulk of the supposedly "anthropogenic" part of the post LIA warming occurred -- well, believe it if you want to, but personally I want something like believable data to support the assertion. In the meantime, I remain politely open minded.

rgb

Comment Re:I save money! (Score 1) 439

I agree with pretty much everything you say. Note well: I was being a bit sarcastic (in context) about things like the Ordovician-Silurian and last glaciation because I was replying to somebody who picked a particular interval that showed extreme warming "at the 95% confidence level" which is and remains bullshit. The 15 year interval isn't quite cherrypicked -- it was stated -- some time ago, and by proponents of CAGW -- that it is the outer limit of an interval where natural variation could prevent the inevitable progression of AGW on a catastrophic scale. The point being, that since we are at that limit, it is increasingly unlikely (the longer the planet stubbornly resists warming to suit the theory) that the theories that predict extreme warming (not some warming, but extreme warming) are correct.

Their response, of course, was to engage in ad hominem attacks and doubt my honesty, which is typical of "accepters" (if one might introduce the moral opposite of "deniers") who don't want to consider any of the negative evidence for CAGW any more than the deniers want to accept or acknowledge positive evidence. Sadly, one could play logical fallacy bingo all night and all day on the "accepter" or "denier" blogs.

As for this:

So you think changes are a net cost? Making our energy usage more efficient is more often a net gain. Worth doing regardless of whether there is global warming. Of course you can spend hugely on things such as expensive materials that are lightweight, but there's no need, not when there is so much low hanging fruit we're ignoring. We blow a lot of money on peacock style displays. People buy large vehicles and houses for the sake of appearances. Surely we can find some other way to show off that doesn't risk the climate. There's even better stuff than that. How about smarter traffic lights? Or do you enjoy idling at a red light while no traffic is present on the cross street?

Well said, sir, and I absolutely agree. But then one also has to consider carbon trading -- something that has no effect but to transfer money selectively into pockets that do nothing to earn it at the expense of nearly everybody in a way that even the proponents acknowledge will have no noticeable effect on CO_2 levels by the end of the century. One also has to do honest appraisals of the "catastrophic" costs that avoiding carbon based fuel supplies that are relatively cheap, plentiful, and reliable incur in the poorest countries in the world, where one of the primary things limiting the rise in the standard of living is energy availability. Not that this is easy when people are generating bullshit numbers like "400,000 people dying each year" due to (presumably anthropogenic) "climate change" and throwing them out into the public discourse, or asserting that the sea level is rising at alarming and unheard of levels, when there is no actual data to support them and a great deal to contradict them.

Increasing the efficiency of our usage of energy is just peachy and as you say often makes economic sense as well, although a lot of times it does not because of things like marginal cost and amortization. Is it better to use your gas guzzler (already paid for) for another five or ten years, or buy a Prius at a price so high that even with the savings in gas you spend 50% more actual money? Spend $10,000 on a high efficiency AC/furnace (something I did last year), or eke another few years out of a much lower efficiency unit I already owned, when the amortized savings from the $10,000 investment will not be recovered in my lifetime and MAY not be recovered in the expected lifetime of the improved hardware? Buy a $30 LED 60W-equivalent light bulb (containing e.g. arsenic almost certain to make its way into the environment eventually), a $4 CF 60W-equivalent light bulb (containing mercury that is almost certain to make its way into the environment eventually), or a $1 incandescent light bulb that you could grind up and eat safely, if you could chew glass and tungsten? Those prices also reflect a significant difference in the energy cost of manufacture, and very likely a serious differential burden in the toxicity of manufacture.
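The amortization arithmetic can be made concrete with a discounted-payback sketch; every dollar figure and the rate below are illustrative placeholders, not my actual bills:

```python
# Years until an efficiency upgrade's discounted energy savings cover its
# upfront cost. Returns None if it never pays back within the horizon.
def payback_years(cost, annual_savings, rate, horizon=50):
    total = 0.0
    for year in range(1, horizon + 1):
        total += annual_savings / (1 + rate) ** year
        if total >= cost:
            return year
    return None

# A $10,000 furnace/AC saving $500/year, with money costing 4%:
print(payback_years(10_000, 500, 0.04))  # pays back only after ~42 years

# The same unit with free money (0% rate) still takes 20 years:
print(payback_years(10_000, 500, 0.0))
```

Once the cost of money exceeds the savings rate entirely (here, anything saving less than $400/year), the upgrade never pays back at all, which is why the marginal-cost question matters.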

So this isn't always obvious either, and we usually don't even have all of the data needed to make optimal decisions, only the data somebody wants to use to sell it to us so they can make money.

It is the latter mindset that trumpets Sandy as "proof" of climate change and worse storms when it is nothing of the kind either way -- numerous studies have shown that there is no discernible difference in the frequency or violence of storms (and if anything, we are in a confounding interval, continuing the longest stretch in recorded weather history without a major Category 3 or higher hurricane making landfall in the US, where Sandy wasn't even a Category 1 hurricane as it came ashore, it was a large extratropical storm). That doesn't mean that it wasn't destructive, only that a lot of its destruction was due to an event that was inevitable sooner or later happening to people who built as if it could never happen. No worries, people do that all of the time down south in the real hurricane alley, and sooner or later their houses get trashed when the inevitable serious tropical storm happens.

And it is always such a "surprise" when it does.

So yes, absolutely, sometimes people buy big cars to show off. Sometimes they buy big cars because big cars are way more useful than small cars. Sometimes people buy big houses to display their wealth. Sometimes they buy big houses because they are more comfortable and there is room for them to do more things. People often buy colorful clothes when they could just as well get by with drab, plain ones. In fact, we can imagine cutting civilization itself back to where all one gets in life is some minimal set of undyed clothing, a cot, and a monotonous but healthy diet of vegetables and starches while never travelling farther than the nearest field to work in to support all of this. Most of us would interpret that as being pre-historic poverty, not in contrast with showing off, but because life is more fun and better with colorful clothing, good food including meats and delicacies, easy and rapid transportation to wherever we wish, a nice bit of land with a good sized house on it, and all of the other trappings of wealth and civilization. Life sucks in third world countries where it is often horribly close to the former uncivilized description.

Solving the climate problem -- to the extent, still largely unproven or at least highly uncertain, that there is a climate "problem" to be solved -- without destroying the civilization that the climate problem supposedly threatens, that's the hard part. There are a lot of very reasonable measures that we can take that aren't too expensive and often have benefits anyway no matter how the bigger issue turns out. Then there are the unreasonable measures, the big, very expensive measures, that are being pushed on the basis of "panic" for a world supposedly "on pace for 4C warming by the end of the century" as the top article asserts. It is no such thing, not even according to the IPCC (who are hardly objective in the matter as it is), so when somebody involved in big money and banking and big investments says that it is -- hold on to your wallet.

rgb

Comment Re:I save money! (Score 2, Interesting) 439

Because starting in the middle of the Roman Warm Period, the Holocene Optimum, or the Medieval Warm Period would be too confusing. Might as well start in 16000 BCE -- that's a good time, d'you think? If we fit a straight line from there, it predicts what, 10 or 15 C of warming over the next 10000 or so years. You tell me what the signal is, and what is the noise, using YOUR favorite cherrypicked interval, or we could look at the entire dataset back to the Ordovician-Silurian transition (an ice age that began when CO_2 was 7000 ppm, almost 1 percent CO_2) and stop worrying so much.

rgb

Comment List... (Score 1) 951

All of them. In particular WoW and Diablo X, but really all of them. Yes, I've run WoW under Cedega, yes I've run Diablo on VMs, yes one can run them under Wine, but I want true native code, not emulated, simulated, virtualized. I'm looking forward to maybe -- just maybe -- seeing this happen over the next 3 years. Games with actual accelerated high density graphics, compiled for and running on Linux. What a concept.

rgb

Comment Re:I save money! (Score 2) 439

Right, and the 33 year trend is just over 0.13 C/decade, and the 15 year trend is flat, and the ten year trend is slightly negative, which pretty much makes nonsense of the 0.04 C/decade "at the 95% confidence level" -- to the extent that fitting any curve you like to a selected (or if you prefer, "cherrypicked") segment of a highly variable dataset makes sense in the first place. The predictive value of any of these fits is diddly, joined by its friend squat.

My point is that there is precisely zero evidence in the form of a fit to actual data with any meaningful confidence level for a 4 C rise over the century. Even the IPCC AR5 isn't going to come close to that -- it is dropping its predictions to ~2.5 C/century, and every year with neutral temperatures will drop it further still. If one compares the actual temperature record to Hansen's early predictions over the last 35 or so years, the temperature curve is coming in slightly below his "no feedback" extrapolation, indicating neutral to perhaps slightly negative net feedback. His strong feedback curve is positively rejected. His intermediate feedback curve -- the one that leads to the 2.5 C/century type of warming -- is very inconsistent with the data, but because the natural variability of the climate is basically not well known, the data leave open the possibility that the current 15 year levelling might return to a strong warming trend at some point.

So it isn't just "sites on the internet" -- one of the most reliable sources of data available over the last 33 years (and arguably one of the only sources that is both truly global and not susceptible to various forms of bias known to corrupt the thermometric record) is absolutely inconsistent with a 4C/century warming trend, which is out at the very-low-confidence limit of the current AR report in progress. So the top article is pretty much alarmist nonsense.

IMO, the most likely century-scale warming we might expect based on the data is between 1 and 2 C. That is entirely consistent with the warming expected from CO_2 only, plus neutral to weak feedback or climate sensitivity. It isn't very likely -- nothing like bullshit "95% confidence" levels -- because we still don't know and cannot predict most of the important natural variation in the climate and do not understand the feedbacks between things like aerosols, ozone, CO_2, cloud formation and the coupling of the climate to things like the phase of the major atmospheric oscillations and oceanic currents or the sun. The climate could indeed warm by 2.5 C, or even 4 or 5 C (unlikely to very unlikely). It could also actually cool some, or warm less than a degree. We cannot even be certain of what the CO_2 levels will be then, regardless of the steps "actively" taken to ameliorate it. If somebody invents "cold fusion" (or hot fusion) at commercially viable efficiency, or the world gets off its thumb and builds e.g. liquid thorium fission plants, it would clearly make a large difference. Whether or not these things happen, in 10 to 20 years PV solar is going to overtake just about everything in terms of cost-efficiency, sooner if somebody invents a really good battery. That too will have a large impact. So even predictions that begin "assuming a doubling of CO_2" are simply adding a Bayesian condition to the probability distribution of final temperatures that is somewhat dubious -- we might well never reach 600, or even 500, ppm before it starts to fall back.

So as the Hitchhiker's Guide says, Don't Panic. And hold onto your wallet while not panicking.

rgb

Comment Re:Hmmm (Score 1) 272

What operating system the computer (because that's what a game console is, a computer) is running is irrelevant. Everything you are describing is operating system agnostic, or nearly so -- you're just talking about the top-level interface, the windowing system as it were. Linux can run anything from a TTY-only interface, to a TTY interface with a single graphical application, to a full-blown windowing system with a general purpose window manager like gnome or kde. At one point, so could Windows (back when it was still DOS inside). Apple's OS originally could not -- it was graphical all the way down into the kernel, for some bizarre reason -- but probably can now that it is really Unix. I'm guessing Windows can as well, but it is probably a lot harder to get at a TTY-only console in Windows 8...

Your game console absolutely needs a multitasking operating system (and has one). We could go down a list of things it manages "in parallel" via multitasking -- managing network interrupts, handling disk or other media access during game play, coping with the human interface, the datastream coming in from some remote network connection, and the outgoing datastream back to the game server, all the while updating the screen every 60th of a second or so -- but we'd still only list a small fraction of the housekeeping visible when running "top" in a Linux computer running a single graphical task. And then there are modern processors that are almost without exception multicore (core-level coarse grain parallelism is multitasking) not to mention the fact that each core is effectively itself parallelized with multiple ALUs and execution threads that are all managed by multitasking.

What you mean is that you want the top level game interface to not support or run lots of single user applications at once. You want the entire graphical console interface to be devoted to a single task, selected from a relatively short list of single tasks, so that (as you correctly put it) "whatever program is running should have full reign over the console so it can take full advantage of the hardware".

To be honest, I think you are overreacting -- if one plays, e.g. World of Warcraft on a linux box and doesn't start a half dozen things on different desktops right before beginning, that's pretty much what you get already. I realize that you are annoyed with your phone as a game interface -- of course you are, as the phone is a phone first and doesn't stop being a phone while you play the game. It also has a relatively lousy network interface no matter whether it is connecting through 3G or 4G or wireless -- wireless with iffy reception on heavily shared links, 3G/4G burning power like crazy. Phones suck as game consoles, no arguments.

Android tablets, OTOH, are much better. They typically run just one application at a time on the console and don't do much background user-task execution while they do so (the thing you really object to, not multitasking), although one CAN run a user task in the background on an Android tablet and degrade game performance if you want to for some reason.

Regular computers, however, are -- in my opinion -- the best gaming platforms out there in the end. They offer you the following benefits:

a) A minimum of two cores, more likely four, with very large CPU caches. The Intel 3rd-gen i7 is truly awe inspiring in its performance -- it can manage 8 simultaneous contexts so smoothly that I've measured linear performance scaling on floating point intensive code out to 8 tasks even though the processor itself is only four cores. This is the ultimate anodyne for your "multitasking" concerns -- a computer that could be evaluating pi to ten zillion places, checking the prices of your stocks in real time, playing tic-tac-toe with itself out of sheer core boredom, and be running the game you are playing flawlessly with human-perceptually instantaneous response time.

b) Enormous amounts of RAM. Yes, this matters for performance. Huge disks (Terabytes!) to store all of those huge games and movies. A computer has "more of everything" compared to any game console, and usually faster of everything except MAYBE the GPU, for the brief moment a dedicated game console represents the bleeding edge of current gen GPUs.

c) Either unlimited (plugged into wall) or much less limited (laptop) battery life, although the latter still sucks, sigh.

d) Vastly better graphics. Yes, in a game console it is dedicated, but money counts. In a PC you can basically invest as much in the graphics card alone as most dedicated game consoles cost. The high end nvidia cards have cone-head quantities of graphical memory and GPUs that are themselves capable of running whole operating systems -- more multitasking and parallelism as the PC operating system can offload much of the floating point intensive graphical processing to the GPU.

e) Choice. You don't want to run anything but the game, the game, and nothing but the game. Or a web browser. Fine. That's your choice, but the computer itself is happy doing more. It will let you run just the game. Or, you can use the same computer to run your game today, and a numerical task tomorrow, or the game and the task at the same time (very likely without noticeably degrading the game's performance).

It is useful to remember the following fact. A modern computer clock is around 0.3 nanoseconds. Human response time is order of 0.1 seconds, and human eyes and brains are utterly incapable of seeing graphical roughness at a time scale of 0.01 seconds and less. That means that computers have time for tens of millions of instructions to be executed in between almost anything being done via a human-connected interface -- input from mouse or joystick devices or keyboards, output to update a screen image. Some operating systems or gaming pathways are not well-tuned to games and can introduce a human-noticeable "glitch" -- for example, when playing games under a VM, where the VM yields the CPU back to the toplevel OS on a single core system -- but this isn't about the hardware OR the operating system per se, it is about tuning.
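A quick back-of-the-envelope check of those numbers (the 0.3 ns clock period and the 0.01 s perceptual threshold are the figures from the paragraph above):

```python
# How many clock cycles fit inside one humanly-perceptible time slice?
clock_period = 0.3e-9   # seconds per cycle, i.e. a ~3.3 GHz clock
frame_budget = 0.01     # seconds; below the threshold for perceived roughness

cycles_per_frame = frame_budget / clock_period
print(f"{cycles_per_frame:.0f} cycles per 10 ms slice")  # ~33 million
```

Tens of millions of cycles between every perceptible event, exactly as claimed -- which is why "glitches" are a tuning problem, not a raw-speed problem.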

So the top article is right on the money. For decades at this point, the primary obstacle to the wider adoption of linux has been its relatively poor ability to play mainstream games. Windows has utterly dominated the PC gaming category, and for much of that stretch has not been terribly well tuned to play games (giving dedicated consoles an "in"). Valve/Steam sound like they have the capability to change that, and thereby change the economics of gaming almost instantly. A truly portable gaming API and library that supports the high performance graphics modern games require would be a no-brainer decision for most developers, especially if the interface itself were easier (or at least, no worse) than existing gaming development platforms and/or easy to port to. If companies like Blizzard start to write to and with it (so WoW comes out linux native in its next incarnation) it will have a huge impact.

For one thing, it will enable the development of the linux game console, a dedicated set-top game console just like a playstation or xbox, only far cheaper and user configurable, just like there are set-top boxes that provide access to e.g. netflix and hulu now that run linux. And that will be interesting times indeed.

rgb

Comment Re:I don't get it (Score 4, Insightful) 105

Just to actually answer your question, the original inflation of space (supposedly) took only a very, very short time, so even if the two points were "close together" at the instant of the big bang itself, they ended up very far apart (and moving farther apart) at the end of a second or so. The parts of the universe in question did not exceed the speed of light because that limit applies to motion through space, and it was space itself that was inflating. Think of a very small balloon with a picture of the Universe printed on its surface being suddenly blown up -- when the balloon is small, everything is compact, but when it is inflated it is much further apart. Then make it a balloon with a three dimensional "surface" and no interior...

There is a lot more to learn about this, much of it in e.g. wikipedia pages as noted in the thread or in astronomy textbooks, and it is actually a lot of fun to learn. One very interesting thing, for example, is to follow the scientific argument from parallax, blackbody radiation, and our knowledge of how radiation intensity drops off with distance, through the discovery of the Hubble constant, out to how we estimate/compute the size and age of the Universe. Another interesting thing is to learn about "the Great Dark" that followed the big bang up until the formation of the earliest stars some 200 million years later, the chain of nucleosynthesis within those stars and the supernovae that ended them, and the gradual accumulation of "metals" (elements heavier than hydrogen and helium) in the ashes of those stars. The entire planet Earth and we ourselves are composed of stardust, the ash of ancient stars that gave rise to the elements that make up our bodies in their dying explosions.
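As a small taste of that chain of argument: the crudest age estimate is the "Hubble time" t ≈ 1/H_0, which assumes the expansion rate has always been constant (no deceleration, no dark energy). A sketch, using an assumed H_0 of 70 km/s/Mpc:

```python
# Crude age-of-the-Universe estimate: t ~ 1/H0 (constant expansion assumed).
H0_km_s_Mpc = 70.0          # assumed Hubble constant, km/s per megaparsec
km_per_Mpc = 3.0857e19      # kilometers in one megaparsec
seconds_per_year = 3.156e7  # seconds in one (Julian-ish) year

H0_per_s = H0_km_s_Mpc / km_per_Mpc            # convert H0 to units of 1/s
hubble_time_yr = 1.0 / H0_per_s / seconds_per_year

print(f"Hubble time: {hubble_time_yr / 1e9:.1f} billion years")  # ~14
```

That the naive one-line estimate lands right next to the accepted ~13.8 billion year figure is one of those satisfying near-coincidences the full theory explains properly.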

It's well worth it to take a course in astronomy at some point if this sort of thing interests you, although a lot of it is covered in discovery channel stuff and shows you can probably find on netflix if that's too time or money consuming for you.

rgb (who occasionally teaches astronomy and hasn't lost his sense of wonder at how it all works out)

Comment Re:Enough Gaming (Score 1) 227

Not at all. I'm saying that when the people who are making the most noise and asserting things like "storms are more violent and frequent and it is due to global warming" (an argument that it is difficult to make at the moment especially with hurricanes, given that we are continuing the longest stretch in recorded history without a category 3 or better hurricane making landfall in the US, given that unbiased statistical analyses show no such thing) are also making a lot of money off of the panic that they create -- panic that isn't based on AGW per se (which might well exist but be moderate and at least partly beneficial), it is about catastrophic AGW, doomsday stuff -- it might be wise to take a very close look at who the big winners are in the "game" described above and be at least a little bit cynical about their motives and the validity of their public claims.

The losers are easy to find. It's everybody else, including you and me, because if CAGW is correct, carbon trading is a complete and expensive waste of time (because even its proponents don't think it will make any real difference in the CO_2 levels around 2100 -- not without destroying human civilization in the meantime, baby with bathwater), and if CAGW is not correct, it is a complete and expensive swindle, one that diverts an enormous amount of resources away from where they might be far better used to, say, bring 2/3 of the world up out of poverty and ignorance.

Real solutions to carbon, like building lots of e.g. thorium salt nuclear power plants, seem anathema, while elsewhere people get rich trading virtual rocks. It's World of Warcraft gold-farming translated into a hundred-billion-dollar business -- one that enriches people who do nothing to earn the money, leaves others in abject poverty living in huts lit and warmed with burning cow dung, and doesn't even solve the original problem and "save the world" (if it is, in fact, in danger). It takes real talent to do that.

rgb
