Comment thanks FAA.... (Score -1) 42

A lot of the "drone" development has already shifted out of the USA due to shitty government handling of the entire industry. Australia and Canada already have working policy. THIS is what we mean when we say government regulations are stifling the economy and making us less competitive. But then, I am a conservative republican, so fuck me, right?

Comment Re:Simple (Score -1) 222

Aaaaaand they don't have to pay for their own maintenance, WE DO. My power company made an "agreement" with the state to be more "green", and one of the only ways to meet the target was to PURCHASE a shitty farm of wind turbines whose tax-break funding has now dried up. Woohoo, I am buying that shit whether I want to or not, the definition of a liberal cause.

Comment It is all a plot against minorities and women.... (Score -1) 496

Fuck man, the Seattle area population is 3.61 million, of which maybe 45k work for Amazon. Sounds different now, huh? And the entire NW United States is pretty fucking lily white, so we should bus the minorities in? Recruit outside the area exclusively? There is no conspiracy; Amazon would hire green fucking monkeys if it could make money off them. So please, women, take the job and then work until you die, don't have kids, and don't sue your employer. You will fit right in.

Comment Re:Science creates understanding of a real world. (Score -1) 770

Climate Change Denier!? You are a Climate Fact Molester, then, sir. You obviously do not even understand the debate. The people on WUWT, JoNova, ClimateAudit, and ClimateETC DO NOT arbitrarily embrace an opposite. They merely object to so-called "scientists" voting amongst themselves that the debate is over and that their particular view of their "models" constitutes irrefutable fact. Over the last 10k years of the Holocene, with its multitude of temperature excursions up and down, it is simply POSSIBLE that whatever drove those excursions then is driving them now. But we are clubbed over the head with 97% horse shit by people looking only to use this for political purposes, read "get into my wallet". Reprint from one of those dumb-ass blog guys:

rgbatduke says: February 7, 2014 at 10:34 am

A) The increase in temperature we have experienced during the 20th century is nothing unusual and is quite normal, and, B) the rain and storms suffered by the people of the UK are also nothing unusual.

A) Which half? The increase in the first half of the 20th century is almost identical to the increase in the second half. The two halves are so nearly identical in form that unless you have studied them enough to be able to pick out specific features, you won’t be able to tell which one occurred with the hypothetical help of CO_2 and which one occurred without the hypothetical help of CO_2 when they are plotted on the same vertical relative scale and the same horizontal relative scale but with the actual dates obscured. In the first half of the 20th century, not even the most ardent warmists claim that there was enough anthropogenic CO_2 in the atmosphere to have any measurable effect. The global industrial revolution that started the CO_2 crank was 1950s on, and there was supposedly a lag of 30 years before that had any effect (to explain the fact that through the 50s, 60s, and early 70s the temperature was pretty close to flat, which didn’t fit in well with the instantly well-mixed, instantly more strongly forcing picture of CO_2 emissions).

So as a matter of pure fact, the increase in temperature experienced during the 20th century was not unusual or abnormal in any way that can be definitively linked to anthropogenic activity as far as we can tell from the data! We had little to no impact on the first half, the warming in the second half matched that of the first half (with our hypothetical help), and both halves were part of a perfectly reasonable continuing century-scale rebound from the lowest temperatures experienced on Earth since the Holocene Optimum, reached during the Little Ice Age.

It’s amazing how ignorant people who participate in this debate with total certainty that our climate is unusual are of the “patient’s” history. I like to keep the patient’s chart for the last 12,000 years handy to help them learn: http://commons.wikimedia.org/w... Note well, this is smoothed. Note also that the error bars (never, ever shown in climate science) are probably as wide as the total variability envelope of all contributing reconstructions — an easy 1 to 2 C.
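On the "which half is which" challenge above: here is a minimal sketch of how you could run that blind comparison yourself, in Python. The file name and column names are hypothetical stand-ins for any annual global anomaly series (HadCRUT-style); nothing below comes from the original post.

    # Blind comparison of first-half vs second-half 20th century warming.
    # "annual_anomalies.csv" and its columns are hypothetical stand-ins for
    # any annual global temperature anomaly series (e.g. HadCRUT-style).
    import pandas as pd
    import matplotlib.pyplot as plt

    df = pd.read_csv("annual_anomalies.csv")  # columns: year, anomaly

    halves = [df[(df.year >= 1900) & (df.year < 1950)],
              df[(df.year >= 1950) & (df.year < 2000)]]

    fig, axes = plt.subplots(1, 2, sharey=True)
    for ax, half in zip(axes, halves):
        anom = half.anomaly.to_numpy()
        # Plot relative to each half's own starting value, against
        # years-since-start, so the actual dates are obscured.
        ax.plot(range(len(anom)), anom - anom[0])
        ax.set_xticks([])
    plt.show()

Shuffle the two panels before showing them to someone and ask them to pick the "CO_2 half"; that is the whole test.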
As Leif pointed out above, reconstructing things like solar activity or temperature in the pre-instrumental era is neither easy nor precise, and the tiniest hint of bias or prior belief on the part of the researcher can effortlessly further cloud the proxy-based extrapolations by causing them to make countless small, almost harmless decisions that ultimately amount to cherrypicking the data, comparing low temporal resolution data to high temporal resolution data to make erroneous statements about extremes, or ignoring the possibility of confounding causes or degradation of the data sources in those sources that match their “preferred” narrative at the expense of those that do not. If you count the assumptions — most of which cannot possibly be verified in the present — that go into reconstructions, there are many, and each one contributes to increased uncertainty in the final claim. Still, taking it for what it is worth — a possibly accurate reconstruction of the planet’s temperatures in the Holocene (post the Wisconsin glaciation, but including the Younger Dryas) that is at any rate the best we can do with the data and methods available (biased or not) at this time — what does it tell us?

First, the climate now is not warmer than it was in the Holocene Optimum. (Do not make the mistake of conflating the high frequency, high resolution “2004” data point with the smoothed low frequency, low resolution data in the curve — even the figure’s caption warns against doing that — for the very good reason that in every 300 year smoothed upswing it is statistically certain that the upswing involved multidecadal intervals of temperatures much higher than the running mean.) It is left as an exercise to the studio audience to figure out how to use contemporary high frequency climate data to make a numerically reasonable estimate of how much warmer than the smoothed average the peak multidecadal intervals almost certainly were during the warming intervals seen throughout this graph. Goodness, I think it is easily 1+ C, isn’t it!

Second, the LIA really was the coldest smoothed temperature interval in the entire Holocene. It had been 11,000 years — going back to the warming phase that pulled us out of the glacial era — since the planet was as cold as it was in the general stretch from 1400 to 1900, embracing both the LIA and the following Dalton minimum. During this interval Earth’s glaciers grew, strongly, but do not mistake the level of glaciation observed in (say) 1870 as being normal. For most of the gradually cooling Holocene, it would have been extremely abnormal. As I said, the patient fell through the ice into the river, and probably came within a hair of falling under the ice, to be trapped in another 90,000 year cycle of glaciation before the next interglacial.

Third, note well our profound degree of ignorance as to the cause of any of the features on this curve, granting that they are “features” at all — remember, the noise is on the same scale as the signal, so every single bounce on this curve could be noise as far as the data is concerned. It is only what amounts to anecdotal reporting in human histories that gives us the opportunity to at least partially affirm events like the Roman Warm Period, the Medieval Warm Period, the Minoan Warm Period — bobbles visible on this curve that do in fact correspond with historical times where there is direct evidence of warm temperatures, favorable climates for agriculture even at high latitudes, and indeed generally benign climate conditions.
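On that "exercise to the studio audience": a toy Python illustration, on purely synthetic data, of how a 300-year running mean hides multidecadal excursions above the smooth. The noise model below is invented for illustration only; it reconstructs nothing.

    # Toy demo: a 300-year running mean hides multidecadal excursions.
    # The synthetic series is invented; it is not a reconstruction.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 12000                                    # "years"
    trend = np.linspace(0.5, -0.5, n)            # slow Holocene-like cooling
    white = rng.normal(0.0, 0.3, n)
    red = np.convolve(white, np.ones(30) / 30, mode="same")  # multidecadal noise
    temp = trend + red

    smooth = np.convolve(temp, np.ones(300) / 300, mode="same")      # 300-yr mean
    resid30 = np.convolve(temp - smooth, np.ones(30) / 30, mode="same")
    print(f"largest 30-yr-mean excursion above the smooth: {resid30.max():.2f} C")

The excursion printed at the end is exactly the kind of warm multidecadal interval that a single smoothed curve cannot show.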
Throughout human history, warm intervals have been the best of times, even when they were warmer than it is today, and the many eco-disasters being projected on the basis of failing climate models did not, in fact, occur even when the climate was much warmer than it is today. Human models cannot predict any portion of this curve. We have the crudest of hypotheses that collectively might explain the glacial/interglacial pattern of the Pleistocene (including the Wisconsin and Holocene) in terms of long term coincidences in orbital eccentricity, axial tilt as the planet’s axis precesses, oscillations up and down in the plane of the ecliptic, the movements of the continental plates, the consequent (sometimes “dramatic”) variation in oceanic circulation, against a background of volcanic activity that can “punch” the system with a mix of rapidly varying aerosols, soot and greenhouse gases, with unknown but possibly heavily delayed feedback from the plastic motion of the Earth’s crust itself in response to growing or melting glaciers and the associated changes in high latitude albedo. Sure. Probably, even.

But try computing on the basis of these collective hypotheses and then predicting the future unforced climate, in a chaotic system with strong, nonlinear, internal feedbacks on all of the shorter timescales driving temperatures up or down by degrees C completely independent of the 1000-year-plus timescale drivers. We are fortunate that the climate was, and probably still is, rebounding from the LIA completely independent of CO_2. Even if CO_2 is the fed-back devil the most catastrophic of warmists assert it to be, the general trend of the Holocene towards overall cooler temperatures might make it the more desired of two evils; warming up to Holocene Optimum temperatures is surely far less likely to be destructive than any sort of plunge in temperatures, whether to LIA cold-but-manageable temperatures or to the next glacial era. Evidence from past glaciations suggests that the Earth can kick over into rapid glacier growth in as little as a century and literally plunge back into the deep freeze and kilometer-thick ice down to the latitude of New York or Pennsylvania.

There is one other graph that is entertaining to look at. The chart up above, revealing as it is, is only the chart for the last few days of the patient’s life. Here’s the chart for the patient over the post-adolescent years (the last 550 million years): http://commons.wikimedia.org/w...

Goodness, what does this chart tell us? That there is one single interval in the last 600 million years when the Earth has been as cold as it is today, climate-wise. This is on a scale that makes the Pleistocene ice age that we are currently in seem small. The Earth’s temperature has been systematically dropping since the end of the Cretaceous, and is currently colder (climate-wise) than any interval since the Ordovician-Silurian transition, which (incidentally) began with atmospheric CO_2 levels of 7000 ppm — almost a full percent CO_2 — and reached its minimum temperatures with sustained atmospheric CO_2 levels of 4000 ppm — 10 times the levels we have today. For this, we don’t even have a hypothesis — we have mere science fiction (the solar system drifting through an enormous cloud of space dust, asteroid impacts, fill in the blank). We have no possible way to go back in time to observe, and no credible way to obtain data on any of the myriad of possible causes.
You can click through the last 65 million years — bad news all the way — to 5 million years, where we note that our warmest temperatures in the interglacials are still not as warm as the mean, stable temperatures of 3.5 million years ago by 1 to 2 C, and that the interglacials themselves are a decreasing fraction of the time up to the present, where currently we spend 80 to 90 thousand years locked up in earth-crushing glaciers compared to 10 thousand years of interglacial, where the ice retreats to the point where humans can thrive. All of human civilization arose in the Holocene interglacial. Think on that.

So yes, Gareth, the current temperature variations are completely normal as far as we can tell, although the patient is still suffering from serious hypothermia and was rather chilled even before falling through the ice. Right now the patient’s core temperature is probably still depressed compared to whatever might laughably be called “normal” in a nonlinear chaotic climate system being constantly driven around Poincaré cycles between climate extremes, jumping around between attractors, as the dynamical evolution of coupled Navier-Stokes equations on a spinning, tipped, precessing, oblate spheroidal, 70% ocean-covered ball with land mountains that reach up to the top of the troposphere, in a highly eccentric orbit around a moderately variable star, continues. It’s like after discovering chaos in weather systems, the entire climate science community forgot all about it. Jeeze.

B) The rains and storms suffered by the UK people are nothing unusual? Seriously? That’s the reason you believe in CAGW, because England got a lot of rain, compared to what, exactly? Can you not tell the difference between weather and climate? Hey, the people in North Carolina have just suffered through record-setting cold! That must be evidence of global cooling! Wait, we have also been blessed with a record-setting dearth of high energy Atlantic hurricanes (no category 3 or higher storms have made landfall in the US for an interval that actually has a chance of doubling the previous record, if it lasts another year or two). That proves what, exactly? That the weather is highly variable? That the (wait for it) climate is changing? What part of “non-stationary process” is so very difficult to understand? The climate is always changing. Look at the graphs I linked up above. Intervals of climate stability on a geological time scale are rare; the climate is usually changing. That’s normal. The Pleistocene has been an entire, continuing ice age in which the climate has been rapidly changing, cycling between extremes where the warmest of temperatures are only a degree or two warmer than the present and sometimes last at those “fevered” conditions only for a century or two, where the “fever” in question is hypothermia compared to the bulk of the geological record.

None of this is really a matter for much discussion. It is built right into the openly published, reasonably accepted graphs. There is, quite literally, nothing unusual about the present climate that we can detect from the data. In order to even hypothesize that the present climate is abnormal, we would have to be able to predict the normal (unforced) climate for the present. This we cannot do, either for the present or, via hindcast, for the past. We are cosmically clueless about how to predict the climate.
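Since the chaos point above tends to get waved away: the textbook Lorenz-63 system is the standard toy demonstration of sensitive dependence on initial conditions. This is not a climate model, and the crude Euler stepping below is only good enough for a qualitative picture.

    # Lorenz-63: two trajectories starting 1e-9 apart diverge to the size
    # of the attractor. Crude Euler stepping; qualitative demo only.
    import numpy as np

    def step(s, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        x, y, z = s
        return s + dt * np.array([sigma * (y - x),
                                  x * (rho - z) - y,
                                  x * y - beta * z])

    a = np.array([1.0, 1.0, 1.0])
    b = a + np.array([1e-9, 0.0, 0.0])  # perturbed by one part in a billion
    for i in range(1, 8001):
        a, b = step(a), step(b)
        if i % 2000 == 0:
            print(f"t={i * 0.005:5.1f}  separation={np.linalg.norm(a - b):.3e}")

By the end of the run the two "forecasts" have nothing to do with each other, which is the whole point about predicting a chaotic system from imperfect initial data.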
The computational problem is mind-bogglingly difficult, and it isn’t even vaguely surprising that the best GCMs we can build so far are all failing, for the dual reason that they don’t have the spatial and temporal resolution that is almost certainly required to do a halfway decent job and the fact that the models were built and initialized by individuals who sincerely believe that CO_2 is driving the climate with strong feedbacks and that this climate forcing is the dominant factor in predicting the future climate. They fit the models to the last third of the 20th century and the assumption that the warming observed in this interval was dominantly anthropogenic, and wonder why they fail. Look without bias at the entire climate record, my friends. Does anyone seriously think that the GCMs can track/predict/hindcast variations of any significant length in the overall climate record using only the values set from fitting e.g. 1970 through 2000 with an assumption of strong-feedback CO_2 forcing and negligible natural variation outside of this? Really? Rgb

rgbatduke says: February 7, 2014 at 11:03 am

A figure, Richard, that nicely illustrates the indistinguishability of first half and second half 20th century warming in e.g. HADCRUT4. I can pick out which is which by looking for the Pinatubo and 1997-1998 ENSO features at the end — but I study these curves. But qualitatively or quantitatively? They are identical well within the uncertainty. One with CO_2, one without. This alone confounds any assertion that we can be certain that the late 20th century warming was either abnormal or necessarily forced by CO_2. What “forced” the warming observed in the first half of the 20th century, I wonder? Maybe nothing. Maybe this is all just natural variation of a chaotic weather system where for a century or so positive feedbacks win, then for a century or so negative ones win, against the slowly varying but really rather unpredictable projection of Milankovitch axial tilt and orbital eccentricity against the irregular shapes of the continents at perihelion and aphelion and maximum NH insolation vs maximum insolation, with all of this causing changes in just what warms and cools where and when as the continents themselves precess underneath the secular orbital motion.

And then there are the multidecadal atmospheric circulation oscillations (ENSO, the PDO, the NAO, etc.) with their different and ill-defined (chaotic!) “periods”. And then there are the largely uncharted decadal-scale variations in ocean currents and thermohaline circulation, with its myriad of linked processes that we don’t really understand.

Want to trigger the next ice age? Divert the Gulf Stream so it hits Europe five hundred miles south of where it currently goes. Just 500 miles. See what that does to England’s weather, to the arctic ice pack, to the Siberian winter, to the temperature of the Mediterranean, and, as positive feedback occurs, to North America and global temperatures in general. The equator would get hotter, lose heat more efficiently, and the poles would get very cold, very quickly! Probably. Really, of course, I don’t know for sure. One cannot properly eyeball and guess the solution to the Navier-Stokes equation “as if” some change occurs.
But there is some evidence that Atlantic oceanic circulation patterns can dramatically and rapidly change global climate, and this is offered as one possible “sufficient” explanation for the Younger Dryas — the breaking of an ice dam in the melting glaciers and the draining of a huge freshwater lake that interrupted the Atlantic thermohaline circulation for almost a thousand years. If/when the Atlantic oscillation finally inverts, we may get to find out. The change of the PDO phase is correlated with (but possibly not causal of) “the pause” and substantially increasing ice and cold in Alaska (for example). Perhaps NAO etc. phase oscillations will have a similar effect on the North Atlantic; perhaps they will even divert or cool the Gulf Stream. Tiny changes there could spell serious trouble for Europe. Rgb

I have tried to be respectful to Gareth as I have offered refutation of his metaphors and his belief that bad weather in one part of England is proof of climate change in a non-stationary system. Note that I’m not asserting that his belief is delusional, only that he is conflating weather with climate, something that the press openly encourages by reporting every single “extreme” weather event around the world as proof of climate change, in spite of the fact that the normal climate sets records of extreme weather every day somewhere on the planet (unsurprising, given the minuscule interval over which we have records to find extrema within and the enormous number of locations). Citing individual weather extremes to prove climate change is a form of egregious cherrypicking, and is why we do not trust anecdotal claims to support hypotheses in science.

Now if one looks up the work of the Pielkes, who IIRC study precisely that — the statistical distribution of extreme weather events of various sorts — you will learn that there is no statistically defensible evidence for an increase in the violence, frequency, energy, or any other metric of extreme weather. No, there are not more hurricanes than usual. Nor tornados. Nor are hurricanes on average stronger. Nor tornadoes. Nor are there excessive numbers of droughts (that’s a hard record to reach, by the way, as there have been some doozy droughts in the past!). Or floods. There is absolutely no statistically defensible reason to think that the weather is getting worse, anywhere. This is a simple matter of fact. If you disagree, Gareth, then I would respectfully ask you to produce the study that finds otherwise, and look carefully at the degree of significance for the claim.

Given the number of times people look for some statistically significant measure of “climate change” presumably caused by humans, it is literally inevitable that they will find some metric or another where there is a “significant” effect. This, however, is the result of data dredging (look it up; Wikipedia will educate you, as will the xkcd comic “Green Jelly Beans cause Acne”, and the sketch below). Data dredging is simply using statistics to put a patina of respectability on good old anecdotal evidence by finding SOME anecdotal evidence SOMEWHERE that is statistically significant while ignoring all of the rest that shows no effect whatsoever, or even a negative effect (e.g. cat 3 storms making landfall in the Atlantic basin).

In the meantime, perhaps we could all hold off on the name calling, invective, ad hominem, and so on. I very much doubt that Gareth is trying to fool us or that his beliefs are not sincere. I merely challenge those beliefs. Are they defensible?
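The "Green Jelly Beans" effect mentioned above is trivial to reproduce in a few lines of Python: run twenty null-hypothesis tests at p < 0.05 on pure noise and, on average, about one comes back "significant".

    # Data dredging in miniature: 20 "jelly bean colors", all pure noise,
    # each tested at p < 0.05. Expect roughly one false "discovery".
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    hits = []
    for color in range(20):
        treated = rng.normal(size=100)   # the null is true: beans do nothing
        control = rng.normal(size=100)
        _, p = stats.ttest_ind(treated, control)
        if p < 0.05:
            hits.append((color, round(p, 3)))
    print("'significant' colors found in pure noise:", hits)

Report only the hits and you have a headline; report all twenty tests and you have nothing, which is the difference between dredging and statistics.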

Comment Johnny Taxhaber..... (Score -1) 116

Of course our reelected governor Kitzhaber was an emergency room doc in an earlier life, so he thinks he knows medicine. Unfortunately that did not translate to government or anything more technical than a Simon game. These types of government contracts are written specifically so there can be no real responsibility attached, and this is a problem exclusive to government. Do you think Intel and Nike in Oregon write contracts without performance clauses and penalties? These contracts are designed this way so they can be abused. But hey, we deserve Johnny K. When he first became governor in 1995, our state budget was $10 billion. Now it is over $30 billion a year for 3.8 million of us plebes. Additionally, if you work (using this word loosely) for the state, city, or county and are due to retire before 2026 with 30 years in, as of today you will receive 110% of your base pay in retirement, forever. Oregon is a wonderful place; we just have a severe lack of people pulling the wagon versus riding in it.

Comment global warming/climate change/lead poisoning (Score -1) 266

Guys, since the "global temp" rise is not cooperating with the models, and extreme weather is not really doing it for us either, we need some other catastrophe to link to energy mostly used in developed countries. How about lead in sea water? It ain't rising any faster, but it's dirty right? And money is available for specifically FOR AGW causes, right? But wait, what is the base level of Pb in the ocean? Who cares, don't put that shit on the graph. Keywords upside down Tijander, 17 bristlecone pines tree ring data with measured temps tacked on after 1960 for effect, hide the decline, Mike's nature trick, please delete IPCC emails, etc, etc. No reason for me to be skeptical, I mean, I know what happened is Auschwitz, Chelmno, Treblinka, Sobibor, Betzec. But i still am not sure if we landed on the moon.

Comment Epson all in one (Score -1) 381

All the best features: when it THINKS it is out of ink, the printer shuts down and you cannot even use the scanner! No actual ink measurement capability. A chip on the ink cartridge keeps tabs on page counts and cleaning cycles and ESTIMATES when the ink should be out, with no relation to reality. SUCK IT EPSON.
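For the curious, a toy model of what a counter-based "ink gauge" looks like. Epson's actual firmware logic is not public, so every class name and constant below is invented; the point is only that the estimate is bookkeeping, not measurement.

    # Toy counter-based ink estimator. All constants are invented; Epson's
    # real firmware logic is not public. The chip never measures ink: it
    # decrements a budget per page and per cleaning cycle.
    class InkCounter:
        def __init__(self, budget=10_000):     # abstract "ink units", not ml
            self.remaining = budget

        def print_page(self, coverage=0.05):   # 5% coverage "typical" page
            self.remaining -= 10 * (coverage / 0.05)

        def cleaning_cycle(self):
            self.remaining -= 300              # cleanings charged heavily

        @property
        def empty(self):
            # Once the estimate hits zero the firmware refuses everything,
            # scanner included on an all-in-one, whatever ink really remains.
            return self.remaining <= 0

    c = InkCounter()
    for _ in range(40):
        c.cleaning_cycle()
    print("empty after 40 cleanings and zero pages printed:", c.empty)  # True

In a model like this a cartridge can be declared "empty" without ever printing a page, which is exactly the behavior being complained about.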
