Well, if its goal was improved reliability or making the sysadmin's life easier, it missed by quite a bit. If its goal was something else, then it's moving in a direction other than the one I picked Debian for.
Red Hat uses upstart. It's as nasty as systemd, if not nastier.
I think of it as being like passing programs between processes in environment variables, the way bash now does.
DO. NOT. WANT.
I picked my two: reliable and simple. That's why I picked Debian. If my priority was "fast" I'd have picked Gentoo and suffered.
Seeing init get complicated in the name of a faster boot gives me heartburn.
Thing is, I've been using it to build sheds and I'd like to keep using it to build sheds. Don't insist I use bridge-building techniques to build a shed.
Help me out here. My search for "Realclimate model data comparisons" doesn't include anything labelled as being from GISS model E.
You know what I'm looking for. Items 3, 4 and 5. I want to read something that's on point. Essentially, a "control" prediction that excludes human causes, an "experimental" prediction that includes human causes and a comparison of the two predictions against measurements in which the "experimental" prediction is within the measurement error and the "control" prediction is not.
Beat me to it. 535 more and they'd have a full 16 bits.
Emotionally charged labels tend to obstruct honest, factual debate.
Also, I heard a neat saying once: "There are three kinds of mistruth: lies, damn lies and statistics." Statistics is an incredibly valuable tool in the arsenal of science, but it's also one of the most commonly misused tools.
Here, let me ask you an honest question. Give me a name or a link to a climate change model which meets the following criteria:
1. The model was created at least 10 years ago.
2. The model can be fed data about suspected human and non-human causes for global warming.
3. When fed such data for the last 10 years twice, once including suspected human causes and once excluding them, it makes two predictions for world conditions today.
4. The difference between those two predictions is statistically significant versus measurement error.
5. World conditions today are consistent with the prediction made when including both suspected human and non-human causes for global warming and are not consistent with the prediction that excluded human causes.
I'm a skeptic. Not a denier, a skeptic. When I see a model that exhibits solid predictive value year over year, I'll be a believer. Until then, what I see is a lot of scientists taking sloppy shortcuts and then trying to cover the gap with dirty politics.
I know science. And I know politics. And the BS in TFA is pure politics.
This will backfire. The idiots driving this would associate dissent on climate change predictions with folks who reject the historical fact of the Holocaust, the only other place where the term "deniers" is routinely used.
You can't have a brain in your head and seriously think that the modern climate change predictions have a comparable level of certainty to the historical fact of the Holocaust. This sort of gross overreach is obvious even to mere mortals who can't readily follow the scientific arguments for or against global warming. It makes the speaker, and every other claim he makes, suspect.
The media has done climate change scientists a great favor by labeling the folks who still challenge the predictions as "skeptics." That word carries connotations of government conspiracy and alien abductions. It's a gift.
By comparison, imagine having to sign an agreement with ARIN before you could use the DNS. Not get a domain name of your own... just look up names in the DNS. Crazy!
ARIN expects a service provider in backwoods Africa to sign an agreement legally binding in Virginia before they'll provide the certificates (not keys, certificates) that provider (and every other) needs to validate the origin of a BGP advertised route. It's nuts.
There are still mainframes, for example, that have to _emulate_ C's unsigned modulo arithmetic. Floating point based DSPs have to emulate C's signed integer types.
Ancient mainframes using 1's complement arithmetic, floating point on systems without a FPU and integers on devices that don't do integer math in hardware? Got any more wacky exceptions that prove the rule?
As for compilers instrumenting code to prevent overflows, that's about to rapidly change.
Don't bank on it.
They say the exception proves the rule. That you had to dig all the way to -complex numbers- to find an exception to the C-is-close-to-the-hardware rule kinda proves my point.
So which instruction does a C compiler emit when multiplying two 64-bit numbers on a 32-bit processor?
Turns out to be a trivial add and shift loop.
When adding two _Complex numbers?
I had to look that one up. It actually is part of the core language starting in C99. Yikes. Fortunately it's not something more than a handful of folks use, what with C not being the language of choice for scientific computing.
When instrumenting a pointer dereference to catch a buffer overflow at runtime?
Properly behaving C compilers don't automatically add code to detect buffer overflows.
What part of "every basic operation EXCEPT a function call" did you fail to understand?