Comment Re:Incorrect. (Score 1) 786
Before making vague bullshit claims about the models, two things:
Apply it to yourself. You have said nothing beyond what I said, except to claim the contrary.
You're being pedantic.
I'm not. It's a garbage word. There are plenty of garbage words that stick around for no reason other than that people like to use them in place of simpler words.
You mention "style".
You used the word. Why did you use it? Because it's simpler and conversational and came naturally to you. People don't generally go around using the word paradigm.
I also used the word "model". Try this: The object-oriented programming model. Gee, does that not get the point across?
And there's nothing vague about simpler words and more common words versus less common words that have come into vogue. Nobody needs a clarification when you use those simpler words. I'm willing to bet at one point you said, "What the fuck's a paradigm?"
I'm not the anon, but yeah, what he said. There are myriad ways to screw up memory in C and C++. But if you want to look like a moron and pretend the vast majority are just null pointers, be my guest.
And yet you instinctively fell into "style" later on, and nothing was lost by using that word. Instead, you gained in clarity of communication. You could also say "model" and it would have the same meaning. "Paradigm" is a fancy buzzword.
By saying the models are out of range, you have already admitted the models are correct, just out of range.
Lol, wut? The point of a model is to be predictive, in this case predictive within a range. If the model isn't predictive, it isn't correct. Holy shit. Did you actually learn any science in school?
It's not you being kind, it's you being a dickhead, because I explicitly acknowledged that they were segfaults "in a very trivial sense".
The point being that when a random segfault occurs in a C/C++ program, it could be anything. If you're lucky, the pointer is null. In Java this is a nuisance issue and usually trivial to track down, a typical example being that you forgot to check for null after pulling something out of a collection or similar.
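A minimal sketch of that typical case (the class and method names here are made up for illustration): `Map.get()` returns null for a missing key, and the first method call on that null result throws a `NullPointerException` whose stack trace points at the exact line.

```java
import java.util.HashMap;
import java.util.Map;

public class NpeDemo {
    // Pull a value out of a map and use it without a null check --
    // the classic way to earn an NPE in Java.
    static String normalizedPort(Map<String, String> config) {
        String port = config.get("port"); // null if the key is absent
        return port.trim();               // NullPointerException when port is null
    }

    public static void main(String[] args) {
        Map<String, String> config = new HashMap<>();
        config.put("host", "localhost"); // note: no "port" entry
        try {
            normalizedPort(config);
        } catch (NullPointerException e) {
            // The stack trace names the exact line that dereferenced null,
            // which is why these are usually trivial to track down.
            System.out.println("NPE: forgot to check for null");
        }
    }
}
```

Compare that to a wild pointer in C, where the crash site can be arbitrarily far from the bug.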
Yes it is lol
Nice job taking what I said out of context, dickhead. You completely ignored the rest of the explanatory text.
I wasn't bashing Java lol. You have some kind of weird defensiveness issues.
Uh huh, sure you weren't.
Memory leaks are surprisingly common in Java as well, because people don't think. If you put something in a list or queue, you need to have a plan for getting it out.
If the list goes out of scope and nothing else points to the objects in it, they get collected. Yes, you can leak memory in Java. No, it doesn't happen nearly as often as it does in a language like C or C++, and you aren't constantly spending time worrying about memory allocation issues just to write trivial code. That to me is the biggest thing.
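Here's a sketch of the "put it in a list with no plan to get it out" leak (names are invented for illustration): a long-lived collection keeps every object it holds reachable, so the GC can never reclaim them, even though the program will never use them again.

```java
import java.util.ArrayList;
import java.util.List;

public class LeakyCache {
    // A long-lived (here static) list: everything added stays reachable
    // forever unless someone removes it -- the classic Java "leak".
    static final List<byte[]> cache = new ArrayList<>();

    static void handleRequest() {
        cache.add(new byte[1024]); // added on every request, never removed
    }

    public static void main(String[] args) {
        for (int i = 0; i < 1000; i++) {
            handleRequest();
        }
        // The GC cannot reclaim any of these buffers: the static list
        // still points at all of them.
        System.out.println("retained buffers: " + cache.size());
    }
}
```

If `cache` were a local variable instead, it would become unreachable when the method returned and everything in it would be collectible, which is the point the comment above makes about scope.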
(Aside: Not quite sure why, but the use of the term "paradigm" multiple times makes me feel slightly icky for some reason. Probably due to its misuse in business jargon.)
Probably because there's no reason to use such an awkward word in the first place. In this case, notice how you fall into using "style" instead? Also, the vast majority of time, when people use "paradigm", they could replace it with the much more common and simpler word "model" or another simpler term.
A NullPointerException is not a segfault or memory corruption, except in a very trivial sense you can consider it a segfault. The difference is night and day compared to the kind of memory corruption and wild pointers you get with C/C++.
As for memory leaks, yeah, that can still happen, but it isn't very common and they are easy to track down with VM tooling.
Now if you want to bash Java for not tackling serious issues, just look at its threading model, basically "threads and locks", which makes it very easy to have threads stomping on data they shouldn't. That's corruption.
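A minimal sketch of that stomping (class name invented for illustration): two threads doing unsynchronized `count++` on a shared field. Each increment is a read-modify-write, so concurrent increments overwrite each other and updates are silently lost.

```java
public class RacyCounter {
    static int count = 0; // shared mutable state, no synchronization

    // Two threads each increment the counter perThread times, with no lock.
    static int hammer(int perThread) throws InterruptedException {
        count = 0;
        Runnable bump = () -> {
            for (int i = 0; i < perThread; i++) {
                count++; // unsynchronized read-modify-write: a data race
            }
        };
        Thread a = new Thread(bump);
        Thread b = new Thread(bump);
        a.start();
        b.start();
        a.join();
        b.join();
        return count;
    }

    public static void main(String[] args) throws InterruptedException {
        // Two threads each add 100000; lost updates usually leave the total
        // well short of 200000, and different on every run.
        System.out.println("count = " + RacyCounter.hammer(100_000));
    }
}
```

Swapping the field for an `AtomicInteger`, or wrapping the increment in a `synchronized` block, fixes it, but the language does nothing to stop you from writing the racy version.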
Java also didn't do a good job with resource leaks, meaning things besides memory, like connection handles.
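For what it's worth, Java 7's try-with-resources later addressed part of this: the GC collects memory but never promises timely `close()` calls, so handles leak unless you close them yourself. A sketch (the method name is invented for illustration, using a StringReader as a stand-in for a real file or connection handle):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;

public class ResourceHandling {
    // Pre-Java-7 code had to remember a finally { r.close(); } block;
    // forgetting it leaked the handle, since GC finalization is far too
    // unpredictable to rely on. try-with-resources (Java 7+) closes the
    // handle deterministically, even when an exception is thrown.
    static String firstLine(String text) throws IOException {
        try (BufferedReader r = new BufferedReader(new StringReader(text))) {
            return r.readLine();
        } // r.close() runs here automatically
    }

    public static void main(String[] args) throws IOException {
        System.out.println(ResourceHandling.firstLine("hello\nworld"));
    }
}
```

The same pattern works for any `AutoCloseable`, which is how JDBC connections and sockets are typically handled now.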
The standard right wing response "The evidence isn't in yet."
The evidence is in, and the pause in global warming is outside the range of the climate models. But the "science is settled", right?
He and the universities he was a part of have defended not releasing his private correspondence as a matter of principle for academic freedom. A scientist should be primarily judged on their published work.
Oh really? So you think it isn't fair to judge scientists who set out to "hide the decline" in their email (and spare me, I know exactly what was being hidden)? Who ask other scientists to delete email to avoid a freedom of information request on IPCC work?
These emails came out after a leak. I think it is more than fair to judge these scientists on that basis. That Mann, using work email at a public university, seeks to hide them from scrutiny doesn't inspire trust.
I think Mann's problem with Steyn is that he said Mann "molested and tortured data in the service of politicized science," in an obvious allusion to the Jerry Sandusky case and that his science was fraudulent. Those are serious accusations against a scientist that could affect his future career if taken seriously.
Mann's a big fucking hypocrite then: http://climateaudit.org/2015/0...
And nature only speaks through evidence.
And the evidence says that the scientists were wrong. Carbon dioxide levels have kept rising, but temperatures have remained flat. The models are out of range.
Try arguing about evidence rather than your feelings.
The long-term trend is a failure. The models were supposed to predict surface temperatures; a recent claim that the heat went into the deep oceans instead doesn't validate them. It's been over ten years, and the models have fallen outside any predicted range.
Also, the big threat with global warming was supposed to be due to water vapor feedback, as opposed to just the forced warming from carbon dioxide. That has yet to be proven.
There is a pause. Despite rising carbon dioxide levels, temperatures have plateaued (this "hottest year on record" is the hottest only by a statistically insignificant margin), and none of the models predicted this, despite claims of the science being "settled".
That the science was settled is complete bullshit anyway, because the true threat of global warming has always been predicated on the supposed feedback of water vapor, which is still an open question.