She oscillates between fighting back and playing the victim. All you really need to know about her is that she's a compulsive liar: https://thezoepost.files.wordp...
Well I see Anita's recent efforts as being a lot more expensive than her earlier work.
Going on a professional victim tour to raise more money for herself? Yeah, very expensive.
Oh, and I forgot to mention, it uses the threat-of-rape trope, another of Anita's targets. But yeah, no prostitutes! You really are bad at this.
It isn't gaming culture as a whole, which is 50% female these days.
Same old bullshit. Included in those stats are people whose entire involvement in gaming is buying Angry Birds on their smartphone or playing Farmville. Those women generally don't play shooters, don't buy the next gaming console, don't upgrade their PCs to play games, and don't read gaming journals. Anita is critiquing games that are catered to men, because that's their demographic. And she does so in a completely unequal fashion, having no qualms when men are portrayed in sexist ways.
She is, of course, just a con artist riding the wave of feminism and video games while reading scripts for McIntosh. That was Jack Thompson's biggest mistake, not getting a female spokesperson.
Of course it doesn't have to be - look at games like The Last of Us.
Uses the zombie trope. Uses the young-girl-in-distress trope. Uses the poor, innocent daughter gunned down trope. Wow, I'm so glad we have you and Anita to save us from all these horrible, gritty, sexist tropes.
If you look at the responses on YouTube very few of them actually critique her arguments. One of the closest is Thunderf00t's
So Thunderf00t's videos went from being "particularly bad" to "one of the closest [that actually critique her arguments]"?
It's a classic mix of ad hominem and straw-man attacks.
You should know all about that, as the last time you argued this with me you were full of them: http://slashdot.org/comments.p...
"So far, no one has been able to provide a compelling answer to why climate change seems to be taking a break. We're facing a puzzle. Recent CO2 emissions have actually risen even more steeply than we feared. As a result, according to most climate models, we should have seen temperatures rise by around 0.25 degrees Celsius (0.45 degrees Fahrenheit) over the past 10 years. That hasn't happened. In fact, the increase over the last 15 years was just 0.06 degrees Celsius (0.11 degrees Fahrenheit) -- a value very close to zero. This is a serious scientific problem that the Intergovernmental Panel on Climate Change (IPCC) will have to confront when it presents its next Assessment Report late next year."
The referenced article this quote was taken from:
Another important topic the climate scientist mentions:
"Temperature increases are also very much dependent on clouds, which can both amplify and mitigate the greenhouse effect. For as long as I've been working in this field, for over 30 years, there has unfortunately been very little progress made in the simulation of clouds."
Keep this in mind the next time you hear, "The science is settled."
Before making vague bullshit claims about the models, two things:
Apply it to yourself. You have said nothing beyond what I said, except to claim the contrary.
You're being pedantic.
I'm not. It's a garbage word. There are plenty of garbage words that stick around for no reason other than that people like to use them in place of simpler words.
You mention "style".
You used the word. Why did you use it? Because it's simpler and conversational and came naturally to you. People don't generally go around using the word paradigm.
I also used the word "model". Try this: The object-oriented programming model. Gee, does that not get the point across?
And there's nothing vague about simpler words and more common words versus less common words that have come into vogue. Nobody needs a clarification when you use those simpler words. I'm willing to bet at one point you said, "What the fuck's a paradigm?"
I'm not the anon, but yeah, what he said. There are myriad ways to screw up memory in C and C++. But if you want to look like a moron and pretend the vast majority are just null pointers, be my guest.
And yet you instinctively fell into "style" later on, and nothing was lost by using that word. Instead, you gained in clarity of communication. You could also say "model" and it would have the same meaning. "Paradigm" is a fancy buzzword.
By saying the models are out of range, you have already admitted the models are correct, just out of range.
Lol, wut? The point of a model is to be predictive, in this case predictive within a range. If the model isn't predictive, it isn't correct. Holy shit. Did you actually learn any science in school?
It's not you being kind, it's you being a dickhead, because I explicitly acknowledged that they were segfaults "in a very trivial sense".
The point being that when a random segfault occurs in a C/C++ program, it could be anything. If you're lucky, the pointer is null. In Java this is a nuisance issue and usually trivial to track down, a typical example being you forgot to check for null when you pulled something out of a collection or similar.
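To sketch what that typical Java example looks like (a minimal, hypothetical snippet; the class and method names are made up for illustration), here's the classic forgot-to-check-for-null after pulling from a collection. `Map.get` returns null for a missing key, and the NPE's stack trace points straight at the offending line:

```java
import java.util.HashMap;
import java.util.Map;

public class NpeFromCollection {
    // Buggy version: readLine-style code that trusts the collection.
    static int scoreLength(Map<String, String> scores, String player) {
        String score = scores.get(player); // returns null if the key is absent
        return score.length();             // NullPointerException when score is null
    }

    // Fixed version: the missing null check.
    static int safeScoreLength(Map<String, String> scores, String player) {
        String score = scores.get(player);
        return (score == null) ? 0 : score.length();
    }

    public static void main(String[] args) {
        Map<String, String> scores = new HashMap<>();
        scores.put("alice", "9001");
        System.out.println(safeScoreLength(scores, "bob")); // prints 0
        try {
            scoreLength(scores, "bob");
        } catch (NullPointerException e) {
            // The stack trace names the exact line, hence "trivial to track down".
            System.out.println("NPE caught, trivially traced");
        }
    }
}
```

Contrast with C/C++, where the equivalent bug might silently corrupt some unrelated structure and only crash much later.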
Yes it is lol
Nice job taking what I said out of context, dickhead. You completely ignored the rest of the explanatory text.
I wasn't bashing Java lol. You have some kind of weird defensiveness issues.
Uh huh, sure you weren't.
Memory leaks are surprisingly common in Java as well, because people don't think. If you put something in a list or queue, you need to have a plan for getting it out.
If the list goes out of scope and nothing else points to the object in the list, it gets collected. Yes, you can leak memory in Java. No, it doesn't happen nearly as often as it does in a language like C or C++, and you aren't constantly spending time worrying about memory allocation issues to do trivial code. That to me is the biggest thing.
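The "have a plan for getting it out" point can be sketched in a few lines (hypothetical names; a simplified cache rather than real production code). A long-lived collection keeps everything in it reachable, so the GC can never reclaim it; the fix is simply evicting:

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.List;
import java.util.Queue;

public class JavaLeak {
    // Leak: objects added here stay reachable forever, so GC never collects them.
    static final List<byte[]> cache = new ArrayList<>();

    static void leakyAdd(byte[] payload) {
        cache.add(payload); // added, never removed: grows without bound
    }

    // Fix: a plan for getting things out again (here, evict the oldest).
    static final Queue<byte[]> bounded = new ArrayDeque<>();

    static void boundedAdd(byte[] payload) {
        bounded.add(payload);
        while (bounded.size() > 100) {
            bounded.poll(); // cap the queue so it can't grow forever
        }
    }

    public static void main(String[] args) {
        for (int i = 0; i < 1000; i++) {
            leakyAdd(new byte[1024]);
            boundedAdd(new byte[1024]);
        }
        System.out.println(cache.size());   // 1000 and still climbing
        System.out.println(bounded.size()); // capped at 100
    }
}
```

Note the leak is a logical one: every byte[] in `cache` is still live as far as the collector is concerned, which is exactly why VM heap-dump tooling finds these so easily.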
(Aside: Not quite sure why, but the use of the term "paradigm" multiple times makes me feel slightly icky for some reason. Probably due to its misuse in business jargon.)
Probably because there's no reason to use such an awkward word in the first place. In this case, notice how you fall into using "style" instead? Also, the vast majority of time, when people use "paradigm", they could replace it with the much more common and simpler word "model" or another simpler term.
A NullPointerException is not a segfault or memory corruption, though in a very trivial sense you can consider it a segfault. The difference is night and day compared to the kind of memory corruption and wild pointers you get with C/C++.
As for memory leaks, yeah, that can still happen, but it isn't very common and they are easy to track down with VM tooling.
Now if you want to bash Java for not tackling serious issues, just look at its threading model, basically "threads and locks"; it's very easy to have threads stomping on data they shouldn't. That's corruption.
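The "threads stomping on data" failure mode above takes about ten lines to reproduce (a minimal, hypothetical sketch): two threads bump a shared counter. The unlocked increment is a read-modify-write that isn't atomic, so updates get lost; the synchronized one is exact.

```java
public class RacyCounter {
    static int unsafeCount;
    static int safeCount;
    static final Object lock = new Object();

    static void run() throws InterruptedException {
        unsafeCount = 0;
        safeCount = 0;
        Runnable work = () -> {
            for (int i = 0; i < 100_000; i++) {
                unsafeCount++;             // not atomic: two threads can interleave
                synchronized (lock) {
                    safeCount++;           // the lock serializes this update
                }
            }
        };
        Thread a = new Thread(work);
        Thread b = new Thread(work);
        a.start(); b.start();
        a.join(); b.join();
    }

    public static void main(String[] args) throws InterruptedException {
        run();
        System.out.println("locked:   " + safeCount);   // always 200000
        System.out.println("unlocked: " + unsafeCount); // typically less: lost updates
    }
}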
Java also didn't do a good job with resource leaks, meaning things besides memory, like connection handles.
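To be fair, Java 7's try-with-resources addressed exactly this: the GC reclaims memory, but it makes no timely promises about file or connection handles. A hypothetical sketch (using a StringReader as a stand-in for a real handle, so it runs without a filesystem):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;

public class ResourceLeak {
    // Leaky: nothing closes the reader, and if readLine() throws,
    // close() certainly never runs. With a real file or connection,
    // that handle just dangles until the process dies.
    static String leakyFirstLine(String text) throws IOException {
        BufferedReader reader = new BufferedReader(new StringReader(text));
        return reader.readLine(); // handle leaked on every path
    }

    // try-with-resources closes the handle on every path, including exceptions.
    static String safeFirstLine(String text) throws IOException {
        try (BufferedReader reader = new BufferedReader(new StringReader(text))) {
            return reader.readLine();
        } // close() runs here automatically
    }

    public static void main(String[] args) throws IOException {
        System.out.println(safeFirstLine("hello\nworld")); // prints hello
    }
}
```

Before Java 7 you needed the verbose try/finally dance for every handle, which is presumably the original complaint.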