Comment: Re:Astroturfing for Hillary Clinton (Score 1) 1134

by Asmodae (#47830049) Attached to: Combating Recent, Ugly Incidents of Misogyny In Gamer Culture

You're clearly arguing with someone not me.

The post I was responding to said:

the media would have you believe there are millions of people out there raping women and whatnot...

I said:

I think it takes millions of rapists (mostly men natch) to reach that number.

You said:

That's still seven or eight million men in absolute terms, of course

Everything else you said was arguing with statements nobody made: the definition of a straw-man argument. There really are millions of people (seven or eight million, to use your number) committing these crimes. This is staggering to me. All I was trying to show was that the decimal place in the OP's thought may have been misplaced. Nobody said tens of millions, or hundreds of millions, or anything resembling ALL. 'Millions' is absolutely the correct term to use, which was my one point. As for the rest of your post, I'm not sure who you're talking to.

Comment: Re:One bad apple spoils the barrel (Score 1) 1134

by Asmodae (#47829649) Attached to: Combating Recent, Ugly Incidents of Misogyny In Gamer Culture

I think you're conflating two points. From what I've seen, misogyny in games is described as cases where a female character is portrayed with extremely negative stereotypes.

I think this is a vastly different issue from commenting on the lack of representation in games. Those comments are saying that 50% of the population is female and 50% of gamers are female, but only a few percent of characters in games are female. The gender of any specific character is irrelevant (and not typically discussed in that context), but in the aggregate something appears to be out of place. Commenting on that dramatic difference is not calling developers misogynistic. Wanting that difference reduced to something closer to the actual population of people playing the games is also not calling developers misogynistic.

Issues of gender representation and misogynistic representation are both gender-related and often discussed together, but you can't lump them together because they have very different consequences. You could have female leads in 100% of games and still represent women horribly, and you could have no games with female leads and no outright misogynistic representations. However, I think the point people are trying to make about gender representation is this: "An overly imbalanced representation of gender in the aggregate (toward either gender) has consequences and implications which are at best neutral and at worst quite negative. Given this, can we make things better?"

Comment: Re:Astroturfing for Hillary Clinton (Score 1) 1134

by Asmodae (#47828183) Attached to: Combating Recent, Ugly Incidents of Misogyny In Gamer Culture

Unless it happens to be a PERSON whose death threat included their home address. Regardless of gender, THAT's the line that makes a threat credible.

Tangentially, the behavior you describe is the reason I don't play those games. It's also a behavior that doesn't need to be there. Why defend it as if it's necessary for some reason?

Comment: Re:Astroturfing for Hillary Clinton (Score 2) 1134

by Asmodae (#47827309) Attached to: Combating Recent, Ugly Incidents of Misogyny In Gamer Culture

This is not limited to gaming. Here's a breakdown of the statistics as they are known right now. According to Wikipedia's rape statistics page, 1 in 6 women in the US have been raped. Out of roughly 150 million women in the US, that works out to an estimated 25 million women who have been raped. I think it takes millions of rapists (mostly men, natch) to reach that number. So YES, millions of people (mostly men) ARE in fact out there raping people. No media bias needed, just knowing some real numbers.

Comment: Re:Dominion & Munchkin (Score 1) 382

by Asmodae (#47779363) Attached to: Ask Slashdot: What Are the Best Games To Have In Your Collection?

Want a fun game? Try one where a 5 year old might beat you with a random turn of a card and absolutely no strategy, instead of one in which you can feel good about yourself by constantly beating a 5 year old.

Is that even really a game, by definition?

That's like two people rolling a die: higher roll wins. There's nothing to play, no input or decisions on the part of the player, and precious little interaction between players. I don't think that would be very fun at all.

Comment: Re:Verilog (Score 1) 365

by Asmodae (#45912409) Attached to: Ask Slashdot: How Many (Electronics) Gates Is That Software Algorithm?
To be fair, the definition of "well" I intend isn't an arbitrary X/Y value. There are already very well-defined numbers for the hardware that currently runs the algorithm. Transferring "well" to custom hardware would mean something like: costing enough less than the original general-purpose CPU to justify the design effort involved, without costing MORE to manufacture. All engineering decisions are trade-offs, and if the trade-off isn't worth the effort and resource cost, you don't do it. For a transfer effort to go "well" means that at the end of the day you come out ahead somewhere.

If you have to spend 3 million dollars on custom hardware development just to reach performance parity with a COTS general-purpose CPU... you'd be hard-pressed to call that "well" by any measure. This is what the setup of the original Ask Slashdot question implies: it's an engineering question about feasibility and cost.

Comment: Re:Verilog (Score 1) 365

by Asmodae (#45910513) Attached to: Ask Slashdot: How Many (Electronics) Gates Is That Software Algorithm?
Not really. The biggest conversion issues I deal with (when converting algorithms to hardware) are related to how software treats RAM versus how hardware treats RAM. They are fundamentally different modes of operation. In software, RAM is cheap or free, so it is preferred over CPU cycles. In hardware, processing is (in general) cheaper and RAM is more expensive.

Buffering and holding a megabyte of data between each stage of processing is natural and very easy for software. But in hardware this is a very inefficient way to do things. Converting from one method to the other can be quite difficult depending on the algorithm.
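As a rough sketch of what the hardware-friendly alternative looks like (a hypothetical module of my own, not anything from the thread): instead of filling a megabyte buffer between stages, each stage streams one sample per clock through a register.

```verilog
// Hypothetical streaming stage: one sample in, one sample out per clock.
// Data flows through a single register instead of sitting in a large
// RAM buffer between processing stages.
module scale_stage #(parameter W = 16) (
    input              clk,
    input  [W-1:0]     din,
    input              din_valid,
    output reg [W-1:0] dout,
    output reg         dout_valid
);
    always @(posedge clk) begin
        dout       <= din << 1;   // example operation: multiply by 2
        dout_valid <= din_valid;
    end
endmodule
```

The software equivalent would compute a whole array, hand it to the next function, and repeat; the streaming version needs only one register's worth of storage per stage, which is why restructuring the algorithm is often the hard part of the conversion.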

Comment: Re: Verilog (Score 1) 365

by Asmodae (#45903655) Attached to: Ask Slashdot: How Many (Electronics) Gates Is That Software Algorithm?
Nice point: 'for' is used for iteration in software, while 'for ... generate' in hardware is used to generate new instantiations. The similarity in words and syntax is a dangerous trap. The closest software analogy is putting malloc() or new in a loop. It's a great construct for when you need many similar pieces of hardware, and completely wrong for iteration.
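To make the distinction concrete (an illustrative module, names are mine): a software 'for' reuses one adder over many cycles, while a 'for ... generate' elaborates N separate adders in silicon at synthesis time.

```verilog
// Illustrative only: the generate-for elaborates N adder lanes at
// synthesis time -- hardware replication, not runtime iteration.
module vec_add #(parameter N = 4, parameter W = 8) (
    input  [N*W-1:0] a,
    input  [N*W-1:0] b,
    output [N*W-1:0] s
);
    genvar i;
    generate
        for (i = 0; i < N; i = i + 1) begin : lane
            // Each loop "iteration" is a distinct physical adder.
            assign s[i*W +: W] = a[i*W +: W] + b[i*W +: W];
        end
    endgenerate
endmodule
```

Reading that loop as "add the lanes one after another" is exactly the trap: all N additions exist simultaneously, just as N calls to malloc() in a loop produce N distinct objects rather than one object visited N times.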

Comment: Re:Verilog (Score 1) 365

by Asmodae (#45903589) Attached to: Ask Slashdot: How Many (Electronics) Gates Is That Software Algorithm?

The original wording was "Some C algorithms may never transfer well into a hardware implementation." At least in my mind, the transfer process is what might not go well, not how the final product may or may not run. Having some experience here, I understood the transfer to be where the work and expense would be. And those are ultimately the key factors on which you would base your decision about whether or not to go ahead with the conversion.

I don't think we disagree on content, just on what might have been meant by Andy's post. Especially considering what you said, and for the reasons you said it, making claims of impossibility would indeed be silly.

Comment: Re:Verilog (Score 1) 365

by Asmodae (#45903415) Attached to: Ask Slashdot: How Many (Electronics) Gates Is That Software Algorithm?

Unless the algorithm requires all those special instructions and monster RAM to run... at which point your custom hardware looks very much like the CPU and system it is intended to replace, and it's definitely not cheaper unless you're selling a whole lot of them. Reliable hardware is expensive to build even when it's a simple design iterating on previously known-good hardware. Starting from scratch on raw silicon takes millions of dollars just for your first chip lot, not to mention all the man-hours to get it there and the subsequent revisions. There are lots of algorithms that make no sense to port to custom hardware from a cost-versus-efficiency standpoint. That's the whole reason the general-purpose CPU exists in the first place.

I guess I'm disagreeing with your definition of better. If it's faster but costs too much for anyone to actually buy, it isn't better.

