
Comment Re:Wake me when chimpanzees invent smelting (Score 1) 224

it must be modified by unnatural means

"Unnatural means" is extremely ill-defined, and in common understanding it's by unavailable to nonhumans *by definition*, not because of a shortcoming of the animals. You look up antonyms for "natural" and one of the first is "man-made".

Fire is a particularly interesting choice of discriminator, because it is a natural phenomenon that happens all the time. You of course mean a contained fire that was intentionally instigated by chimpanzees.

Regardless, I'm missing the point of this argument -- you said to wake you up when they invented smelting, and then talked about what your yardstick was, but I don't know what it's a yardstick for.

Comment Re:Didn't we already know this? (Score 1) 85

I almost wonder if it was ever knowledge... Consider that the most effective way of spreading religion is to have children and indoctrinate them into the same religion.

You can imagine 10 different sects popping up with different versions of the dietary rules. The ones that happened to align with health and reduced death would have an evolutionary advantage, and ultimately become dominant.

Comment Re:Illusion? Solo gaming was never fun! (Score 1) 292

No, I don't think he did, unless you're implying that there was an element of parody in the original, which I don't think existed.

He made the following statements:

Solo gaming was never fun

Counterevidence is available among the participants of this thread.

single player side scrollers were more fun when your friends were there to watch you play and wait for their turn to play.

No, the only reason I dealt with that was that I didn't own every game, so waiting for a turn was how I got to try different ones. Side-scrollers were most fun when I got to play the whole time instead of being a non-participant.

I will say, however, that Sonic the Hedgehog 2 was brilliant in having asymmetric co-op (one player was "Tails", who could help out throughout most of the game, but there were no consequences for Tails' death and there were only a couple of points in the game where Tails could actually cause a problem). This let me play with my little cousins, who were way too young to have a prayer of completing Sonic themselves, but could credibly experience "beating the game" with me. In this sense, multiplayer was an interesting crutch.

There aren't many games where you play solo and get enjoyment

Wrong. Although the article is saying there are fewer new ones with time...

you want human interaction for increased enjoyability

Not me.

Who the hell plays madden football solo?

Under no circumstances would I ever play madden football.

If you do play solo, do you trash talk yourself?

This statement is vacant. The point of games is not to serve as a vehicle for trash talk. I can trash-talk without a video game, and I don't constantly trash-talk anyway. I mostly do it when playing cards with family, because I've basically mastered every game they play to the point that it's algorithmic and therefore boring, and I need something else to do.

I very much associate trash-talk with boredom.

even at the arcade there was a second joystick for the second player (ie. Street Fighter, Mortal Kombat, etc.)

So? You can find books with pictures (e.g. comic books), and some people prefer them; does that mean that novels are no fun? The presence of a multiplayer game -- even of a particular game that is inherently better in multiplayer than solo -- does not imply that single player isn't fun.

Comment Re:The UK Cobol Climate Is Very Different (Score 2) 270

I have been to many professional workplaces. Almost none of them have an expectation of a suit -- a golf shirt and non-jean slacks are about as formal as it gets (for men), and you'll very likely get away with jeans.

At law firms you'd need a suit if you're going to court, but you don't need to wear one daily. Some finance/banking professionals, some government jobs. Some salespeople, depending on who they're selling to (with the proviso that they typically shouldn't wear anything their prospective customer would never wear). I guess comedians wear them.

It makes me wonder where you live. I've lived in different cities in Canada and more recently in the US. I know that older TV shows depict a lot more suits than modern TV does, so I can believe there are parts of the world that still adhere more often to the older standard.

Myself, I honestly think that an expectation of formal attire is wrong, whether or not it's common. I would say clothing requirements should be whatever is sufficient to make the other people you work with comfortable, within reason (and customers, if you deal with customers) -- so maybe we can exclude the guy coming to work in full bondage gear, an ass-crack-baring thong, or a shirt covered in racist statements. That was basically the dress code I had in school from the time I was 5, and it has always seemed reasonable to me.

Another way of putting it: if it's not strange that I don't wear a suit on the weekend or on vacation, then why is it strange that I don't want to wear one at work?

Comment Re: The UK Cobol Climate Is Very Different (Score 1) 270

Don't like it? Fine. You're not indispensable, there is a long queue of people desperate for a job. You need my money more than I need your "skills".

This is the attitude of a person who will only hire and only retain the very worst of employees who cannot find a job elsewhere. Congratulations on employing the bottom of the barrel.

Comment Re:Fallacy (Score 2) 937

Let's just link it: http://www.npr.org/blogs/13.7/...

Ultimately I think the article's author needs to define what he even means by science. Saying that you reject the idea that science is logical is like saying you reject the idea that scissors are logical: it implies he's using a synecdoche and expecting everybody to follow along. Maybe you can reject the idea that scissors are a logical choice of weapon to equip on Roman soldiers. Similarly, he probably rejects the logic of science... something... I'm not going to speculate here.

There must be a name for the rhetorical device used here; I'll call it strawman-baiting: he invites us to figure out what he means so that, if we make a good point, he can dodge and say that isn't what he meant, since he never actually said what he meant. He may not be using it consciously or maliciously; it's just a common thing people do when they like being right.

Comment Re:I am shocked, SHOCKED, to find gambling here... (Score 1) 462

I don't see a contradiction between what you're saying and what the parent is saying. It's entirely plausible both that the Germans knew of the Holocaust (even if they were in a deep state of denial later) and that the Nazis' rise to power happened, in part, via crazy propaganda tricks.

Comment Re:So what exactly is the market here. (Score 2) 730

The watch doesn't seem very feminine, but it makes me think about how women's clothing often has nonfunctional pockets, so phones are stashed in purses where they are considerably less convenient.

I also immediately think of situations where I'm phoneless, such as when I'm swimming, when I'm carrying stuff (easy to turn a wrist, hard to dig a phone out of a pocket), when I'm wearing something that doesn't have pockets even though I'm not a woman, or even while I'm using the phone for something else, like talking with somebody. There's also the fact that my watch battery lasts for years while my phone doesn't last nearly as long, though I doubt a smartwatch could keep up with a regular watch on that front.

We also have to recognize that wristwatches displaced pocketwatches, so it seems the wrist form factor was generally considered advantageous by much of the population, no matter how disdainful you are of virtually everyone over the decades.

Doesn't mean I necessarily think watches have to be smart. I'd put myself in the "unconvinced" category.

Comment Re:Automated test in is a minimum (Score 1) 152

No one is calling free/delete before exit() on all still living objects to show that he has no memory leak.

a) Yes, there are people who absolutely do that, particularly if their objects have nontrivial destructors (whether or not that's a good idea for a heap-allocated object is a different discussion).
b) You don't have to call free/delete/whatever before exit to show there is no memory leak. You just have to verify that there are no *unexpected* allocations still live. A leak can reasonably be defined as memory that is unexpectedly still allocated at exit time (a sketch of this follows below).
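
To make (b) concrete, here is a minimal sketch of the idea -- the names (tracked_malloc, expect_alive_at_exit, and so on) are made up for illustration rather than being any real allocator's API. The point is just to compare what is still live at exit against what you expected to still be live, without freeing anything:

    #include <cstddef>
    #include <cstdio>
    #include <cstdlib>
    #include <set>

    static std::set<void*> g_live;      // every outstanding allocation
    static std::set<void*> g_expected;  // intentional exit-time survivors (singletons, caches)

    void* tracked_malloc(std::size_t n) {
        void* p = std::malloc(n);
        if (p) g_live.insert(p);
        return p;
    }

    void tracked_free(void* p) {
        g_live.erase(p);
        std::free(p);
    }

    void expect_alive_at_exit(void* p) { g_expected.insert(p); }

    // Registered with atexit(); note that nothing is freed here -- we only report.
    void report_unexpected_allocations() {
        for (void* p : g_live)
            if (!g_expected.count(p))
                std::fprintf(stderr, "unexpected live allocation: %p\n", p);
    }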

Just an excercise 'I know where all my pointers are'?

Yes. Because if you don't know where your pointers are, that means you leaked them, possibly days ago.

If you have a unit test that 'finds a bug' but the acceptant tests did not, your acceptence test was: wrong, or did at least not cover the relevant line :)

An acceptance test that is wrong or had insufficient coverage is a bug, IMO.

A bug a week really doesn't shock me, even for a well-tested product -- depending on the scale of the product, of course.

Comment Re:Automated test in is a minimum (Score 2) 152

No it is not :) how should that work?

The memory allocator can keep track of all the memory that is allocated (maybe you do this with a special build, maybe your default allocator does this).

Provided your heap isn't actually corrupt (which tooling can also help detect, with things like canary words and, on special builds, extreme one-page-per-allocation policies, though corruption isn't totally detectable), your allocator can have a record of every allocation that does not have a corresponding free by the time of exit, whether that's because of reference cycling or just a missing call to delete or what-have-you. It can even tag each one with the allocation callstack.
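
Concretely, a debug build can do something along these lines. This is only an illustrative sketch of the technique, not what any particular tool does -- real tools such as Valgrind or LeakSanitizer capture actual call stacks instead of the hand-set tag used here:

    #include <cstddef>
    #include <cstdio>
    #include <cstdlib>
    #include <new>

    struct Record { void* ptr; std::size_t size; const char* tag; };

    // A fixed-size table keeps the tracker itself from calling operator new.
    static Record      g_records[1 << 16];
    static std::size_t g_count = 0;
    // A real tool would capture the call stack; a plain string tag keeps the sketch short.
    static const char* g_current_tag = "untagged";

    void* operator new(std::size_t n) {
        void* p = std::malloc(n);
        if (!p) throw std::bad_alloc();
        if (g_count < sizeof(g_records) / sizeof(g_records[0]))
            g_records[g_count++] = {p, n, g_current_tag};
        return p;
    }

    void operator delete(void* p) noexcept {
        // Swap-remove the matching record so only unmatched allocations remain.
        for (std::size_t i = 0; i < g_count; ++i)
            if (g_records[i].ptr == p) { g_records[i] = g_records[--g_count]; break; }
        std::free(p);
    }

    // A static object whose destructor runs at exit and prints whatever was never deleted.
    struct LeakReport {
        ~LeakReport() {
            for (std::size_t i = 0; i < g_count; ++i)
                std::fprintf(stderr, "leaked %zu bytes at %p (tag: %s)\n",
                             g_records[i].size, g_records[i].ptr, g_records[i].tag);
        }
    } g_leak_report;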

It's also true that this won't detect obscene memory usage that isn't actually leaked -- e.g. suppose Firefox leaked on every tab close but managed to clean everything up once the whole window closed. That's not strictly a leak, but it's just as bad in practice.

It's essentially like having a garbage collector that isn't there to collect garbage, but is there to identify garbage that has been left uncollected.

you programmed the whole application agnostic of memory allocation and then you want to travel all objects and set all references to null (and somehow trigger the GC) to have everything 'cleared'? (Sure, I can write a reflection based graph traversal that sets every reference it encounters to null. But what would be the point?)

No, I don't want it cleared; I want it detected, so we can notify the owner of the leaking component that it was not cleared at the proper time.

We aren't detecting leaked memory in order to free it. We're detecting leaked memory in order to fix the bug where it leaked in the first place, possibly hours or days ago, and has been consuming your RAM for no good reason since that time.
