
Comment Re:Ooh Oopsie (Score 1) 518

The xkcd author should stop putting his words in the mouth of a dead scientist in an attempt to give them more weight.

Without rigour, you can easily design experiments that "show" that homeopathy, water divining, ESP and perpetual motion machines are valid - as is most likely the case with the experiment in this article. An experiment without rigour is no more scientific than an anecdote.

Comment Re:Is FORTRAN still winning? Was Re:Poor Alan Kay (Score 1) 200

Maybe according to the strict F95 standard it's not allowed, but I've done it. I think it was officially introduced in F2003 (or some TR that I forget) but compilers supported it even before then. Similarly for allocatable dummy arguments to subroutines.

Not to say there aren't some weird quirks with Fortran arrays, like how, if you pass an allocatable array to a subroutine whose dummy argument is not allocatable, the array *must* already be allocated, even if the subroutine isn't going to touch it.
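A minimal sketch of that quirk, if it helps (the program and subroutine names are made up):

```fortran
program alloc_quirk
  implicit none
  real, allocatable :: a(:)

  ! The dummy argument below is a plain assumed-shape array, not
  ! allocatable, so the actual argument must already be allocated
  ! at the call site - even though the subroutine never touches it.
  ! Calling do_nothing(a) before this allocate would be invalid.
  allocate (a(10))
  call do_nothing(a)

contains

  subroutine do_nothing(x)
    real, intent(in) :: x(:)   ! not allocatable
    ! deliberately never reads or writes x
  end subroutine do_nothing

end program alloc_quirk
```

If the dummy argument were declared allocatable too (an F2003 feature), passing an unallocated array would be fine, and the subroutine could even allocate it itself.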

Comment Re:Is FORTRAN still winning? Was Re:Poor Alan Kay (Score 1) 200

It's a hardware thing -- the memory bus and memory read/write speeds are still limiting factors, particularly as CPU cores get faster and more efficient.

Oh yes, I've seen plenty of code that's limited by memory bandwidth. But I don't think that's what's going on here - simply deallocating and reallocating shouldn't actually touch all of the memory in question, should it?

Fortunately for that kind of code, avoiding such reallocations isn't difficult.

Comment Re:Is FORTRAN still winning? Was Re:Poor Alan Kay (Score 2) 200

Does it still win with dynamic memory allocation? How granular is the dynamic memory allocation? Complete like C?

Fortran's dynamic memory allocation is much easier to work with than C's. You simply declare a variable allocatable, then allocate as needed with the appropriate size. It automatically gets deallocated when it falls out of scope, so no memory leaks (at least since F95).


real, allocatable :: myarray(:)

allocate (myarray(1000), stat=ierr)
if (ierr /= 0) stop 'allocation of myarray failed'

I've written a bit of finite difference code in Fortran. Repeatedly allocating and deallocating can give a huge performance hit, so I tend to do all my allocations before the main loop. Not entirely sure why the penalty is so big, but it seems to be - these are allocations of hundreds of MB or even a few GB, so the cost of operations done on the arrays should dwarf the cost of the allocation. Unless there's some underlying reason why touching newly allocated memory is so slow, but I don't know enough about how virtual memory behaves to say.
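The allocate-once pattern I mean looks roughly like this (the array names, sizes and the placeholder update are made up; the real stencil would go where the comment says):

```fortran
program preallocate
  implicit none
  real, allocatable :: u(:), unew(:)
  integer :: n, step, nsteps, ierr

  n = 100000000        ! ~400 MB per array at 4 bytes per element
  nsteps = 1000

  ! Allocate once, before the main time-stepping loop, instead of
  ! deallocating and reallocating inside it.
  allocate (u(n), unew(n), stat=ierr)
  if (ierr /= 0) stop 'allocation failed'

  u = 0.0
  do step = 1, nsteps
     unew = 0.5 * u    ! stand-in for the real finite difference update
     u = unew          ! copy (or swap) rather than reallocate
  end do

  deallocate (u, unew)
end program preallocate
```

With arrays this size, each allocation inside the loop would hand fresh pages back and forth with the OS, so hoisting the allocate out of the loop is usually an easy win.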

Comment Re:The future of the internet, really (Score 1) 159

If end user hardware doesn't support it or isn't configured properly, then they will be completely unaware of and unaffected by its existence.

End user hardware generally does support it, though - any vaguely modern computer, smartphone or tablet should automatically pick up and use an IPv6 address if one is available. So if the ISPs start supplying v6, it's essential that it works reliably, because the users' devices will try to use it. Broken v6 does affect connectivity, even if v4 still works fine. And even if the fault is with the user's own equipment, you can bet they'll be complaining to the ISP.

Second post because I realised my first one doesn't directly address your point above.

Comment Re:The future of the internet, really (Score 1) 159

That should be true in theory, but IPv6 hardware & software is nowhere near as well tested as the IPv4 equivalent, both in home equipment and in the ISPs' own networks. How often does this kind of thing work perfectly first time? And the staff don't have the same experience with it to fix problems when they do occur. Anything new is a risk, and since hardly any home customers are demanding IPv6, it might seem like a risk not worth taking until made absolutely necessary by v4 exhaustion.

That's not what *I* want, but from an ISP's perspective I can see how it would make sense to prepare & test their network for v6 steadily, slowly and thoroughly but not actually deploy it while they still have enough v4 addresses.

Comment Re:The future of the internet, really (Score 1) 159

I assume that's for the US, which seems ahead of the game despite having plenty of v4 addresses.

Here in the UK, none of the major ISPs have deployed v6 at all, and I don't think any of the mobile companies have either. I suppose they're just risk averse, as dealing with support calls for unexpected problems isn't cheap and their margins aren't huge.

Comment Re:Quebec Language Police (Score 1) 578

So you could argue: they are neither English nor French, as both languages adopted them from the same source. But that is incorrect insofar as English indeed adopted the words via the French invaders and not via the Latin/Roman invaders.

Those words in the original post were adopted into English long after Anglo-Norman was dead, so invasion can't be the answer here. They aren't native French words either - both English and French, for some reason, seem to like coining new words from the classical languages. I suppose they thought telephone and television sounded grander than farspeaker and farseer (though we do have loudspeaker, oddly). This tendency seems to have greatly diminished recently, though - computing terms are generally made from words already in English rather than from new borrowings.

I don't think the original poster claimed that French had borrowed them from English, just that they are not native French.

Comment Re:my rant... (Score 1) 323

Yes that's right, but the Norman words are so well integrated now that most English speakers wouldn't recognise them as foreign. The German ones still tend to look German. I doubt too many people would see this post and know that recognise, foreign, tend, recent, doubt, people and post are loanwords. "Integrate" still has a foreign feel about it though.

FORTRAN is not a flower but a weed -- it is hardy, occasionally blooms, and grows in every computer. -- A.J. Perlis