Comment Re:Bees have a guide (Score 5, Informative) 394

What amazes me though is how they look at another bee and visualize it traveling to a set patch of flowers, by looking at its dance.

Are we discussing bumble bees or honey bees? The summary says bumble bees.

http://www.earthlife.net/insects/socbees.html states that bumble bees "...have not evolved any means of communicating information reguarding utilisable resources."

Comment Re:Kinda silly. (Score 1) 309

You would be shocked at how much software will fail to compile with -Werror.

I've been developing software professionally for almost 30 years -- I'm not shocked at how much software fails to compile with -Werror, or even without -Werror sometimes.

And for extra thrills I occasionally compile the stuff I work on with Intel's compiler, just to see what it finds. It's been a while since I last checked; I should see where C++ support in Clang stands these days.

Comment Re:Kinda silly. (Score 1) 309

There is nothing anywhere that says __TIME_T_TYPE couldn't have been 'long long int' on 32-bit iron.

I don't know why you want to debate the inheritance of ISO C via POSIX; that has nothing to do with the original choice of a 32-bit type for time_t in the 32-bit Linux kernels. In fact, it's quite the opposite: the spec deliberately does not define a size, leaving kernel and libc implementers free to make it whatever width they want.

(This isn't theoretical -- I once spun a one-off FreeBSD that used 64-bit time_t on 32-bit iron. I could do it precisely because I didn't have thousands of existing binary apps to break. And I could compile correctly written third party apps without change.)

But you can't change it in Linux now, not without breaking the ABI.

Comment Re:Kinda silly. (Score 1) 309

Uh, Linux inherits time_t from POSIX.

Got a citation? No? I didn't think so. Actually POSIX inherits from ISO C [http://en.wikipedia.org/wiki/Unix_time] and---

ISO C (ISO/IEC 9899:TC2, Committee Draft dated May 6, 2005, [because that's the copy I happen to have as a PDF]) in section 7.23.1 Components of time, paragraph 3:

    The types declared are size_t[,] clock_t[,] and time_t which are arithmetic types capable of representing times; ...

Nothing there about them having any particular bit-size, regardless of the native bit size of the underlying hardware.

Comment Re:Kinda silly. (Score 1) 309

ABI nothing. That new OS needed to have software ported to it, and a lot of Unix-like software expects time_t and int to be interchangeable, so changing it would involve fixing a lot of software.

If they expect time_t and int to be interchangeable -- even on 64-bit iron -- then there's still some fixing that's needed.

Comment Re:Kinda silly. (Score 2, Informative) 309

>>>Only those with no imagination---

Were you even alive then - 1976?

Yes, actually I was alive then, and for quite a few years before that.

I was. Remember that was a time when being able to buy a video & watch it at home was an alien concept (pre-VCR).

Not true. I was shooting video on 1" cartridges in my HS film classes in 1976, and believe it or not, there was a movie sale and rental industry then. It was small, by mail order, and expensive, but it did exist.

If you had said to someone, "Someday you'll be able to sit on a bus and watch a video from 10,000 miles away," they'd probably lock you in a loony bin. Or just say, "You're a nutty nerd - let's give you a wedgie."

I think those reactions had more to do with the goofy grin, flood pants, and the bad haircut you had than anything else. :-P

Computers in 1976 were the size of small rooms,

I think you're a little confused about the whats and whens.

I lusted over SWTP 6809s and various Z-80 systems written up in Popular Electronics throughout the '70s -- too expensive for my paper-route level of income. Apple 1s were around by '76, and the first Apple ][s shipped in 1977. Circa 1976 HP donated an old mini to the HS I went to -- it was the size of a four-drawer filing cabinet. Apart from that, most of those machines were smaller than a Selectric typewriter.

Yeah, the Burroughs mainframe at my dad's office years earlier filled up the whole room, but actually, if you knew what you were looking at, you knew most of it was tape drives, line printers, and other stuff.

and they were just beginning to be shrunk to PC size, but they were hard-to-use (no keyboards or screens; they used esoteric switches).

Esoteric? Like the switch on the wall that you turn the light on with? Actually you could get a SWTP terminal with a full QWERTY keyboard and a 40×25 CRT to go with your 6809. Apples -- 1 and ][ -- had real keyboards.

Nobody at the time thought common people (read: uneducated boobs) would have computers with self-assigned addresses. Nobody thought there'd be more than one computer per home, much less 2-3 per person. Most envisioned computers as being like Star Trek - a single unit running the whole house. The number of homes was only 900 million, so having ~4000 million addresses was plenty.

The 1970 Census put the US population at 200M; by 1980 it was 226M. Figure a typical household of, say, four people, and that's on the order of 50-60 million homes -- a lot fewer than 900 million. But really, what does that have to do with anything?

Again, there were people -- with imagination -- who were anticipating the computer revolution. Not surprisingly, they were right.
 

Comment Re:Kinda silly. (Score 3, Funny) 309

It was pre-home computer revolution and nobody thought computers would shrink to the size of everybody's pockets (cellphones). Nobody thought we'd be using machines with a billion bits (or more) of memory. Back then ~4000 was considered a lot (it was the hardcoded limit for the Atari console). Everything was smaller in scale, and Mr. Cerf is not to blame for not predicting the invention of the Web Browser (killer app) and how it would reach into every facet of our lives.

Only those with no imagination---

I can say with a great deal of confidence that plenty of us knew what was coming.

Now who do we blame for 32-bit time_t on 32-bit iron? There's a relatively new OS that lots of people use today that didn't have any ABI concerns when it was in its infancy, yet its creator didn't have the vision to see beyond doing pretty much what everyone else had done before him. (And I won't name him because then I'll just get modded a troll. But I bet you can guess who it is.)

Comment Re:Plenty of heads up. (Score 1) 451

At the risk of repeating myself... er, uh, wait. I'm about to repeat myself:

    Apple might have something clever in there...

Is GCD world-changing? Would an Oracle JVM that didn't use GCD still work just fine? Got any hard numbers on just how much faster GCD makes their JVM? Probably not, because it's doubtful that Apple would release the non-GCD version into the wild. (But we can guess that they benchmark both approaches as part of their development process.)

And there's nothing to say that Oracle couldn't build their hypothetical JVM for OS X using GCD too.
