
Comment Re:why? (Score 1) 182

Add to that, about 10-20% of the population get motion sick using the kind of VR in Oculus Rift (myself included - I can use it for 2-5 minutes, depending on the mode). It's ludicrous to imagine building a school that would exclude 20% of the potential pupils on some random criterion. You might as well make schools that didn't let in gingers...

Comment Re:intel atom systems keep 32 bit systems around (Score 1) 129

Apple already ships 64-bit ARM chips and a lot of other vendors are racing to do so. The Android manufacturers that I've spoken to want 64-bit for the same reason that they want 8 cores: it's a marketing checkbox, and they don't want to be shipping a 32-bit handset while their competitor is marketing 64-bit as a must-have feature. ART is in the top 10 worst-written pieces of code I've had to deal with and is full of casts from pointers to int32_t (not even a typedef, let alone intptr_t), but it should get a 64-bit port soon.

Comment Re:The ones I witnessed... (Score 1) 129

64-bit is here for a while. A lot of modern '64-bit' CPUs only support 40-bit physical addresses, so are limited to 'only' 1TB of RAM. Most support 48-bit virtual addresses (the top bits are sign-extended, so all ones or all zeros depending on whether you've got a kernel or userspace address), limiting you to 'only' 128TB of userspace virtual address space. If RAM sizes continue to double once every year, then it takes another year to use each bit. We currently have some machines with 256GB of RAM, so are using 38 bits; 64 bits will last another 26 years. RAM increases have slowed a bit recently, though. 10 years ago, you always wanted as much RAM as possible because you were probably swapping whatever you were doing. Now, most computers are happy with 2GB for programs and the rest for buffer cache. As SSDs get faster, there's less need for caching, but there might be more need for address space as people want to be able to memory map all the files that they access...

Comment Re:It's not Google's fault. It's Mozilla's. (Score 1) 129

The real problem for Firefox is not the interface changes that people like you whine about, it's mobile. Now 30% of traffic is mobile and Firefox doesn't have an app for any Apple mobile devices and is effectively excluded from Android by Google's Microsoft-like illegal anti-competitive licensing deals with manufacturers (you can get the app, but it's not preloaded and only a few geeks ever would).

Huh? It's in the Google Play Market and is no harder to install than any other app. Once it's installed, the first time you click on a link from another app you're asked to choose the app that will handle links. I fall into the geek category (and so installed it from F-Droid, not Google Play), but found it trivial to switch to Firefox on the mobile. I mostly did because Chrome has spectacularly bad cookie management and I'd been trying to find a browser that did it better. Early Firefox ports were as bad, but now it's quite nice and with the Self Destructing Cookies add-on does exactly what I want.

The mobile is actually the only place I use Firefox...

Comment Re: No, no. Let's not go there. Please. (Score 1) 937

I recall reading some years ago that there are two kinds of atheists:
  • Those that disbelieve all religions.
  • Those that disbelieve all except one religion.

For some reason, people in the second category describe themselves as 'religious'. And yet you'll be hard-pressed to find, for example, a Christian who applies the same standard of evidence to the non-existence of the Norse, Egyptian, Greek or Hindu gods that he demands an atheist from the first category provide for the non-existence of the Abrahamic god.

Comment Re:Great idea! Let's alienate Science even more! (Score 1) 937

Well, yes, they do. Just look at the popularity of various autocratic authoritarian political parties, from the US Republicans down to the Golden Dawn in Greece.

Certainly, in well-functioning democracies they do not form a large enough bloc to actually overthrow the rule of law on their own, but the fact that they can easily poll up to 40% of the populace demonstrates that there is a streak of authoritarian followers in every population.

Comment Re:Doesn't surprise me (Score 1) 348

As I already pointed out, Bruno was not burned for his scientific views, but his religious ones. And as it turns out, so was d'Ascoli.

The church, both the Catholic and the various Protestant ones, has done enough damage without needing to invent more. So far I haven't seen proof of scientists being burned for their scientific views. You'll have to do better than this.

Comment Re:Doesn't surprise me (Score 1) 348

You realize of course that there was a time about 500 years back, when scientists were actually burned at the stake for having the wrong theory?

Got any examples? The closest thing I can think of is Galileo, and he got in hot water more for playing politics the wrong way, not for his scientific insights per se. And all he got was house arrest in a luxurious villa.

And no, don't mention Giordano Bruno. He was not burned for adhering to Copernicanism, but for his religious views.

Comment Re:Shortest version (Score 1) 326

Talking about open-source businesses is missing the point entirely. Most businesses that are successful as a result of open source (or Free Software, for the RMS-style folks), or that contribute significantly to open source, are not 'open-source businesses' any more than companies that use Windows and Office are 'closed-source businesses'. The difference is that one category of business realises that writing software is expensive and copying software is trivial, so spends its software budget paying people to write software (typically customising and improving existing projects), whereas the other pays someone for copies of software and hopes that this will give them an incentive to produce software that's more like what they want.

Comment Re: Why do you participate? (Score 1) 226

I mostly agree, but I must say that the writing is uneven enough that the show does not get a complete pass from me. There are too many episodes that just go for the lazy stereotype joke for that.

On the other hand, the episodes that take the characters seriously often have some fine comedy moments.

So, flawed? Yes. Nerd blackface? Not quite, even though it treads dangerously close to that line too often.

Comment Re:Amiga (Score 1) 169

You're comparing apples and oranges as far as the technical details. I'm saying Win 3.x let me continue when it saw problems, and NT could also do that.

Not really. The kind of situations where Windows 3.x let you try to continue, Windows NT just handles transparently. In Windows 3.x, with cooperative multitasking, a single application can refuse to relinquish the CPU. If this happens, you have three choices (outlined by the dialog box):

  • Just wait and see if it eventually recovers.
  • Kill that application and hope that it isn't holding any handles that other processes need to be able to do useful work.
  • Restart the entire computer.

In a system with protected memory and preemptive multitasking, an application that refuses to relinquish the CPU will just have its priority downgraded and the only thing that you'll notice is the CPU getting warm. Eventually, you may choose to kill the program, but it never affects system stability.

I'd like to have the *option* to continue to save my work even if there was a chance of data corruption. For example, take the common NT blue screen IRQL_NOT_LESS_OR_EQUAL. The fact that my buggy network driver tried to access paged memory in the wrong sequence is miles away from catastrophic. And it certainly doesn't take priority over something I've been working on for hours. IRQ 0 is me, motherfuckers!

It means that there's a high probability that something has damaged some kernel data structures. If you continue, there's a good chance that this corruption will spread to the buffer cache and you'll end up writing invalid data to disk. If you kill the system, the corruption is limited to the RAM.
