...you can just make a big opt-out of Flash. Apple has restricted your choice, whether you like this interpretation or not. It's like alcohol: I normally describe myself as a non-drinker, but I'm not a fanatic, and I will drink a bit on special occasions (the champagne at my own wedding, for example; god, it would suck to toast with Diet Coke, wouldn't it?). In Apple land, you get Prohibition. I suppose some Americans (the absolutely radical non-drinker types) were happy with that in the 1920s.
What is this, a very condescending programmer who specializes in faster factorial functions?
Still, like any stable dictatorship, China is able to pursue at least some broad, strategic long-term plans. That's the advantage of rulers who don't have to worry about leaving office in a few years, unlike in democracies, where most presidents/governors/mayors won't bother to lay any eggs that will only hatch after their term is finished (especially when reelection is not allowed).
As a former Z80 black-belt ninja, I'd say it was easier... except that the Z80 didn't have mul/div instructions. But who needs those? You could write a routine for them (granted: dead slow), but I've written pretty significant programs without ever needing a single "full" mul/div, just hacking around it with a couple of shifts, adds, or bit ops.
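The workaround is the classic shift-and-add loop. Here's a minimal sketch in C (illustrative only, not actual Z80 assembly; `mul8` is just a name I picked):

```c
#include <stdint.h>
#include <assert.h>

/* Shift-and-add multiplication: the standard workaround on CPUs like
 * the Z80 that have no MUL instruction. For each set bit in b, add a
 * copy of a shifted left by that bit's position. */
uint16_t mul8(uint8_t a, uint8_t b) {
    uint16_t acc = 0;
    uint16_t shifted = a;
    while (b) {
        if (b & 1)
            acc += shifted;   /* this bit of b contributes a << position */
        shifted <<= 1;        /* the next bit is worth twice as much */
        b >>= 1;
    }
    return acc;
}
```

And multiplying or dividing by a constant power of two degenerates into a single shift, which is a big part of why so much 8-bit code got away without a real multiply at all.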
...is this: suppose that we make contact with some alien species, and it just happens that they are overwhelmingly superior to us - a civilization that's thousands of years more advanced and kicks our ass in technology, morals, arts, everything.
Now suppose that this civilization is completely atheist. They are angel-like beings, but with no God; they consider religion a trait of prehistoric (from their point of view) civilizations. Just like, say, our modern opinion of cannibalism. Their scientists and philosophers have long since demolished all pro-religion arguments, settling the debate in terms so clear that any educated human would understand them and find it impossible not to agree.
*If* this happens (notice I'm just supposing), religion is in major trouble, and the only option for resistance is fanaticism.
It was once believed that the sun revolved around the earth. This is still a good approximation, for most purposes here on the ground. It is only when we begin to consider the motion of other planets that it becomes important which is which.
Actually, saying that the Sun orbits Earth is not really wrong, even today. The universe doesn't have a fixed reference frame, so no body has an absolute position; all "positions" are relative to other bodies (including positions in time). We still tend to put the Sun at the "center", at coordinate (0,0,0) of the solar system, because that's very useful for local purposes, but it's just as arbitrary as having Earth at the center centuries ago. So we could very well adopt the convention that Earth is permanently fixed at the center of the entire Universe, adjust all our calculations accordingly, and everything would be just fine. Some equations would become more complex (a microscopic perturbation in Earth's orbit would translate, due to the angular distance, into galaxies billions of light-years away being "shaken" at faster-than-light speed, still without violating any laws of physics), but that would be merely complex and ugly, not wrong.
...formed one billion years ago, but originally much more distant from its star. Its orbit was not stable, though, and it has been approaching the star quickly (in astronomical terms); we're just lucky to have found it in the final stage of its death spiral. If this is the case, it may even be possible to watch the final spectacle on a timeframe reasonable at human scale (a few thousand years, perhaps centuries, or even less).
Wild speculation, of course... but just to be safe, I'm immediately canceling all my plans for space vacations near the WASP-18 system. I never liked wasps anyway.
...as the latest JVMs have a "compressed pointers" option that allows the 64-bit VM to be as memory-efficient as the 32-bit one. This option limits the max heap size to 32 GB, but that should be plenty for even very large applications, for many years to come.
BTW, this optimization is a great advantage of all "managed" languages/VMs; it's just impossible for languages like C/C++. Of course, you've got to actually implement it in a JIT compiler like Java's, but I think even an interpreter could do it. Notice that there is no real tradeoff in code speed: although the code needs a couple of extra instructions to compress/uncompress pointers at every use of a heap object, this overhead is greatly reduced or hidden by JIT optimizations. (And for an interpreter, it's certainly just noise in the performance profile.) Not to mention that the gains from reduced paging, cache misses, and TLB misses will more than compensate for any remaining overhead.
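A minimal C sketch of the idea (my own names and layout, not HotSpot's actual implementation): if all objects are aligned to 8 bytes, the low 3 bits of any heap offset are always zero, so a 32-bit "compressed" reference can store `(address - heap_base) >> 3` and span 2^32 × 8 bytes = 32 GB, which is exactly where the heap limit above comes from.

```c
#include <stdint.h>
#include <stddef.h>
#include <assert.h>

/* Hypothetical sketch of compressed references. Objects are assumed to
 * be 8-byte aligned, so offsets can be shifted right by 3 bits without
 * losing information, letting 32 bits address a 32 GB heap. */
static uint8_t *heap_base;  /* set once, when the heap is reserved */

static uint32_t compress_ref(const void *p) {
    return (uint32_t)(((const uint8_t *)p - heap_base) >> 3);
}

static void *decompress_ref(uint32_t ref) {
    return heap_base + ((uintptr_t)ref << 3);
}
```

On x86-64 the decompress step can even fold into the addressing mode (`base + ref*8`), which is part of why the runtime overhead is so small.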
L4 is a microkernel, so it DOES mean that buggy device drivers cannot crash, corrupt, DoS, or own the system, because drivers all run in userland.
Of course, if bugs prevent some essential driver from working at all, the system may become unusable anyway. But the kernel is still in control, and it can resort to techniques like restarting the driver or, if the driver keeps failing, replacing it with a much simpler "safe mode" driver, so you can still use the system, even with reduced functionality or performance.
Who in his right mind starts mixing a binary and a decimal system? 2^10 bytes?
Very likely, a hardware engineer, or a programmer who had to deal with low-level system realities like memory block sizes always being powers of two. You must manipulate memory (and other hardware things) in blocks whose sizes match the physical packaging, to avoid inefficiencies from alignment, memory hierarchies, etc.
And this remains reality to the current day. Please come back whining for the decimal system when you're using, say, a monitor with a 1500x1000 resolution, or a net connection of 10,000,000 bps.
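The power-of-two constraint isn't just tradition; it's what makes alignment math cheap. A small C illustration (an assumed helper, not from any particular codebase): rounding a size up to a block boundary is a single mask when, and only when, the block size is a power of two.

```c
#include <stddef.h>
#include <assert.h>

/* Round n up to the next multiple of `block`. The bitmask trick
 * requires `block` to be a power of two; for any other block size
 * you'd need a (much slower) divide and multiply. */
size_t round_up(size_t n, size_t block) {
    return (n + block - 1) & ~(block - 1);
}
```

This is the kind of operation that happens constantly in allocators, page tables, and caches, which is why nobody in hardware-adjacent code wants decimal block sizes.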
I, for one, will continue using KB = 1024 and so on; these politically correct labels like KiB are for idiots.
There are two major products that come out of Berkeley: LSD and UNIX. We don't believe this to be a coincidence. -- Jeremy S. Anderson