To be fair, that doesn't counter his argument: amd64 has more registers than i386, and they do make a big difference. Repeat the tests with 32-bit pointers and 64-bit registers and then get back to us.
As of today, the method he mentions would probably give slightly better performance, assuming the processor's optimizations don't break when their expectations aren't met.
However, I think it is very short-sighted to miss the fact that about the only thing still increasing these days is memory, and that apps tend to grab all the address space they can get. By 2050 I can see machines with 1 TB of RAM, but I can't see apps keeping themselves under 0xFFFFFFFF.
Furthermore, thanks to ASLR, which is now available on most OSes, address-space fragmentation is a problem today even for programs well under the 4 GB mark. The future is 64:64. 32-bit architectures are already dead; they just haven't realized it yet.
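For context, the ceiling being argued about: with 32-bit pointers you can address at most 4 GiB, no matter how wide the registers are. A quick sketch (pointer width as reported by ctypes on whatever host runs it; nothing here is specific to the parent's benchmarks):

```python
import ctypes

# Width of a native pointer on this host: 8 bytes on a typical amd64
# build, 4 bytes on i386 or an x32-style ABI with 32-bit pointers.
ptr_bits = ctypes.sizeof(ctypes.c_void_p) * 8

# The hard ceiling of a 32-bit address space: 0xFFFFFFFF + 1 bytes = 4 GiB.
ceiling_gib = (0xFFFFFFFF + 1) / 2**30

print(f"native pointers: {ptr_bits} bits")
print(f"32-bit address-space ceiling: {ceiling_gib:.0f} GiB")
```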
If Neanderthals and humans could interbreed, I'd say we were both subspecies of our common ancestor. I can't see how they can be a subspecies of us when we didn't exist yet. So either rename Homo rhodesiensis to Homo sapiens, or rename H. s. sapiens and H. neanderthalensis to Homo rhodesiensis sapiens and Homo rhodesiensis neanderthalensis respectively.
In any case, make it a cheap computer. I don't think the iPad would work for a toddler.
I used the "adult" version of the ZX Spectrum 48K when I was ~18 months old. I even learned enough Sinclair BASIC to LOAD "".
I don't really know how sustainable Chinese characters are in Mainland China, especially after Comrade Mao simplified their etymologies out, believing the Western bullshit that they were too hard. In any case, they have been in use for a few thousand years, if that means anything.
In Japanese at least, literacy is steadily increasing. Twenty years ago, with 8-bit computers, kanji appeared to be on their way out. However, as soon as IMEs and modern OSes arrived, people started using more kanji, even ones they could never have written by hand. And that means more kanji that regular people can read. Recently, the number of kanji considered necessary for basic literacy was increased to account for that.
Handwriting is suffering (the only real use cases left in modern Japanese society are resumes [=], paperwork [vv], and kanji quizzes/exams [^]), but kanji themselves are here to stay.
Nah, the US military uses "military grade encryption" and "one-time pads", while the white-collar criminals use real algorithms created by some of the best mathematicians and triple-checked by all of the best mathematicians.
Unlike GCC as a compiler, GDB really sucks as a debugger. It is a source player, nice for high-level "debugging", but as soon as you try to follow code that cannot be statically extracted from the ELF and DWARF data, you are left with a blinking light and a feature set comparable to the Apple I monitor.
I doubt LLDB will fix the latter, but it'll probably kick GDB's ass as a source player, because Clang already produces far more, and better, errors and warnings than GCC.
Still, the supported platforms are Mac OS X on x86 and amd64, so it isn't really here yet. No need to worry.
Better yet, just buy a Sharp NetWalker, now also available in tablet form, and get Linux on ARM with better specs.
You can then spend your time trying to run Debian or a recent Ubuntu version from the SD card.
Your real-world usage is what exactly? Playing badly designed games?
I want to play badly designed games *while* I am compiling, listening to some music and possibly leaving my browser on with some badly written JavaScript running. I also want my CPU not to melt.
You would need at least a 5 GHz CPU to match a current dual-core CPU in this area. The ongoing trend is to have more and more things running and being updated in real time. And it has been for a long time.
Files getting indexed, illegal files getting downloaded, stupid GUIs getting rendered, music getting played, interpreted languages getting JIT-compiled...
Gamers are still stuck in the microcomputer era. The real world isn't. And there isn't really a choice in the first place: the choice is more cores and a better experience, or getting stuck at X GHz and having to pipe liquid hydrogen into your home.
I think we will see more CPUs with more cores, and likely more storage units to avoid resource starvation. Much more raw speed is just not possible.
I don't think Loongson is dead just yet.
MIPS64 for you. They even bought a license after doing it the arr way.
Well, what is the efficiency of one of those sails once space garbage has poked a few holes in it?
Without knowing much, I would think that thrust is directly proportional to surface area. So, minus the hole, it wouldn't be that bad in principle.
However, an impact on a hair-thin layer of tin foil spells massive damage even from tiny objects. And the larger the sail, the larger the chance it meets some macroscopic particle.
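To put rough numbers on that intuition: for an ideal flat, perfectly reflective sail, radiation-pressure thrust scales linearly with reflective area, so a hole costs exactly its own fraction of the total thrust. A sketch, where the sail size is an illustrative assumption (not any real design) and the solar constant is the approximate value at 1 AU:

```python
# Radiation-pressure thrust on an ideal, perfectly reflective sail:
# F = 2 * I * A / c  (factor of 2 because photons are reflected, not absorbed)
C = 299_792_458.0        # speed of light, m/s
SOLAR_CONSTANT = 1361.0  # W/m^2 at 1 AU (approximate)

def sail_thrust(area_m2: float, holes_m2: float = 0.0) -> float:
    """Thrust in newtons; holes simply subtract reflective area."""
    effective = max(area_m2 - holes_m2, 0.0)
    return 2.0 * SOLAR_CONSTANT * effective / C

intact = sail_thrust(100.0)       # hypothetical 100 m^2 sail
holed = sail_thrust(100.0, 1.0)   # same sail after losing 1 m^2 to debris
print(f"intact: {intact * 1e3:.3f} mN, holed: {holed * 1e3:.3f} mN")
```

At these scales the thrust is under a millinewton, which is why the debris worry cuts both ways: a 1% hole costs only 1% of thrust, but the sail has to be enormous (and fragile) to produce useful acceleration at all.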
I hope this works. It would be our best chance, if not for travel per se, at least for accelerating probes into unknown space up to a reasonable speed.
"May your future be limited only by your dreams." -- Christa McAuliffe