This may change as the x86/x64 compiler writers start paying attention to the "optimize for size" bits of the compiler. For the last few decades all the attention has been paid to speed optimizations. It's possible that some of the advantage of ARM in this regard is due to the historically different focus of their compiler teams.
Five years ago would you have seriously thought they'd stick multiple cores in a phone? 64-bit will happen because phones are rapidly becoming more complex, and also because just like the first multi-core phones it will be a huge marketing advantage, because suddenly all those 32-bit phones will look weak and puny.
By that definition CISC chips have always actually been RISC chips. What did you think all that microcode in those CISC chips really was? RISC was an early win because it exposed the microcode to the compiler for improved optimization opportunities, and the simpler fetch/decode/execute logic made it easy to implement an efficient pipeline scheme which compensated for the increased memory traffic.

The weakness of RISC was that by moving the microinstructions out into main memory and exposing them to the compilers, they wound up exposing too many details of their implementations, and made themselves vulnerable to changes in the speed differential between CPU and memory. RISC's initial advantage in internal simplicity quickly eroded as they added multiple dispatch units, and their pipelines grew from 4 to 6 to 8 stages while the instruction sets still claimed they had 4. And as the performance gap between the CPU and main memory widened, the functional density of CISC code became a win.

Intel noticed that they could fake out more registers than the instruction set explicitly revealed, and could treat near memory off BP and SP as though they were registers, neatly mimicking the register windows of machines like SPARC (this basic trick had been used decades earlier in the TI 990 minicomputer). And Intel noticed that since their micro-instructions were similar to RISC ops, many of the basic optimizations that compilers were doing to improve the emitted code were sufficiently simple peephole-type optimizations that they could be implemented in hardware, provided the hardware optimizer could see a wide enough window of micro-ops in flight. And since the burgeoning delta between CPU and memory speeds meant greatly increased on-die caches, it turned out that adding all this hardware didn't really change the die size much.
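To make "peephole-type optimization" concrete, here's a toy sketch in Python. This is an illustration of the idea only: the micro-op format and the two rewrite patterns are made up for the example, and a real hardware optimizer works over a window of in-flight micro-ops, not a Python list.

```python
# Toy peephole pass over a linear list of (op, dst, src) micro-ops.
# A peephole optimizer looks at a small window of adjacent
# instructions and rewrites/removes locally redundant patterns.

def peephole(ops):
    out = []
    for op in ops:
        # Pattern 1: "mov r, r" is a no-op copy -- drop it.
        if op[0] == "mov" and op[1] == op[2]:
            continue
        # Pattern 2: "mov a, b" immediately followed by "mov b, a"
        # -- the second mov is redundant, drop it.
        if out and op[0] == "mov" and out[-1] == ("mov", op[2], op[1]):
            continue
        out.append(op)
    return out

prog = [("mov", "r1", "r2"),
        ("mov", "r2", "r1"),   # redundant: r2 already equals r1's source
        ("mov", "r3", "r3"),   # no-op copy
        ("add", "r1", "r4")]
print(peephole(prog))  # [('mov', 'r1', 'r2'), ('add', 'r1', 'r4')]
```

The point of the example is how *local* the reasoning is: each rule only needs to see a couple of adjacent micro-ops, which is exactly why the same transformations are cheap to do in hardware over a window of the in-flight instruction stream.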
I call BS. You should be able to easily and productively talk to people with an IQ within +/- 1 SD of your own. At 140 that gives you a range of 125-155 with whom you should be able to hold a relatively interesting conversation, which comes out to about 5% of the general population. As long as you hang around the sorts of places that other smart people frequent, it will be much higher: college towns, business areas with a largely college-educated workforce, etc. Hang out at the right pub and half the denizens there will have IQs above 120. For that matter, if you have an IQ of 140 and aren't working daily with lots of people with IQs in the 120+ range, then you need to find another place to work, or another line of work altogether.
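For what it's worth, the 5% figure checks out against the normal curve (IQ tests are normed to mean 100, SD 15). A quick Python check:

```python
from math import erf, sqrt

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1 + erf(z / sqrt(2)))

mean, sd = 100, 15
lo, hi = 125, 155  # IQ 140 +/- 1 SD
p = phi((hi - mean) / sd) - phi((lo - mean) / sd)
print(round(p * 100, 1))  # prints 4.8 -- about 5% of the population
```

Almost all of that mass is at the bottom of the range: IQ 155+ is roughly 1 in 8,000, so the band is effectively "125 and up."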
I've never known an employer to demand details, just the prescriptions demonstrating that you're taking them legally.
Personally I think the new Start screen is a huge improvement over the old Start menu, the Metro apps look gorgeous, and the improvements to the desktop side (speed and usability) are significant. I don't think Win 8 will flop on the desktop.
But I don't think it will succeed on tablets like Microsoft needs it to. It will be a disappointment like WP7. The apps won't materialize like Microsoft needs them to, so customers will be stuck with a gorgeous UI that runs a browser, a really basic email client, and MS Office, plus a handful of quality apps and a somewhat larger smattering of mediocre-to-crappy apps, just like WP7. WinRT is a really huge breaking change in the APIs, so while you can program apps in VB.NET/C#, it's a huge shift to do so. Unless your app is pretty trivial you're looking at rewriting a huge chunk of your code. And if you're gonna have to switch APIs and rewrite anyway, you may as well target the iPad/iPhone, since that's where the market is. Microsoft is claiming that devs should target Metro/WinRT because of the sure-to-be-huge customer base, but that's far from a given. There will be a huge customer base on the desktop, sure, but that customer base is already served by the existing application base; there's very little incremental market to be had by going with Metro unless the Win8 tablets really take off, and unfortunately WinRT makes that a much iffier "if".
If you go scrounging around looking for developer experiences with WinRT you find inspiring stories like this Microsoft MVP who finally figured out *how to read a file*: http://www.sharpgis.net/post/2012/01/12/Reading-and-Writing-text-files-in-Windows-8-Metro.aspx
I think it's telling that 6mo after giving away ~3000 development machines at Build 11, there are something like 100 apps in the app store.
If Metro were simply a newer variant of WPF then I believe Apple would be in for a serious fight for first place in the tablet space. Win 8 is wonderful to use on a tablet. But if the apps don't materialize by the hundreds of thousands in short order, then that just doesn't matter. Microsoft should have learned this lesson with WP7, which is itself much prettier and nicer to use than iOS. That they didn't is somewhat depressing. That they doubled down on their error by making WinRT even more of a breaking change than WP7, and even more aggravating to develop for than Silverlight/WP7 is kind of breathtaking.
No. It's certainly possible for some random problem to be misclassified as NP-complete because of a botched reduction proof. This sort of mistake has happened before, and it's not a big deal for anybody but the poor student. But 3-SAT is special: SAT was the first problem proven NP-complete (Cook's theorem), 3-SAT follows immediately from it, and the NP-completeness proofs for the other members of NP-complete chain back to 3-SAT by reduction. So if 3-SAT is in P then either all the reduction proofs for all the other NP-complete problems are wrong (very unlikely), or P == NP.
Progressives gave me very bad headaches. Once I switched to trufocus (now called superfocus) glasses the headaches went away. Plus I can keep the entire screen in focus (both screens, actually); with progressives only a small band was in focus: the top of the lens was out of focus, and the lower half was out of focus too because the screen isn't at the right distance for *either* prescription. I could have gotten another pair of non-progressive mid-range glasses that would have solved the problem when sitting normally at my desk, but then I would have had to switch to a far-prescription lens when I leaned back in my chair... Or a pair of trifocals, which would at least give me a 6" bar of screen in focus. But it all got pretty silly very quickly, and the trufocals seemed like they would make the whole problem go away.
The focusing mechanism sounds aggravating, but in practice it just disappears. The hyperfocal distance for the glasses is about four feet, so beyond that you just set them to infinity and forget about it. The focusing mechanism only comes into play when you settle down to code and set your focus; then when you get up to move about you set it back to infinity. Also, I've found that in practice I don't really use the full focusing range; there are about five positions on the slider that I actually use. But those three extra focal lengths you get over bifocals are a major win.
I've been very happy with mine, and will be buying another pair in September (I try to get new glasses every two years). My only knock against them was that I thought the round-lens look was pretty ugly; however, I've gotten a *lot* of compliments on the look of my superfocus glasses, and strangers are always commenting to me on how nice they look. I've been wearing glasses for 30 years and never heard so much as a peep about my glasses except the platitudes from family, but in the 18 months since I got the superfocals it happens two or three times a month: cashiers, tellers, random people in the mall, etc.
No, that snippet compares AX and DX, leaving AX with 1s everywhere except the first differing bit, which will hold a 0.
AX= 00000000 00001101
DX= 00000010 00011001

CWD sign-extends AX into DX (it's CBW that fills AH), so with AX positive like this it just puts a 0 in DX.
XOR ax,dx computes ax ^= dx
SUB ax,dx computes ax -= dx
The XOR swap algorithm to swap ax and dx is:

XOR ax,dx
XOR dx,ax
XOR ax,dx
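Both idioms are easy to sanity-check outside of a debugger. Here's a small Python model, assuming the snippet under discussion is the classic CWD/XOR/SUB absolute-value idiom, with 16-bit registers simulated by masking:

```python
MASK = 0xFFFF  # simulate 16-bit registers

def xor_swap(ax, dx):
    """XOR swap: exchange two values with no temporary register."""
    ax ^= dx
    dx ^= ax
    ax ^= dx
    return ax, dx

def abs16(ax):
    """The CWD/XOR/SUB absolute-value idiom for a 16-bit AX."""
    dx = 0xFFFF if ax & 0x8000 else 0x0000  # CWD: sign-extend AX into DX
    ax = (ax ^ dx) & MASK                   # XOR ax,dx
    ax = (ax - dx) & MASK                   # SUB ax,dx
    return ax

print(xor_swap(0x000D, 0x0219))  # -> (537, 13), the values swapped
print(abs16((-13) & MASK))       # -> 13
```

For positive AX the sign mask in DX is 0, so XOR and SUB are no-ops; for negative AX the mask is all 1s, so XOR complements AX and subtracting 0xFFFF adds 1 (mod 2^16), giving the two's-complement negation.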