I think it's really sad how our society basically devalues skilled labour. That's what writing good software is, after all. The attitude of businesses seems to be that people are more or less replaceable and therefore expendable, and people have responded by outrageously increasing their qualifications. This costs society a lot of money in wasted time, lost productivity, lost income, and stunted career progression. The quality of education has also deteriorated under the extremely high demand. This is inflation in education: the amount of it goes up as its value drops.
It does not make sense for most software developers to have a four-year computer science degree. It's hard to see what they could need beyond a solid understanding of algorithms and data structures, and exposure to different programming languages. You could learn that in two years, though it would be quite hard. Or you could learn the basics in one year, do a year of apprenticeship, and spend two years as a journeyman to get it all. But it doesn't work that way anymore, because a great many businesses refuse to bear the costs of educating their employees. It's stupid short-term thinking, and they pay for it in other ways, but all of the career risk has been pushed onto the labour force.
So what are you missing? The value of an education is really what you make of it. I guess the best way to explain it is with an analogy. If you were to get an English degree, you would study Shakespeare. It may or may not help you write a good play. If you were talented, you might pick up something from Shakespeare. Or you could study Shakespeare with great dedication, and practice writing until your work really compares. Or you could bullshit, plagiarize, and plead your way through a degree and go on to write travesty after travesty to be inflicted on an unsuspecting public. In any case, someone with a BA in English had better know Shakespeare. That's just expected, because it's part of a body of knowledge. It may or may not be related to the skills that employers are looking for.
Universities exist to maintain and expand bodies of knowledge. That's it. To the extent that they have been used as a "shortcut" for employee training or certification, it is highly unfortunate and detrimental to society as a whole. I wouldn't deny anyone the right to an education, but society has misconstrued its purpose.
For the vast majority of home users this is still the best advice.
If your algorithm is designed to break up the problem to exploit the cache, then hyperthreading is a bigger mess. The data for thread 1 and thread 2 (out of 8) might be complementary, but the operating system will run those threads on different physical cores, because all it sees is the virtual cores. This can be very inefficient if you need the whole cache.
Perhaps worst of all, you are stuck always running 8 threads. With 2-6 threads, the work may not be distributed evenly across the real cores, leading to inconsistent performance. So you can lose performance by scaling the problem out further than it is efficient to do. With real cores, I can decide, based on the problem size, the correct number of cores to use.
In conclusion, hyperthreading has its uses, but operating systems are oblivious to it, and that becomes a major problem once there is more than one physical core.
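A rough sketch of the workaround, assuming Linux with glibc's affinity extensions and 2-way hyperthreading: pin one worker thread to each physical core so the scheduler can't spread them arbitrarily across virtual cores. The "even-numbered logical CPU = distinct physical core" mapping is an assumption; on a real machine you would read the topology from /proc/cpuinfo or a library like hwloc instead of guessing.

/* Sketch: one pinned worker per physical core (Linux, glibc extensions).
 * Assumes 2-way SMT and that even logical CPUs sit on distinct physical
 * cores; verify against /proc/cpuinfo or hwloc on real hardware. */
#define _GNU_SOURCE
#include <pthread.h>
#include <sched.h>
#include <stdio.h>
#include <unistd.h>

#define MAX_THREADS 64

static void *worker(void *arg) {
    long id = (long)arg;
    printf("worker %ld on logical CPU %d\n", id, sched_getcpu());
    /* ... work on this thread's cache-sized chunk of the problem ... */
    return NULL;
}

int main(void) {
    long logical  = sysconf(_SC_NPROCESSORS_ONLN); /* virtual cores */
    long physical = logical / 2;                   /* assumes 2-way SMT */
    pthread_t threads[MAX_THREADS];

    for (long i = 0; i < physical && i < MAX_THREADS; i++) {
        cpu_set_t set;
        CPU_ZERO(&set);
        CPU_SET((int)(2 * i), &set);  /* one logical CPU per physical core (assumed layout) */

        pthread_attr_t attr;
        pthread_attr_init(&attr);
        pthread_attr_setaffinity_np(&attr, sizeof(set), &set);
        pthread_create(&threads[i], &attr, worker, (void *)i);
        pthread_attr_destroy(&attr);
    }
    for (long i = 0; i < physical && i < MAX_THREADS; i++)
        pthread_join(threads[i], NULL);
    return 0;
}

Compile with gcc -pthread. The same idea scales down: if the problem is small, launch and pin only as many threads as there are cache-sized chunks of work.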
Note: I parallelized my software and the Core i7 is awesome. Superlinear speedup is easy to achieve with a dedicated L2 cache per core: once each thread's working set fits in its core's own cache, every core runs faster than a single core grinding through the whole data set. The Phenom II would also give great performance. So I would bet that Atom and other underpowered CPUs are a fad. They will not look very good next to a mobile Core i7 that is 20x faster when all cores are used.
Given the choice between a Core 2 Quad and a Phenom II, you should pick the Phenom. No question about it. The Core 2 Quad has a split cache, so multithreaded performance is crap: the cores have to transfer data through the slow memory interface, which limits parallel speedup in a lot of cases. This wasn't really an issue when Intel released the processor, but in the near future it will be a serious one, because the parallel software is coming.
The top 10 is heavily dominated by IBM and its BlueGene systems, with only Cray (ranks 2 and 3), Dell (rank 8), and SGI (rank 10) breaking the pattern.
The first non-US system, at the Barcelona Supercomputing Center, is ranked 9. Japan is losing ground, with its first system ranked 14.
The full list is also available.