Making Linux a microkernel wouldn't help much, except that you might not have to recompile the kernel to change the feature set. Besides, microkernels tend to be slower: due to their architecture, some things can't be done directly but instead have to be "communicated" to the right component, and that communication channel often becomes a bottleneck.
I somewhat agree with Linus: the kernel is bloated, and so are many other things in many distributions. For that reason, on the servers I manage I use Slackware (a very slim and customizable OS) with hand-selected packages and custom-compiled kernels. It is obviously much harder to get advanced things done in Slackware, but I gain a lot in resource usage and stability.
And this is what I like most about Linux and open source in general: if you don't like something, you can customize it. Linux does a particularly good job here by letting you decide in great detail what gets compiled in and what doesn't. I can tell just from the compile time that my server kernels are far smaller than most generic kernels out there. In addition, Slackware is a good trade-off between usability and simplicity (I could use Gentoo to get exactly what I want, but I'd lose on the usability side). It is very lean yet does most of the job out of the box, and for the more advanced things I compile my own custom packages or install from source.
The likes of Qualcomm and NVIDIA didn't spend zillions of dollars developing Snapdragon and Tegra, respectively, only to find themselves competing with numerous other entrants to the market, all facilitated by their supposed partner ARM. This could be an additional reason why ARM continually makes such a big point about targeting Intel.
You don't need a uniprocessor; just run one instance per CPU. You can also have worker processes (e.g. PHP workers) and poll them. Either way, you can achieve full CPU usage without using threads.
Threads can be an advantage on multi-processor systems, but I haven't seen that many applications implement them properly. Having too many threads running at the same time wastes time in context switches; OTOH, one process per CPU running epoll in a non-blocking event loop is very simple to do and gives you excellent scalability.
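To make that concrete, here's a minimal sketch of the one-process-per-CPU, non-blocking event-loop pattern (assumptions: Linux/Unix with `os.fork`, Python stdlib only; the echo protocol and all names here are mine for illustration, not from any particular server):

```python
import os
import socket
import selectors

def event_loop(listener: socket.socket) -> None:
    """Single process: accept connections and echo data back, no threads."""
    sel = selectors.DefaultSelector()  # backed by epoll on Linux
    listener.setblocking(False)
    sel.register(listener, selectors.EVENT_READ)
    while True:
        for key, _ in sel.select(timeout=1.0):
            if key.fileobj is listener:
                try:
                    conn, _addr = listener.accept()
                except BlockingIOError:
                    continue  # another worker won the race for this connection
                conn.setblocking(False)
                sel.register(conn, selectors.EVENT_READ)
            else:
                conn = key.fileobj
                try:
                    data = conn.recv(4096)
                except BlockingIOError:
                    continue
                if data:
                    conn.sendall(data)  # echo back
                else:
                    sel.unregister(conn)
                    conn.close()

def serve(port: int) -> list:
    """Fork one worker per CPU, all sharing the same listening socket."""
    listener = socket.socket()
    listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    listener.bind(("127.0.0.1", port))
    listener.listen(128)
    pids = []
    for _ in range(os.cpu_count() or 1):
        pid = os.fork()
        if pid == 0:
            event_loop(listener)  # worker never returns
        pids.append(pid)
    listener.close()  # parent no longer needs it; workers keep their copies
    return pids
```

Since every worker shares the listening socket, the kernel distributes incoming connections among them, and each worker stays single-threaded with no locking at all.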
As far as I know, besides the primes a bunch of random data goes into key generation. Knowing the primes would only make it slightly easier to crack the key.
By default PGP uses a known set of primes to generate keys, and so far keys generated by it are still secure.
I believe the probability being halved has something to do with the birthday paradox. It's been a while, so I can't explain it well off the top of my head; if you want to find out, just search for it on Google. This page seems to have a good explanation too:
http://betterexplained.com/articles/understanding-the-birthday-paradox/
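To make the hand-waving above slightly more concrete, here's a quick sketch of the classic birthday computation (function name is mine): the chance that at least two of n people share a birthday already passes 50% at n = 23.

```python
# Birthday paradox: probability that at least two of n people share a
# birthday, assuming 365 equally likely birthdays.
def collision_probability(n: int) -> float:
    p_all_distinct = 1.0
    for i in range(n):
        # the (i+1)-th person must avoid the i birthdays already taken
        p_all_distinct *= (365 - i) / 365
    return 1.0 - p_all_distinct

print(round(collision_probability(23), 3))  # ≈ 0.507, already past 50%
```

The same "collisions come much sooner than intuition suggests" effect is why birthday-style attacks roughly halve the effective bit strength of a hash.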
OK, this is getting a bit off-topic, but up-scaling is the most hilarious feature I've ever seen on an electronic device. It doesn't make the picture any better, and TVs should support lower resolutions just as well (mine does).
Arguably, even more laughable are up-scaling recorders: the up-scaled recording takes more bits for the same content, yet the quality is no better after scaling up either!
AFAIK BD still has small market penetration and most people are still using standard DVDs (I even recall an article a couple of weeks ago about Americans having more HD-DVD players in circulation than BD players!).
Just wait until more people use BD; I'm sure it won't be long before each new BD+ version gets cracked promptly...
The author of this article seems totally clueless about what "beaming broadband via satellite" means, as it has absolutely nothing to do with 3G or anything cellular-related. Cell phones require widespread wireless infrastructure to cover a given area and just couldn't be served remotely!
Broadband internet can work via satellite using a dish antenna, just like satellite TV. It has high latency (~500 ms in each direction AFAIK, so if you're not using terrestrial lines for outbound traffic that means ~1000 ms round trip), but it could definitely be used for that purpose.
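For what it's worth, simple geometry backs those numbers up (the altitude figure is the standard one for geostationary orbit, not something from the article):

```python
# Back-of-the-envelope propagation delay for a geostationary satellite link.
C_KM_PER_S = 299_792.458    # speed of light in vacuum
GEO_ALTITUDE_KM = 35_786    # geostationary orbit altitude above the equator

# One hop is ground -> satellite -> ground, i.e. at least twice the
# altitude (more when the dish isn't directly under the satellite).
one_hop_ms = 2 * GEO_ALTITUDE_KM / C_KM_PER_S * 1000
print(round(one_hop_ms))  # ≈ 239 ms of pure propagation per direction
```

So roughly 240 ms per direction is physics alone; the ~500 ms figure is plausible once modulation, queuing, and routing delays are added on top.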
Thus spake the master programmer: "After three days without programming, life becomes meaningless." -- Geoffrey James, "The Tao of Programming"