Journal: Announcing the release of my new book
This feels like a mega-spam entry, and I'm very self-conscious about posting it, but I'm excited about this and I wanted to share...
The Scientist reports that UK group Sense About Science is confronting advertisers about pseudoscientific claims in health products such as "Aerobic Oxygen," "Salt Lamps," and "Activ8." They called the advertisers' customer service numbers and grilled the unfortunates on the other end of the phone about their misuse of scientific language to sell products. The project,
At the same conference, Gelsinger also talked about the 45nm Nehalem core, Penryn's successor. Among the disclosures was the fact that Nehalem will sport an on-die memory controller, as well as an integrated graphics processor.

Right now, a lot of folks who're testing out VT have been disappointed that its performance isn't much better than existing, non-VT-based virtualization solutions like VMware. Specifically, VMware products use a binary translation engine that ingests regular x86 OS code and produces a "safe" subset; VMware claims that this binary translation approach is as fast as, or faster than, VT-based approaches because the OS doesn't have to do costly VM transitions in order to execute privileged instructions. (These claims are debated; I'm merely reporting the fact that they are made.)
A major decrease in VM transition times will help the performance of VT-based solutions like Xen, and it would make the "which virtualization package to use?" debate even more about management and less about relative performance than it already is.
Reading between the lines on this comment and others, I can say with a pretty high degree of certainty that Intel will be using its packaging skills to put a GPU in the same package as a Nehalem CPU. Furthermore, this is going to help out with mobile products, small-form-factor devices (*cough* Apple), and anywhere else that power and cooling are more critical than raw performance. I'd expect that such CPU/GPU devices will cut down on the number of cores that you can put on the CPU die (for power dissipation reasons).
I am absolutely stunned that Slashdot's editors would give credibility to a completely false story, pushed by a paid industry PR professional. As Rugrat said,
For the last year or so, I've been happily using Debian, with a mixture of sources so I was stable, but current, just like nearly everyone who uses Debian.
Then I tried to upgrade or something insane like that, using aptitude, and the whole thing went tits up on me. No amount of cussing, kicking things, or actual tinkering with the software could save my machine.
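For anyone curious what a "mixture of sources" setup looks like, the usual approach is apt pinning. This is a minimal sketch of a hypothetical `/etc/apt/preferences` (not the poster's actual config), which keeps stable as the default while letting you pull newer packages from testing on demand:

```
# /etc/apt/preferences -- hypothetical example, not the original poster's setup
# Prefer stable by default...
Package: *
Pin: release a=stable
Pin-Priority: 900

# ...but keep testing available at a lower priority.
Package: *
Pin: release a=testing
Pin-Priority: 400
```

With both suites in `sources.list`, something like `apt-get -t testing install somepackage` then pulls just that package (and its dependencies) from testing while everything else stays on stable. Dependency chains from testing can still drag in a lot, which is roughly where upgrades like the one above tend to go sideways.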
"If you want to know what happens to you when you die, go look at some dead stuff." -- Dave Enyeart