When you can save $8,000 per server and then invest it in something else, it becomes a different issue. I am not trying to say AMD processors are superior; I am just saying that, factoring in all costs, including power and the lifespan of the unit, AMD wins a lot of the time. Every computer cluster at every university I have ever had access to used AMD processors (with the exception of some NVidia units), and this was for their CS departments. I suspect part of the issue is that it's easier to justify power budgets than to justify $8,000 more per server to upper admins. Figuring you could buy 1.5 AMD servers for the price of 1 Intel server, you end up with a more cost-effective machine as far as total CPU performance and RAM capacity go.

Power consumption is not one of AMD's strong suits, though. I remember one of our server admins telling me the power bill for the main cluster once; it was sickening. I vaguely remember it being in the hundreds of thousands per year.

It saddens me that AMD is in this situation, but I seem to remember a time when Intel was pulling some pretty anti-competitive moves, though AMD should have capitalized better on its past successes. I also seem to remember that, at least on the desktop, the Athlon XPs had better gaming performance. I suppose that's a small market, but even that was an opportunity that could have been exploited better.
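Just to make the 1.5-servers math concrete, here is a rough sketch. The $8,000 gap is from my point above; the actual unit prices are made-up placeholders, not real quotes or benchmarks:

```python
# Rough sketch of the "1.5 AMD servers per Intel server" arithmetic.
# Prices below are hypothetical; only the $8,000 difference is assumed.

def servers_per_budget(budget, unit_price):
    """How many servers a fixed budget buys at a given unit price."""
    return budget / unit_price

intel_price = 24000              # hypothetical Intel server price
amd_price = intel_price - 8000   # AMD server at an $8,000 discount

# Spend the cost of one Intel server on AMD hardware instead.
amd_count = servers_per_budget(intel_price, amd_price)
print(amd_count)  # 1.5
```

With those placeholder prices the budget for one Intel box buys 1.5 AMD boxes, so even if each AMD server is somewhat slower per socket, the aggregate CPU throughput and RAM capacity per dollar can still come out ahead.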