Comment Re:Avoiding to answer (Score 2, Informative)
Nvidia PAYS for removal of features that work better on AMD
Reading the link you posted above, it seems like a bit of a non-factual load of waffle. Nvidia deny paying, Ubisoft deny being paid, and the only sources mentioned are anonymous speculators who, for all we know, could just be a few paid ATI shills.
Nvidia pays for insertion of USELESS features that work faster on their hardware
Wow, another example of amazing journalism here.
Some guy moaning about Crysis having loads of detailing that is only used in the DirectX 11 game. He gives loads of examples of this, then posts a summary page of wild speculation with no sources quoted other than his own imagination. He never asks any of the companies involved; he just posts a bunch of stuff about why this might be the case.
I have another possible suggestion as to why this was the case: Crytek like making stuff look overly detailed, and they include graphics detailing that means their games continue to max out graphics cards long after release. They always make their games playable on the budget cards if you crank the detailing down, but they also like catering to people who buy a new graphics card and then go back and play a few oldies they previously had to crank the detail down on. Crytek probably also quite like their games being used in hardware reviews, because their games hammer the hardware.
Nvidia cripples their own middleware to disadvantage competitors
Ok, congratulations on actually posting an article that was real journalism, with quoted sources, and not just made up of the author's own conjecture.
The issue here, though, seems to be that there was an optimisation, moving from x87 to SSE, that they did not apply to a bunch of legacy code. Instead they rewrote that code from scratch, which meant the SSE version took somewhat longer to arrive.
This was not them intentionally doing something to hobble a competitor, this was them not doing anything to help them quickly. That is very different.
They did however ultimately fix it:
"PhysX SDK 3.0 was released in May 2011 and represented a significant rewrite of the SDK, bringing improvements such as more efficient multithreading and a unified code base for all supported platforms"
Intel did the same, but FTC put a stop to it
http://www.osnews.com/story/22...
There is a massive difference here: Intel were intentionally hobbling the code their compiler created based on finding a competing vendor name in the product string. They did not say "wait for version 3" like in the PhysX case; they just sat there tight-lipped until it went to court and they were forced to change it.
This is something the FTC should weigh in on, just like in Intel's case.
As I said earlier, Nvidia made the all-important change to use SSE when running PhysX on the CPU without the FTC being involved.