You're beating that dead horse because you think I was insulting nV when I was not. I was implying that Rocksteady should have included it to begin with. I think AA should be a feature available in all games but that's just my opinion.
To address your point: kudos to nV for helping Rocksteady, but ATI did the same thing (adding MSAA) for Stalker Clear Sky with GSC Game World, and ATI did not lock it to just their hardware. You should stop bringing up what you wrote to argue that nV is the only company doing any work, because they're not.
I think you are confused. GSC added MSAA to the game as a patch. However, it cost too much FPS and required Dx10.1. ATI released a hotfix for its driver to ease the hit. Not that ATI is bad, but they did not aid Nvidia on this in any way, as ATI's drivers don't work on Nvidia's hardware.
To clarify the problem: Dx10 was too new and didn't do what it was supposed to. There are 2 parts to Dx10, the hardware and the software. Upgrading the software is easy (via Windows Update); upgrading the hardware is the problem, as existing Dx10 cards may not support the upgrade. This creates mass confusion for users, as they don't really know what supports what. Some Dx10 cards support Dx10.1, and some Dx10 cards don't.
There is a similarity between Stalker Clear Sky and Batman AA: they both use something called deferred shading. This prevents AA in general from working (edges are still jaggy even after AA is applied). There are 2 solutions. One requires another rendering pass, but there is no function in Dx9 that handles this 2nd pass, which is why UE3 can't support MSAA on Dx9. Dx10.1 does have a function that handles this, but as I mentioned above, it takes a serious performance hit. The second solution, which both vendors have retrofitted, is FSAA through their control panel, which more or less does the same thing with no extra performance hit, so everything is good.
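To see why deferred shading breaks ordinary MSAA, here's a toy Python sketch (the numbers and the lighting function are made up purely for illustration, not real renderer code): MSAA needs to shade each coverage sample and then average, but a deferred renderer averages (resolves) the G-buffer attributes first and lights once, which gives a different, wrong color at edges.

```python
# Toy illustration of the MSAA-vs-deferred-shading ordering problem.
# The lighting function and sample values are invented for this example.

def light(normal_y):
    # Hypothetical non-linear lighting: brightness from the normal's y part.
    return max(0.0, normal_y) ** 2

# Two coverage samples at a geometry edge: one faces up, one faces sideways.
samples = [1.0, 0.0]  # y-components of the per-sample normals

# Forward rendering + MSAA: shade each sample, THEN average -> smooth edge.
forward = sum(light(n) for n in samples) / len(samples)

# Deferred shading: the G-buffer is resolved (averaged) first,
# then lighting runs once on the already-averaged attributes.
deferred = light(sum(samples) / len(samples))

print(forward)   # 0.5
print(deferred)  # 0.25 -- a different result, so the edge is shaded wrong
```

Because lighting is non-linear, averaging before lighting is not the same as lighting before averaging; that's why the engine needs an extra pass (or the Dx10.1 feature) to get at the individual samples.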
Now you can do some research on anti-aliasing to find out more about the technical side of it. There are many different ways to implement AA, but FSAA and MSAA are the norm. However, the actual code that does MSAA may differ, and thus performance and stability vary across different hardware setups.
Now ATI's attitude towards Nvidia is no different from ATI fanboys' attitude towards Nvidia fanboys: everything they do or say is biased and bad. Wreckage is a good example, as every single word s/he says is treated as Nvidia PR speech. On the removal of Dx10.1 from Assassin's Creed, ATI claims it was done because Nvidia paid Ubisoft to do it, since the issues that arose were supposedly next to none on Nvidia's hardware. Lots of ATI fanboys confirmed that it was fine on Nvidia. However, the fact is some Nvidia hardware crashed, and other setups had light bleeding through walls and memory leaks. If everything Wreckage says or does is biased and wrong, then why do people think ATI's PR speech is legit?
So how reliable is the testing done by people who change the vendor ID? Will they be biased? I don't know. What I do know is Nvidia spent the time implementing MSAA in games developed on UE3, and appears to have further optimized it in newer games like Borderlands, which beats the crap out of AMD hardware when AA is enabled. It may have something to do with driver optimization, and Nvidia gained that through the experience of introducing MSAA to Batman AA, which ATI didn't have a chance to do. A 285 beating a 5870 by more than 30% is just not right. If ATI fails to fix the problem by the time the game releases, there will be another round of red/green war about Nvidia's sabotage.
Edit: Suppose Nvidia is that bad and is trying to block ATI's hardware from using something. Do you really think the hack would be as easy as changing the vendor ID? PhysX is something Nvidia doesn't want ATI users to use, so it's disabled through the driver; it won't work even if you have an Nvidia card in the system. Wouldn't that method be more effective?
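For what it's worth, the kind of vendor-ID gate those testers bypassed is, in principle, just a comparison against the card's PCI vendor ID (0x10DE for Nvidia, 0x1002 for ATI — those IDs are real; everything else in this sketch is hypothetical, not the game's actual code):

```python
# Hypothetical sketch of a vendor-ID gate like the one testers spoofed.
# The PCI vendor IDs are real; the gating logic is invented for illustration.

VENDOR_NVIDIA = 0x10DE
VENDOR_ATI = 0x1002

def msaa_allowed(vendor_id):
    # The in-game MSAA option is only offered when the reported vendor
    # matches Nvidia; any other vendor falls through to "not allowed".
    return vendor_id == VENDOR_NVIDIA

print(msaa_allowed(VENDOR_NVIDIA))  # True
print(msaa_allowed(VENDOR_ATI))     # False -- spoofing the ID flips this
```

A check this shallow is trivially defeated by reporting a different ID, which is my point: a driver-side block like the PhysX one is far harder to get around than a one-line comparison in the game.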