96Firebird · Diamond Member · Joined Nov 8, 2010
http://www.gamemunition.com/rumors/did-nvidia-pay-crytek-to-delay-directx-11-support-for-crysis-2/
I'm still an nvidia sheep that blindly follows them no matter how unethical they are. I think I'll make some posts defending their hilariously anti-consumer practices because I own their hardware.
nvidia paid Crytek US$2 million to delay their DX11 port, and nvidia implemented it *themselves*.
Basically, what nvidia did was write code implementing AA, and they only allowed that code to run on their own hardware, along these lines:
if (isNvidiaHardware)
{
    // use the new AA code nvidia wrote
}
else
{
    // use the old code without AA
}
The code they created would have worked just fine on AMD hardware, but they figured that since they wrote the code, why should AMD users benefit? However, there now seems to be some kind of legal agreement that keeps the developer from going into the code and writing AA support for AMD GPUs. AMD has been locked out. This is an anti-competitive tactic and may even be illegal; I'm not sure. As the company producing the game, I would never have sold access to my own game's code to nvidia. Bad decision. On the other hand, nvidia is being pretty evil in all this too.
If taking console ports and adding in effects such as PhysX, or DX11 pathways for advanced graphical features consoles can't do, is cheating, then they're guilty as charged, and I hope they continue to be "evil". Because if it wasn't for Nvidia's software engineers and the TWIMTBP program, we would still be stuck on DX9, talking about the day Havok will offload some of its physics to the GPU.
One can say the same thing about blindly following AMD, but the difference is data and investigation; I tend to let the IHVs counter each other:
http://www.hexus.net/content/item.php?item=20991
It is great to see the points countered one by one.
A positive mention of DX11 is a rarity in recent communications from NVIDIA, except perhaps in their messaging that 'DirectX 11 doesn't matter'. For example, I don't remember Jensen or others mentioning tessellation (the biggest of the new hardware features) from the stage at GTC.
What special edition of the game would that be? The one where nvidia didn't exchange millions of dollars with EA to get rights to it? Because that did happen; you can easily verify it through Google if you wish.
Please explain how broadly disabling AA on all non-nvidia hardware is an ethical practice. Please tell me why EA would need *nvidia* to code AA. Coding AA isn't a difficult thing to do, and it certainly doesn't require a graphics card company. I find it hilarious that you're even defending that.
Furthermore, let's look at the hundreds of games with in-game AA and how they were never influenced by GPU makers, but instead coded by the developers themselves. Face it: nvidia did this not for the good of the game, but just to put their hardware in a better light, even if it wasn't ethical.
You can say what you want about AMD, but they have never made malicious code to make games run worse on non-AMD hardware, that I'm aware of. Nvidia has done this many times.
So I've been reading up on some of the shenanigans that nvidia has pulled in "nvidia - the way it's meant to be played" games.
I can't believe the lengths nvidia has gone through to purposely hijack games to run worse on AMD hardware. I won't go into long detail, but nvidia paid Crytek US$2 million to delay their DX11 port, and nvidia implemented it *themselves*, applying tessellation to non-visible, water, and flat objects. If you read up on this, these tactics were done purposely to hijack Crysis 2 on anything that isn't nvidia hardware, and they do not improve image quality AT ALL (on flat objects and water... and especially on non-visible objects under the world; yes, they did that).
Outright scumbaggery. They did the same with Batman: AA in 2009. This is the same company that in 2003 manipulated their drivers to disable features in 3D benchmarks to get inflated results; back then, Tom's Hardware called them out on it, and they admitted to cheating. Good to see that their cheating hasn't ended.
I can't believe they have, for the most part, not been called out on this.
I believe AA support for ATI was added to Batman: AA through a patch a while back. I know the GOTY version has it.
Regarding Batman: Arkham Asylum and AA:
The game merely detects whether there's an nvidia device installed and enables AA, and it doesn't even do a particularly good job of it. I recently installed a GT 240 as a PhysX-only card in my 5870-based system. Not only did I gain access to hardware PhysX, I also gained in-game AA. It's just MSAA as far as I can tell.
Important note if you are testing the following applications:
Dawn of War 2
Empire: Total War
Need for Speed: Shift
Oblivion
Serious Sam II
Far Cry 1
AMD has admitted that performance optimizations in their driver alter image quality in the above applications. The specific change demotes FP16 render targets to R11G11B10 render targets, which are half the size and less accurate. The image-quality change is subtle, but it alters the workload for benchmarking purposes. The correct way to benchmark these applications is to disable Catalyst AI in AMD's control panel. Please contact your local AMD PR representative if you have any doubts about the above issue.
NVIDIA's official driver optimization policy is to never introduce a performance optimization via .exe detection that alters the application's image quality, however subtle the difference. This is also Futuremark's policy regarding legitimate driver optimizations.
So when are you going to get rid of that Intel chip in your sig?
The bumpgate saga showed Nvidia's true colors when it comes to their own customers. I can't remember a hardware company that behaved this badly after screwing up. Compare that to the AMD Phenom I fiasco or Intel's faulty Sandy Bridge chipsets: both companies acted great.
I don't know of any instance where ATI/AMD has deliberately added code to a game to make it run worse on nvidia hardware. DX:HR supports AMD features but does NOT have code to hijack it on nvidia hardware. It runs well on both platforms.
I could be wrong, but AMD seems much more ethical than nvidia. Nvidia seems to have a long history of scumbag practices from what I've been reading... it almost makes me want to cancel the GTX 580 order for my 2nd PC.
Please explain to me why nvidia paid EA to disable in-game AA in Batman: AA for everything *except* nvidia hardware. This forces ATI users to apply AA through the drivers, which performs worse than in-game AA: driver AA applies AA to everything, while in-game AA applies it only where it is needed. There is no excuse for anti-consumer practices like this.