
Deus Ex Performance Thread

No surprises here; it is an AMD evolved title. Give Nvidia a few days to catch up with optimizations.
 
AMD pulled out an 8% performance lead, and Nvidia offers slightly better image quality when certain features are used. However, this was merely initial testing, and a more complete review will be released when we have a retail copy of the game. So until then, take this for what it's worth.

That would be incorrect. Both video cards can use FXAA so the image quality is the same.
 
This title also has native support for HD3D and works with AMD 5 and 6 series products. Since it is native to AMD, it doesn't work with nVidia GPUs and 3D Vision.
 
No surprises here; it is an AMD evolved title. Give Nvidia a few days to catch up with optimizations.

And also, to be fair here, the conclusion was not decisive for either AMD or Nvidia, as was clearly stated in the [H] conclusion.
 
AMD is getting more and more aggressive and is starting to pull Nvidia-style guerrilla tactics on the huge new PC game releases.

The 6970 is about 10% faster in Deus Ex: HR and over $100 cheaper.

Looks like Gaming Evolved is evolving into AMD's version of TWIMTBP.

Deus Ex: Human Revolution With AMD at SDCC 2011


Away from the benchmark flexing, this game is amazing. Nice to see a PC franchise done some justice rather than trashed for consoles like some others (Crysis 2) 😎
 
That would be incorrect. Both video cards can use FXAA so the image quality is the same.


Actually you make a good point and one that I missed during the article because it was not made clear.

The way I read the article the first time through made it seem like FXAA was Nvidia's option during testing, and MLAA was what they used during AMD testing. Reading the article gave me the clear impression that FXAA was an Nvidia-specific thing and MLAA was AMD's equivalent and exclusive option.

After reading the article again, it is clear that is not the case but that FXAA is just Nvidia's option for post-processing and MLAA is AMD's option, and that both platforms can use both of these technologies.

Maybe I am the only one that misread that or thought the wording was odd. But when they say things like "Nvidia's FXAA", and mention that FXAA uses Nvidia shaders, it really made me think it was an Nvidia-exclusive option.

So I guess that does make the conclusion I stated incorrect!
 
Actually you make a good point and one that I missed during the article because it was not made clear.

The way I read the article the first time through made it seem like FXAA was Nvidia's option during testing, and MLAA was what they used during AMD testing. Reading the article gave me the clear impression that FXAA was an Nvidia-specific thing and MLAA was AMD's equivalent and exclusive option.

After reading the article again, it is clear that is not the case but that FXAA is just Nvidia's option for post-processing and MLAA is AMD's option, and that both platforms can use both of these technologies.

Maybe I am the only one that misread that or thought the wording was odd. But when they say things like "Nvidia's FXAA", and mention that FXAA uses Nvidia shaders, it really made me think it was an Nvidia-exclusive option.

So I guess that does make the conclusion I stated incorrect!

You can also force FXAA on the AMD platform in many, many titles using a third-party tool; it works with older AMD product SKUs as well. Less of a hit than MLAA, especially at higher resolutions and Eyefinity resolutions.
 
After reading the article again, it is clear that is not the case but that FXAA is just Nvidia's option for post-processing and MLAA is AMD's option, and that both platforms can use both of these technologies.

Yep, you can use either so long as both are implemented in the game itself. I actually had to read the comments on the article to get clear on that, so I'm not surprised you didn't catch it the first time around.
 
This is about the only time I've seen the 6970 test 20% faster than the 6950 at the same settings.
 
I thought this little part was interesting:

Using the GeForce GTX 570 with "FXAA High" at 2560x1600, we experienced odd, uncomfortable slowdowns in some corridors in the test scenario. Framerates would dip below 30FPS for a while and then recover for no apparent reason. Choosing the "Edge AA" anti-aliasing option alleviated a lot of that behavior, but not all. The game was certainly playable with those settings, and we will look into this issue in greater detail in our full performance evaluation of this game.

I really think this is more of a driver issue than a hardware one.
 
If this was a TWIMTBP title, and the gtx580 was 20% faster, there would be about a dozen people freaking out about how Nvidia is cheating and purposely hurting the game's performance on AMD cards.

Anyways, good for AMD. So instead of saying their customers don't want AA and then abruptly changing their minds when their competitor gives its customers the option of doing whatever they want (see Starcraft 2's release), they have figured out it's much better to work with developers and come out of the gate swinging. They get the limelight for a little bit. It happened with Dragon Age 2, http://amd-member.com/Newsletters/AMDGame/March11.html , and it is happening now with Deus Ex. I fully expect Nvidia to pull clearly ahead in performance, as they did in Dragon Age 2, in the next month or so.
 
Hardocp always seems to test games with top-of-the-line hardware at 2560x1600, even if the average frame rates are not what I would consider to be smooth gameplay. This got me thinking... I wonder how many people actually have monitors with that resolution. So I created a poll on this forum to see what everyone is playing at: http://forums.anandtech.com/showthread.php?t=2186935 Go vote!
 
Hardocp always seems to test games with top-of-the-line hardware at 2560x1600, even if the average frame rates are not what I would consider to be smooth gameplay. This got me thinking... I wonder how many people actually have monitors with that resolution. So I created a poll on this forum to see what everyone is playing at: http://forums.anandtech.com/showthread.php?t=2186935 Go vote!

I agree and voted in your poll. As I stated there, I'd imagine 1080p will eventually win out, as those monitors are all the rage now.

I'd like to see the game benchmarked on a wider range of cards; I'm sure someone will do it.
 
As others have pointed out, it's probably a driver optimization problem with nVidia; we'll see how it plays out later on.
 
Hardocp always seems to test games with top-of-the-line hardware at 2560x1600, even if the average frame rates are not what I would consider to be smooth gameplay. This got me thinking... I wonder how many people actually have monitors with that resolution. So I created a poll on this forum to see what everyone is playing at: http://forums.anandtech.com/showthread.php?t=2186935 Go vote!


40~44 avg FPS @ 2560x1600 with 16xAF and the FXAA High setting (with a single card).

That means anyone can play this game with a good mix of eye candy and get an enjoyable play experience. If you don't own a 580 or 6970, just turn things down a tiny bit and you're good to go.

Also notice the comments about the use of tessellation? Not excessively used, and used in places where it makes sense. The game ends up looking great and running fast on more or less any machine.

In other words, this is not a TWIMTBP for AMD, because the title runs well even with a single card, and at resolutions of 2560x1600 with everything on.
AMD, unlike Nvidia, doesn't resort to that sort of thing. This is just a case of Nvidia not having had a chance yet to optimise their drivers, while AMD has because they've been with the developers throughout the game's development (so AMD drivers are further along at present).
 
All I can say is that this game is friggin' awesome so far. I'm about 30 minutes in and loving it.

I've always liked the reviews at [H]ardocp. They will do a further review on performance with more resolutions in the follow-up article. This quick look was just to get some initial impressions and thoughts out.

It's clear that AMD has the winning ticket for Deus Ex: Human Revolution at launch. Good for AMD, and I'm glad that this game allows me to use tessellation and depth of field without killing my frame rate. I'm glad this isn't an nVidia-sponsored game, to be honest; nVidia's involvement would likely have leveraged their tessellation performance by screwing with its implementation in the game to a dubious degree.

With the rig in my sig I'm playing comfortably at 1080P.
 
It's clear that AMD has the winning ticket for Deus Ex: Human Revolution at launch. Good for AMD, and I'm glad that this game allows me to use tessellation and depth of field without killing my frame rate. I'm glad this isn't an nVidia-sponsored game, to be honest; nVidia's involvement would likely have leveraged their tessellation performance by screwing with its implementation in the game to a dubious degree.

QFT.
 
All I can say is that this game is friggin' awesome so far. I'm about 30 minutes in and loving it.

I've always liked the reviews at [H]ardocp. They will do a further review on performance with more resolutions in the follow-up article. This quick look was just to get some initial impressions and thoughts out.

It's clear that AMD has the winning ticket for Deus Ex: Human Revolution at launch. Good for AMD, and I'm glad that this game allows me to use tessellation and depth of field without killing my frame rate. I'm glad this isn't an nVidia-sponsored game, to be honest; nVidia's involvement would likely have leveraged their tessellation performance by screwing with its implementation in the game to a dubious degree.

With the rig in my sig I'm playing comfortably at 1080P.

You mean leverage their HD3D, since it is native to AMD; I can't use 3D stereo at all on an nVidia GPU in this title. nVidia's 3D Vision-ready titles work great with iZ3D or DDD on AMD hardware.
 
As others have pointed out, it's probably a driver optimization problem with nVidia; we'll see how it plays out later on.

Problem? No. Dragon Age 2 had a driver optimization problem, to the point that a GTX 580 couldn't even match an HD 6870. I don't think there is a driver problem with Nvidia here; the game was just made to work better with AMD drivers on its end. Yes, Nvidia can optimize, but AMD will probably optimize its drivers over the next couple of releases too.
 
We are walking down a great path in PC gaming, started by the almighty Nvidia. Get ready to buy 2 GFX cards in the future and dual boot.
 