
Deus Ex Performance Thread

What are you talking about?

DE3 is a Gaming Evolved title and it does support GLORIOUS HD3D
:whiste:

And I would be willing to bet that within two weeks Nvidia will have working 3D Vision drivers for DE3; the devs didn't lock Nvidia out, they just didn't enable S3D for Nvidia cards. I wonder why they didn't bother?

3D Vision drivers should be ready about the time I start my 3rd playthrough - after playing it again in 3x1 Eyefinity with multi-GPU (I'd have to build a frame for 6x2 as my 6th LCD is only 22" and would look out of place amongst the five 24-inchers)
:wub:

Anyone else finding a single HD 6970 a bit weak for playing DeuSex at 2560x1600?
Ha, I see what you did hear :sneaky:
 
I really don't hear anything, and I won't press my contacts for info - let's just say that I deduce it from the "chatter"

The guys that have been playing in S3D since the Revelator glasses have got 3D Vision half-a$$ed working in DE3, and they are the ones who believe Nvidia can fix it in the drivers. Of course Nvidia will do to AMD what was done to them as they cry "unfair" - and then, when the drivers they are probably working on now are ready for 3D Vision, they will proclaim superiority again.

Modus Operandi


No wonder *both* companies are evidently unhappy with me.
:whiste:
 
Jeez, calm down apop, paranoid much?
 
5 frames per second is not close?

77 vs 82?

A 5fps difference is pretty big when we're talking CPUs. Considering an i7-920 @ 2.0GHz was able to sustain 73fps and a stock 920 could do 74fps, I'd say the payoff for an overclock to 3.8GHz isn't very impressive.

Heck, even the stock PII X4-980 can beat a heavily overclocked i7-920.
 
I understand what you are saying, but 5fps won't make a difference in how the game plays, nor will it allow me to increase the graphics settings. On a graph it's something to take note of, but when actually playing this game it won't be noticed.

I can see 10fps making a difference when you are talking 20 vs 30fps, but 77 vs 82???

I don't believe the difference is as large as your post made it seem.

As for the stock Phenom quad beating a heavily overclocked i7, I would have to see this to believe it, because it goes against everything I've seen in the past 3 years.
 
5fps will not make a difference going from 50 to 55fps, but it can make a difference from 20 to 25 or 25 to 30.
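For what it's worth, looking at frame time instead of fps makes this concrete: the same 5fps gap buys very different amounts of render time depending on where you start. A quick back-of-the-envelope in Python (function names are just illustrative):

```python
# Convert fps deltas to frame-time deltas (ms): the same 5 fps gap
# shrinks dramatically as the baseline frame rate rises.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

def gap_ms(low_fps: float, high_fps: float) -> float:
    return frame_time_ms(low_fps) - frame_time_ms(high_fps)

print(f"77 -> 82 fps: {gap_ms(77, 82):.2f} ms per frame")   # ~0.79 ms
print(f"20 -> 25 fps: {gap_ms(20, 25):.2f} ms per frame")   # 10.00 ms
print(f"25 -> 30 fps: {gap_ms(25, 30):.2f} ms per frame")   # ~6.67 ms
```

So 77 vs 82 is under a millisecond per frame, while 20 vs 25 is a full 10ms - more than a tenth of the frame budget at those speeds.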
 
I agree, but the numbers in question are 77 vs 82. I already pointed out that at lower fps it will make a bigger difference.
 
Fair enough - I've edited the original post.

I guess the more important thing than the raw fps is the minimal benefit of overclocking. That was probably the most surprising thing, given that there was a CPU limitation (i.e., the stock 2500K was way ahead of the stock 920).
 
Agree with you here. I was surprised that overclocking didn't do much for this game; they didn't really show overclocked numbers for SB, so I don't know if it would see a bigger benefit there.

Given the clock speed difference and slight IPC difference, I expect the stock 2500K to be ahead of the stock 920, but at close to the same clock speed, games don't show a huge difference.

One of the things I do like about this game is that it seems to have been optimized properly and runs well on most hardware while still offering DX11.

If only everyone was doing it!
 
There is a massive thread about stuttering on the Steam forums. It seems to be an issue with the way the game only caches stuff on the fly, or something like that.
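If that's the cause, the mechanism would look something like this toy sketch (all names and timings are invented for illustration): an asset loaded synchronously the first time it's needed blows the frame budget once, then is cheap from the cache afterwards - which is exactly the "hitch on entering a new area" pattern people describe:

```python
import time

ASSET_LOAD_MS = 40.0      # hypothetical disk/decompress cost for one asset
FRAME_BUDGET_MS = 16.7    # ~60 fps frame budget

cache = {}

def get_asset_blocking(name: str) -> str:
    """On-demand load on the render thread: the frame stalls while we wait."""
    if name not in cache:
        time.sleep(ASSET_LOAD_MS / 1000.0)   # simulated I/O hitch
        cache[name] = f"data:{name}"
    return cache[name]

def frame(needed: str) -> float:
    """Render one frame that touches `needed`; return its cost in ms."""
    start = time.perf_counter()
    get_asset_blocking(needed)
    return (time.perf_counter() - start) * 1000.0

first = frame("new_area_texture")    # cold cache: ~40 ms -> visible stutter
second = frame("new_area_texture")   # warm cache: well under budget
print(f"cold: {first:.1f} ms, warm: {second:.1f} ms (budget {FRAME_BUDGET_MS} ms)")
```

The usual fix is to stream assets on a background thread or prefetch ahead of the player, so the render thread never blocks like this.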
 
One of the things I do like about this game is that it seems to have been optimized properly and runs well on most hardware while still offering DX11.

If only everyone was doing it!

We can only have this kind of performance if the game doesn't stress the hardware, which means that DE3 doesn't implement a lot of tessellation and perhaps other DX11 features.

No matter how efficient you make your code, performance will not be the same if you tessellate 5 objects versus 20.
Since we don't have any info to the contrary as of now, I believe DE3 only puts mild tessellation on humanoid models, making it easier on the hardware.
 
There is a massive thread about stuttering on the Steam forums. It seems to be an issue with the way the game only caches stuff on the fly, or something like that.

I haven't seen this yet; good to know.

We can only have this kind of performance if the game doesn't stress the hardware, which means that DE3 doesn't implement a lot of tessellation and perhaps other DX11 features.

No matter how efficient you make your code, performance will not be the same if you tessellate 5 objects versus 20.
Since we don't have any info to the contrary as of now, I believe DE3 only puts mild tessellation on humanoid models, making it easier on the hardware.

I think this is acceptable because DX11 is still pushing its way through and most games are still only DX9 or 10. I would rather each new game slowly start using these features than have one game come out with everything maxed, requiring video cards with 4th-gen tessellation units, where you have to wait for ATI or NV to release a next-gen part just to play it.

It's OK to have all these extra settings and features, but I believe the number one priority for these devs is to make the game playable on a huge number of configurations. It sucks for those of us who like to keep our hardware up to date, but that's life.
 
We can only have this kind of performance if the game doesn't stress the hardware, which means that DE3 doesn't implement a lot of tessellation and perhaps other DX11 features.

No matter how efficient you make your code, performance will not be the same if you tessellate 5 objects versus 20.
Since we don't have any info to the contrary as of now, I believe DE3 only puts mild tessellation on humanoid models, making it easier on the hardware.

I'll take "mild" tessellation that's actually visible over "a lot" of tessellation of flat and invisible objects.
 
I think this is acceptable because DX11 is still pushing its way through and most games are still only DX9 or 10. I would rather each new game slowly start using these features than have one game come out with everything maxed, requiring video cards with 4th-gen tessellation units, where you have to wait for ATI or NV to release a next-gen part just to play it.

It's OK to have all these extra settings and features, but I believe the number one priority for these devs is to make the game playable on a huge number of configurations. It sucks for those of us who like to keep our hardware up to date, but that's life.

One of the biggest differences between playing on PC vs consoles is the ability to customize the game's settings: we can choose what resolution we want to play at and how high to set the graphics, so even with a lower-end card you can actually play the game.

I would like developers to do the same thing with tessellation and give us the ability to change the tessellation level inside the game settings, so if your card can't handle it you can lower it, just like AMD's tessellation slider in the drivers.

This way, people who have higher-end cards can enjoy higher levels of tessellation and graphics in general, and people who have lower-end cards can play the game too.

I'll take "mild" tessellation that's actually visible over "a lot" of tessellation of flat and invisible objects.

Relax, I'm not talking about DE3 vs Crysis 2 tessellation, just tessellation in general 😉
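An in-game version of AMD's driver slider could be as simple as clamping whatever tessellation factor the engine requests to a user-chosen ceiling. A hypothetical sketch (the level names and caps are made up):

```python
# Hypothetical in-game tessellation setting: cap the factor the engine
# asks for, the same idea as AMD's driver-level tessellation slider.
TESS_LEVELS = {"off": 1, "low": 4, "medium": 16, "high": 64}  # max factor per level

def effective_tess_factor(requested: int, user_setting: str) -> int:
    """Clamp the engine's requested tessellation factor to the user's cap."""
    cap = TESS_LEVELS[user_setting]
    return max(1, min(requested, cap))

print(effective_tess_factor(32, "high"))  # 32 -- high-end card, full detail
print(effective_tess_factor(32, "low"))   # 4  -- weaker card, capped
print(effective_tess_factor(32, "off"))   # 1  -- tessellation disabled
```

High-end cards keep the full detail, while lower-end cards get the same game with cheaper geometry.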
 
Ideally, I'd like developers to offer scalable engines and features that simply scale depending on the individual's hardware - either AMD or Nvidia.

If one has higher-end hardware, they are rewarded with either more performance or extra goodies at a playable frame rate. The idea of "max settings for all" doesn't push the envelope for me and is closer to console gaming.
 
One of the biggest differences between playing on PC vs consoles is the ability to customize the game's settings: we can choose what resolution we want to play at and how high to set the graphics, so even with a lower-end card you can actually play the game.

I would like developers to do the same thing with tessellation and give us the ability to change the tessellation level inside the game settings, so if your card can't handle it you can lower it, just like AMD's tessellation slider in the drivers.

This way, people who have higher-end cards can enjoy higher levels of tessellation and graphics in general, and people who have lower-end cards can play the game too.



Relax, I'm not talking about DE3 vs Crysis 2 tessellation, just tessellation in general 😉

I think this is a great idea and wish they would implement it ASAP. Unfortunately, game development keeps heading down the "we need to dumb everything down for the console crowd" path, as if going into menus, changing settings, and knowing what those settings do is a little too complex for their minds...
 
Ideally, I'd like developers to offer scalable engines and features that simply scale depending on the individual's hardware - either AMD or Nvidia.

If one has higher-end hardware, they are rewarded with either more performance or extra goodies at a playable frame rate. The idea of "max settings for all" doesn't push the envelope for me and is closer to console gaming.
It can't be done yet with dynamic tessellation, as the tessellation changes the look of the image. An object may appear disjointed, like a broken doll, if its parts are not displayed at the same tessellation levels.
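For anyone curious, the usual way engines avoid those "broken doll" seams is to derive each edge's tessellation factor from data both neighbouring patches share (for example, the edge's own midpoint), so the two sides of a seam always compute the same value. A rough sketch with an invented distance heuristic:

```python
import math

def edge_tess_factor(v0, v1, camera, max_factor=64.0):
    """Derive the factor from the shared edge itself (its midpoint's distance
    to the camera), so both patches touching this edge compute the same value
    and the mesh doesn't crack apart along the seam."""
    mid = tuple((a + b) / 2.0 for a, b in zip(v0, v1))
    dist = math.dist(mid, camera)
    # closer edge -> more subdivision; the constants are purely illustrative
    return max(1.0, min(max_factor, 100.0 / (dist + 1.0)))

cam = (0.0, 0.0, 0.0)
shared = ((1.0, 0.0, 5.0), (3.0, 0.0, 5.0))
# Both neighbouring patches evaluate the same two vertices, so they
# agree on the subdivision along the seam (vertex order doesn't matter):
left_patch = edge_tess_factor(shared[0], shared[1], cam)
right_patch = edge_tess_factor(shared[1], shared[0], cam)
print(left_patch == right_patch)  # True
```

A global user-facing cap (like the slider idea above) works fine with this, since clamping every edge by the same ceiling keeps adjacent patches in agreement.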
 
There is a massive thread about stuttering on the Steam forums. It seems to be an issue with the way the game only caches stuff on the fly, or something like that.
(Strangely) I have no stuttering to report for either my GTX 580 or my HD 6970. 😛

At 2560x1600 I don't get the smoothness that HardOCP reports for a single HD 6970 (I completed the game using it). There are slowdowns in graphically intense scenes - no stuttering to report, however.

I wonder what bench HardOCP used. The scene that AMD suggests isn't very demanding.

Who is reporting these issues? Anything in common in their systems?

Maybe they should try another motherboard
:biggrin:
 
It can't be done yet with dynamic tessellation, as the tessellation changes the look of the image. An object may appear disjointed, like a broken doll, if its parts are not displayed at the same tessellation levels.

Why can't a lower-quality maximum tessellation setting simply differ from a higher-quality one? A higher setting could offer more areas with tessellation, and a lower setting fewer.
 
Is my 9600GT DX11? I ask because in the video options it says DX11 and there is no drop-down menu to change it - like it is locked on DX11. I thought you had the option of DX9 or 11 even on the newer cards.
 
As far as I know, the game does not have a DX9 mode. It'll run in DX10, which is what the 9600GT is.
 