AMD Radeon HD 6970 already benchmarked? Enough to beat GTX480 in Tessellation?

Page 15

T2k

Golden Member
Feb 24, 2004
1,665
5
81
The lighting is clearly different in the AMD screenie versus the NV one... just look at how saturated the guy's face is on the right (his left) in the AMD screenshot versus the Nvidia one. The Nvidia one keeps all the facial shadowing and structure; it's one big blob of a cheekbone in the AMD screenshot.

Personally, it looks to me like someone juiced up the contrast in the AMD screenie after the fact (Photoshop FTW), or did not set the same gamma for the game.

You guys might want to check out BFG10K's article:
nVidia 400 Series Image Quality Analysis

by BFG10K



Without a doubt, we've even got a Google-translated source saying as much, with a screenshot to prove it! What more proof does one need?

Keep in mind you are using a game in which NV was heavily involved, including direct payments and development help - according to their slightly arrogant, pimp-faced chief dev guy, they didn't even test it on ATI hardware before its release, but they got several Fermi cards well before it was released.

In short, it's probably the worst example for a fair comparison, unless you want to showcase how little tessellation this NV+4A duo was able to achieve - something that's probably the result of an NV PR idea, not an original engine feature (they only use it for facial expressions, nothing else).
 

T2k

Golden Member
Feb 24, 2004
1,665
5
81
I haven't yet seen anyone invalidate any of my arguments... especially not you, Mr "I keep repeating that AMD has fixed tessellation while nVidia uses shaders" no matter how often that was debunked.

Errr, no, you haven't debunked anything so far and I am still curious about your explanation of how "Half-Life 2 is a DX8.1 game with DX9 shaders", as you stated several times... :cool:;)


...anyone even remotely familiar with basic terms will facepalm after reading such a hilarious comment, and it's not much different with most of your comments about this subject either, I think. :awe:

And now you can go ahead and report me for... what for, exactly?

This entire thread-derail thread-crap campaign needs to stop.

Keep the scope of your postings in this thread on-topic or you will be infracted for thread-derail.

Moderator Idontcare
 
Last edited by a moderator:

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
It was a user on Chiphell... when they got into talking about Metro 2033, he posted those.
I assume the guy has both a 480 and a 5850. There are some in-game shots from him that show it much better; I didn't post those because they were pretty big pictures, which would have made it easier to compare the quality.

The texture blur in the Nvidia shot is due to a bug that occurs in DX11 Very high with tessellation enabled.

I've actually been corresponding with 4A Games on this issue, so hopefully it will be fixed soon.

Whoever it was that took those screenshots, tell him to turn off tessellation and the blur will be removed.
 

formulav8

Diamond Member
Sep 18, 2000
7,004
523
126
So now it's tessellation that is the main factor in choosing a video card? It's no longer the overall performance/value that matters? I don't remember getting that memo...
 

Will Robinson

Golden Member
Dec 19, 2009
1,408
0
0
Turning off tessellation wouldn't make that wood look better. (Ironic, given how hard it's being pushed here.)
The AMD card is superior in image quality.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Turning off tessellation wouldn't make that wood look better. (Ironic, given how hard it's being pushed here.)
The AMD card is superior in image quality.

I said it's a BUG.

I took screenshots myself and sent them to 4A Games:

DX11 Very high Tessellation enabled

DX11 Very high Tessellation disabled.

As you can clearly see, the screenshot with tessellation enabled has the blur effect while the one without tessellation doesn't. The blur effect is especially evident on the wood panels and the soldier's polo neck.

For a TWIMTBP title, Metro 2033 has a lot of bugs on Nvidia hardware. I reported another bug over a month ago, which has since been fixed: parallax occlusion mapping was not being enabled at the very high setting under DX11.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
@Carfax83 thanks for pointing that out.

Yeah, it looks like Nvidia users are better off not using tessellation in Metro 2033 if their textures end up this blurry. Ironic, in a thread promoting how great Nvidia tessellation is, lmao.

Carfax83, does this happen to Nvidia cards in other games with tessellation?
 

Scali

Banned
Dec 3, 2004
2,495
1
0
Errr, no, you haven't debunked anything so far and I am still curious about your explanation of how "Half-Life 2 is a DX8.1 game with DX9 shaders", as you stated several times... :cool:;)

I said Half-Life 2 is a DX9 game with DX8.1 shaders, and linked to the Anandtech article explaining it: http://www.anandtech.com/show/1144/6
I say that is debunked very well.
Anandtech said:
The GeForce FX 5600 (NV31) uses a code path that is internally referred to as dx82; this path is a combination of DX9 (pixel shader 2.0) and DX8.1 (pixel shader 1.4) code, and thus, doesn't look as good as what you'll see on the 5900 Ultra.

Although the 5900 Ultra performs reasonably well with the special NV3x mixed mode path, the 5600 and 5200 cards do not perform well at all. Valve's recommendation to owners of 5600/5200 cards is to run the DX8 (pixel shader 1.4) code path in order to receive playable performance under Half-Life 2. The performance improvement gained by dropping to the DX8 code path is seen most on the GeForce FX 5200; although, there is a slight improvement on the 5600 as you can see below:



The sacrifices that you encounter by running either the mixed mode path or the DX8 path are obviously visual. The 5900 Ultra, running in mixed mode, will exhibit some banding effects as a result of a loss in precision (FP16 vs. FP32), but still looks good - just not as good as the full DX9 code path. There is a noticeable difference between this mixed mode and the dx82 mode, as well as the straight DX8 path. For example, you'll notice that shader effects on the water aren't as impressive as they are in the native DX9 path.

Are the visual tradeoffs perceptive? Yes. The native DX9 path clearly looks better than anything else, especially the DX8.0/8.1 modes.

QED.
I think the mods can take it from here. You've reiterated this misinformation and slander towards me one time too many now. There are no excuses for your behaviour anymore; the link was already posted, and you deliberately ignore the facts.
 
Last edited:

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
1 more question:

the image blur thingy in Metro 2033 on textures (with tessellation turned on), does that improve benchmarking for Nvidia cards? And do people that benchmark it use this?

This probably sounds like a conspiracy theory, but... could it be done on purpose to make Nvidia cards benchmark better? It is a TWIMTBP title, so how could Nvidia not know of this problem?
 
Last edited:

jones377

Senior member
May 2, 2004
467
70
91
tess_small.png


On the topic of this thread, could it be that Cayman stores tessellated geometry in the video RAM and loads it for the next frame(s) instead of using the tessellator all the time? Perhaps along with some sort of on-chip buffer or cache. This would allow getting good performance with relatively weaker tessellator hardware while saving power at the same time. (Sort of like how the P4 uses a single decoder but has a trace cache that can deliver 3 ops/cycle.)

It could also explain Barts' tessellation performance if it only has the small buffer on-chip. It works as long as the total geometry doesn't exceed the buffer, meaning at lower tessellation factors. When it does, the tessellator goes to work again every frame, leading to the same performance characteristics as Cypress, minus the difference in clockspeed.

Does this make sense to anyone?
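
If it helps to picture it, here's a very rough C++-style sketch of the kind of scheme I mean (purely hypothetical: the keying on patch + tessellation factor, the byte budget and the fallback behaviour are all my own assumptions, not anything AMD has documented):

#include <cstddef>
#include <cstdint>
#include <unordered_map>
#include <vector>

// Hypothetical sketch of the idea above: cache the tessellator's output per
// patch, keyed by the tessellation factor it was generated with, and reuse it
// on later frames as long as it fits some budget. Names and sizes are made up.
struct TessellatedPatch {
    std::vector<float> vertices;   // expanded geometry produced by the tessellator
};

class TessCache {
public:
    explicit TessCache(std::size_t budgetBytes) : budget(budgetBytes) {}

    const TessellatedPatch& get(std::uint64_t patchId, int tessFactor) {
        const std::uint64_t key = (patchId << 8) | static_cast<std::uint8_t>(tessFactor);
        auto it = cache.find(key);
        if (it != cache.end())
            return it->second;                        // hit: no tessellation work this frame

        TessellatedPatch p = runTessellator(tessFactor);
        used += p.vertices.size() * sizeof(float);
        if (used > budget) {                          // budget exceeded (high factors):
            cache.clear();                            // effectively back to re-tessellating
            used = p.vertices.size() * sizeof(float); // every frame, like Cypress
        }
        return cache.emplace(key, std::move(p)).first->second;
    }

private:
    static TessellatedPatch runTessellator(int tessFactor) {
        // Stand-in for the fixed-function tessellator: emit vertex data roughly
        // proportional to the square of the tessellation factor.
        TessellatedPatch p;
        p.vertices.resize(static_cast<std::size_t>(tessFactor) * tessFactor * 3);
        return p;
    }

    std::unordered_map<std::uint64_t, TessellatedPatch> cache;
    std::size_t budget;
    std::size_t used = 0;
};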
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
1 more question:

the image blur thingy in Metro 2033 on textures, does that improve benchmarking for Nvidia cards?

I doubt it. The blur effect was much worse when the game first came out, and although it's been reduced substantially over time via patches and what not, the performance penalty is still the same as it's always been.

Also, turning on 4xMSAA in very high mode results in blurry textures on AMD hardware as well.
 

T2k

Golden Member
Feb 24, 2004
1,665
5
81
I said Half-Life 2 is a DX9 game with DX8.1 shaders,

Nope, it was the other way around but this version is even more silly - and dead wrong, of course. :cool:

and linked to the Anandtech article explaining it: http://www.anandtech.com/show/1144/6

Wow, did you read your own link? Aside from the fact that it's an engine preview, not the final game, it clearly says the engine uses the proper PS2.0 codepath for every DX9 card except NV's craptastic ultra-high-end FX5900 Ultra, which got its own "DX8.2" mixed codepath, and of course the miserable FX5200/5600, which weren't able to run *any* DX9 codepath at an acceptable framerate and stuck with the backward-compatible DX8.1 codepath.

Anand laid it out very well, and it pretty clearly shows you're simply wrong on all counts, pal:

During Gabe Newell's presentation, he insisted that they [Valve] have not optimized or doctored the engine to produce these results. It also doesn't make much sense for Valve to develop an ATI-specific game simply because the majority of the market out there does have NVIDIA based graphics cards, and it is in their best interest to make the game run as well as possible on NVIDIA GPUs.

Gabe mentioned that the developers spent 5x as much time optimizing the special NV3x code path (mixed mode) as they did optimizing the generic DX9 path (what ATI's DX9 cards use). Thus, it is clear that a good attempt was made to get the game to run as well as possible on NVIDIA hardware.

To those that fault Valve for spending so much time and effort trying to optimize for the NV3x family, remember that they are in the business to sell games and with the market the way it is, purposefully crippling one graphics manufacturer in favor of another would not make much business sense.

Truthfully, we believe that Valve made an honest attempt to get the game running as well as possible on NV3x hardware but simply ran into other unavoidable issues (which we will get to shortly)

(...)

We briefly mentioned the Mixed Mode of operation for NV3x GPUs that Valve implemented in Half-Life 2, but there is much more to it than just a special NV3x code path. In fact, the mixed mode NV3x code path was really only intended for the GeForce FX 5900 Ultra (NV35). The mainstream FX chips (5200/5600) require a slightly different code path.


Here you can see the 40% performance boost NVIDIA gets from the special NV3x code path.

The GeForce FX 5600 (NV31) uses a code path that is internally referred to as dx82; this path is a combination of DX9 (pixel shader 2.0) and DX8.1 (pixel shader 1.4) code, and thus, doesn't look as good as what you'll see on the 5900 Ultra.

Although the 5900 Ultra performs reasonably well with the special NV3x mixed mode path, the 5600 and 5200 cards do not perform well at all. Valve's recommendation to owners of 5600/5200 cards is to run the DX8 (pixel shader 1.4) code path in order to receive playable performance under Half-Life 2.

Half-Life 2 was a full DX9 game, running full DX9 shaders on every DX9 card, period.
Nvidia can only say thank you to Valve - Nvidia's makeshift, half-broken DX9 cards weren't able to do it in any acceptable way, hence Valve helping them out with all sorts of hackery.
Of course, it was properly thanked in true Nvidia manner: they attacked Gabe Newell personally, Valve as a company, and the game as well. Typical disgusting Nvidia behavior...

I say that is debunked very well.

I say you didn't even read your own link. :cool:


This entire thread-derail thread-crap campaign needs to stop.

Keep the scope of your postings in this thread on-topic or you will be infracted for thread-derail.

Moderator Idontcare
 
Last edited by a moderator:

Scali

Banned
Dec 3, 2004
2,495
1
0
Wow, did you read your own link? Aside from the fact that it's an engine preview, not the final game, it clearly says the engine uses the proper PS2.0 codepath for every DX9 card except NV's craptastic ultra-high-end FX5900 Ultra, which got its own "DX8.2" mixed codepath, and of course the miserable FX5200/5600, which weren't able to run *any* DX9 codepath at an acceptable framerate and stuck with the backward-compatible DX8.1 codepath.

So effectively you're saying the GeForce FX series wasn't running the DX9 path by default? And the reason for that is the poor performance in the DX9 path, because of the poor SM2.0 implementation?
Funny, that is exactly what I said.

Half-Life 2 was a full DX9 game, running full DX9 shaders on every DX9 cards, period.

But you just said:
Wow, did you read your own link? Aside from the fact that it's an engine preview, not the final game, it clearly says the engine uses the proper PS2.0 codepath for every DX9 card except NV's craptastic ultra-high-end FX5900 Ultra, which got its own "DX8.2" mixed codepath, and of course the miserable FX5200/5600, which weren't able to run *any* DX9 codepath at an acceptable framerate and stuck with the backward-compatible DX8.1 codepath.

Someone is contradicting themselves here, and I'm pretty sure it's not me.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
@Scali what he said was that exceptions were made for Nvidia cards, so they could play the game.

You're probably both saying the same thing and speaking past one another.
 

T2k

Golden Member
Feb 24, 2004
1,665
5
81
So effectively you're saying the GeForce FX series wasn't running the DX9 path by default?

Errr, I quoted Anand saying the miserable 5200/5600 series wasn't recommended to run any DX9 at all, while the uber 5900 Ultra needed its own "mixed mode" DX9 due to its craptastic performance in pure PS2.0 mode.

And the reason for that is the poor performance in the DX9 path, because of the poor SM2.0 implementation?

If you mean Valve's implementation, then no, absolutely not - it was the non-standard hack job Nvidia tried to sell as DX9 in the 5900 and its derivatives, following the utter disaster of the FX5800 (aka the Dustbuster) against the 9700 Pro. :)

Funny, that is exactly what I said.

No, you claimed Half-Life 2 was a DX9 game running DX8.1 shaders (or sometimes the other way around, irrelevant) - which is simply not true, pure and simple.

If you don't see the difference then I'm afraid I cannot help you and probably nobody here on this forum can, pal.

But you just said:


Someone is contradicting themselves here, and I'm pretty sure it's not me.

No, you're just trying to move the goalpost, as always when someone proves you are talking silly things. ;)
No worries, I don't take it personally... :cool:


This entire thread-derail thread-crap campaign needs to stop.

Keep the scope of your postings in this thread on-topic or you will be infracted for thread-derail.

Moderator Idontcare
 
Last edited by a moderator:

T2k

Golden Member
Feb 24, 2004
1,665
5
81
@Scali what he said was that exceptions were made for Nvidia cards, so they could play the game.

Well, back then he first said it was a DX8.1 game running DX9 shaders, or vice versa - both are false, as it was a fully DX9 game. The fact that it had, like every other game, a backward-compatible codepath was more of an NV-only favor, because Radeon 8500 owners would get this game for free anyway if they upgraded to a Radeon 9500-9700 series card...

You're probably both saying the same thing and speaking past one another.

Now he does, of course - good ol' basic forum-trickery. :thumbsup::cool:


This entire thread-derail thread-crap campaign needs to stop.

Keep the scope of your postings in this thread on-topic or you will be infracted for thread-derail.

Moderator Idontcare
 
Last edited by a moderator:

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Well, back then he first said it was a DX8.1 game running DX9 shaders, or vice versa - both are false, as it was a fully DX9 game. The fact that it had, like every other game, a backward-compatible codepath was more of an NV-only favor, because Radeon 8500 owners would get this game for free anyway if they upgraded to a Radeon 9500-9700 series card...



Now he does, of course - good ol' basic forum-trickery. :thumbsup::cool:

Could you link me to that post?
 

Scali

Banned
Dec 3, 2004
2,495
1
0
Could you link me to that post?

I think it started with this one:
http://forums.anandtech.com/showpost.php?p=30633745&postcount=116

And/or this one:
http://forums.anandtech.com/showpost.php?p=30648804&postcount=322

Clearly I am speaking of a DX9 game using a DX8.1 path on GeForce FX, because SM2.0 performance is not good enough.

T2k has been attacking me on that point ever since, for no obvious reason.
Posts like this:
http://forums.anandtech.com/showthread.php?p=30633873&highlight=dx8#post30633873

I've explained it multiple times, linked to the Anandtech article... but he keeps going.
Clearly I have always said exactly the same, which is also what the Anandtech article says.
Which should be obvious anyway... as a developer with hands-on experience with DX8 and DX9 on both Radeons and GeForce FX, I wouldn't make mistakes on basic subjects such as this one.
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
You just shot yourself in the foot


What you said:
"Half-Life 2 doesn't have a GeForce FX 'optimized' path. It just runs a DX8.1 path for pretty much everything. It basically runs the Radeon 8500-optimized path"

What the article above says:
"Gabe mentioned that the developers spent 5x as much time optimizing the special NV3x code path (mixed mode) as they did optimizing the generic DX9 path (what ATI's DX9 cards use)."


So really, what will you come up with now?

This entire thread-derail thread-crap campaign needs to stop.

Keep the scope of your postings in this thread on-topic or you will be infracted for thread-derail.

Moderator Idontcare
 
Last edited by a moderator:

Genx87

Lifer
Apr 8, 2002
41,091
513
126
You just shot yourself in the foot


What you said:
"Half-Life 2 doesn't have a GeForce FX 'optimized' path. It just runs a DX8.1 path for pretty much everything. It basically runs the Radeon 8500-optimized path"

What the article above says:
"Gabe mentioned that the developers spent 5x as much time optimizing the special NV3x code path (mixed mode) as they did optimizing the generic DX9 path (what ATI's DX9 cards use)."


So really, what will you come up with now?

What are you arguing about? It looks like Scali is saying the NV30 sucked balls in DX9 and they used a different codepath so it could run the game. What is your point? You think they were using a DX9 codepath for the NV30? It sounds to me, from Gabe's own comments, like they probably spent 5x the time optimizing a mixed-mode path, meaning they were probably interjecting DX9 where it was possible and dropping back to an 8.1 codepath for the rest on that card.

This is really a stupid argument to watch unfold, especially considering who keeps instigating it. It would really be nice if he went on vacation again. The board was much better when he and wreckage were sent packing.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
It obviously is, not to mention he couldn't explain anything, hence using Scali's (false) arguments as a cover - that's even more juvenile, I think.



In the future - yes, obviously, thanks to DX11.
In the immediate future - no, obviously, thanks to DX11.

In short, it will shine on next-gen cards, and this is why AMD was so smart not to bother with tessellation in the 58xx series: they knew they had the 69xx series in the pipeline a year later, and nobody writes a new engine in less than 2-3 years. So, to counter NV's PR nonsense, there's the 69xx series from now on, and real games with heavy tessellation won't arrive before late 2011-2012, i.e. nothing en masse before the next-gen CryEngine, Unreal Engine, Frostbite, X-Ray etc. implement it (probably the NV-supported CryEngine 3 will be the first one to make heavy use of it).

Errr, no, you haven't debunked anything so far and I am still curious about your explanation of how "Half-Life 2 is a DX8.1 game with DX9 shaders", as you stated several times... :cool:;)


...anyone even remotely familiar with basic terms will facepalm after reading such a hilarious comment, and it's not much different with most of your comments about this subject either, I think. :awe:

And now you can go ahead and report me for... what for, exactly?

Nope, it was the other way around but this version is even more silly - and dead wrong, of course. :cool:



Wow, did you read your own link? Aside from the fact that it's an engine preview, not the final game, it clearly says the engine uses the proper PS2.0 codepath for every DX9 card except NV's craptastic ultra-high-end FX5900 Ultra, which got its own "DX8.2" mixed codepath, and of course the miserable FX5200/5600, which weren't able to run *any* DX9 codepath at an acceptable framerate and stuck with the backward-compatible DX8.1 codepath.

Anand laid it out very well, and it pretty clearly shows you're simply wrong on all counts, pal:



Half-Life 2 was a full DX9 game, running full DX9 shaders on every DX9 card, period.
Nvidia can only say thank you to Valve - Nvidia's makeshift, half-broken DX9 cards weren't able to do it in any acceptable way, hence Valve helping them out with all sorts of hackery.
Of course, it was properly thanked in true Nvidia manner: they attacked Gabe Newell personally, Valve as a company, and the game as well. Typical disgusting Nvidia behavior...



I say you didn't even read your own link. :cool:

Errr, I quoted Anand saying the miserable 5200/5600 series wasn't recommended to run any DX9 at all, while the uber 5900 Ultra needed its own "mixed mode" DX9 due to its craptastic performance in pure PS2.0 mode.



If you mean Valve's implementation, then no, absolutely not - it was the non-standard hack job Nvidia tried to sell as DX9 in the 5900 and its derivatives, following the utter disaster of the FX5800 (aka the Dustbuster) against the 9700 Pro. :)



No, you claimed Half-Life 2 was a DX9 game running DX8.1 shaders (or sometimes the other way around, irrelevant) - which is simply not true, pure and simple.

If you don't see the difference then I'm afraid I cannot help you and probably nobody here on this forum can, pal.



No, you're just trying to move the goalpost, as always when someone proves you are talking silly things. ;)
No worries, I don't take it personally... :cool:

Well, back then he first said it was a DX8.1 game running DX9 shaders, or vice versa - both are false, as it was a fully DX9 game. The fact that it had, like every other game, a backward-compatible codepath was more of an NV-only favor, because Radeon 8500 owners would get this game for free anyway if they upgraded to a Radeon 9500-9700 series card...



Now he does, of course - good ol' basic forum-trickery. :thumbsup::cool:

You just shot yourself in the foot


What you said:
"Half-Life 2 doesn't have a GeForce FX 'optimized' path. It just runs a DX8.1 path for pretty much everything. It basically runs the Radeon 8500-optimized path"

What the article above says:
"Gabe mentioned that the developers spent 5x as much time optimizing the special NV3x code path (mixed mode) as they did optimizing the generic DX9 path (what ATI's DX9 cards use)."


So really, what will you come up with now?

This entire thread-derail thread-crap campaign needs to stop.

Keep the scope of your postings in this thread on-topic or you will be infracted for thread-derail.

Moderator Idontcare
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
I think it started with this one:
http://forums.anandtech.com/showpost.php?p=30633745&postcount=116

And/or this one:
http://forums.anandtech.com/showpost.php?p=30648804&postcount=322

Clearly I am speaking of a DX9 game using a DX8.1 path on GeForce FX, because SM2.0 performance is not good enough.

T2k has been attacking me on that point ever since, for no obvious reason.
Posts like this:
http://forums.anandtech.com/showthread.php?p=30633873&highlight=dx8#post30633873

I've explained it multiple times, linked to the Anandtech article... but he keeps going.
Clearly I have always said exactly the same, which is also what the Anandtech article says.
Which should be obvious anyway... as a developer with hands-on experience with DX8 and DX9 on both Radeons and GeForce FX, I wouldn't make mistakes on basic subjects such as this one.

I know, I just wanted to see if he could post anything to back up his claims... he couldn't... business as usual... a lot of ad hominem, but no substance ;)
 

Scali

Banned
Dec 3, 2004
2,495
1
0
On the topic of this thread, could it be that Cayman stores tessellated geometry in the video RAM and loads it for the next frame(s) instead of using the tessellator all the time? Perhaps along with some sort of on-chip buffer or cache. This would allow getting good performance with relatively weaker tessellator hardware while saving power at the same time. (Sort of like how the P4 uses a single decoder but has a trace cache that can deliver 3 ops/cycle.)

In theory they could do that, but it would be cheating.
Namely, as AMD themselves promote tessellation: it should be adaptive.
That means the level of tessellation is determined at runtime by parameters such as the distance or the projected size on screen.
This can (and generally will) change every frame, so trying to buffer the geometry is pretty useless... unless of course you're going to cheat and not actually do adaptive tessellation every frame, but only every X frames, re-using the buffered geometry for the remaining X-1 frames rather than generating a proper adaptive set of geometry.
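
To make "adaptive" concrete, here's a minimal C++ sketch of a distance-based tessellation factor (a toy formula with made-up constants, not taken from any actual engine or driver); the only point is that the factor, and therefore the generated geometry, changes whenever the camera or the object moves:

#include <algorithm>
#include <cmath>

// Toy example of an adaptive tessellation factor: closer patches get more
// subdivision, distant ones less. Constants are arbitrary; D3D11 caps the
// hull-shader tessellation factor at 64.
int adaptiveTessFactor(float distanceToCamera) {
    const float nearDist  = 2.0f;
    const float farDist   = 100.0f;
    const int   maxFactor = 64;
    const int   minFactor = 1;

    float t = std::clamp((distanceToCamera - nearDist) / (farDist - nearDist), 0.0f, 1.0f);
    int factor = static_cast<int>(std::lround(maxFactor * (1.0f - t)));
    return std::max(minFactor, factor);
}

// Because distanceToCamera differs from one frame to the next, the factor
// computed on frame N generally doesn't match frame N+1, so any geometry
// buffered on frame N is stale, unless you deliberately skip re-evaluating it
// for X frames, which is the "cheat" described above.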