([H]ardOCP) Dirt 3 gameplay review. Radeon vs GTX


Carfax83

Diamond Member
Nov 1, 2010
So you really went for the "nvidia has better image quality than amd" argument?

I like how you're trying to put words in my mouth and take the discussion down a different path :sneaky:

I never said that Nvidia has better image quality than AMD. I said that AMD's default driver settings are more focused on performance than Nvidia's.

In truth, setting AMD's drivers to maximum quality will probably look very similar to Nvidia's max quality settings (slightly worse, imo), but from what I've read, performance is noticeably lower than at the default setting, since the various filtering optimizations are then turned off.

AMD uses aggressive optimizations in its default driver settings (which most reviewers use) to gain a few FPS here and there in benchmarks, whereas Nvidia doesn't in its default setting. Nvidia has a similar optimization, but it isn't enabled by default.

What I find funny though is that AMD never had any intention of providing its customers with a way of turning off these optimizations... until they were caught LOL! :awe:

Fiasco is such a funny word... How are the 580s holding up? I mean, those things probably cost about $200 more each, and yet they still lose out to the 6970 at the resolutions where you'd need that kind of juice.

They're holding up fine and perform great at the resolution I play at, which is 2560x1600... You couldn't pay me to use the turd that is Crossfire again though, regardless of how much money I'd save.

And FYI, I never paid $500 for mine... more like $400, since I bought them from a friend ^_^
 

3DVagabond

Lifer
Aug 10, 2009
Why is it that HardOCP's benchmarks always seem to be way off compared to other websites'? Is it because of the driver settings they use? It's well known that AMD's default driver settings are geared more towards performance than quality, unlike Nvidia's.

And Benchmark3D's benches show the game running faster on dual cores than on quad cores, which directly conflicts with TechSpot's and PCGamesHardware's reviews.


If you actually read their reviews, you would know.

This is not "well known" and it is also false. Both have their strengths and weaknesses, but overall image quality between the two is often impossible to tell apart.

^^This.



Yes I know they have a different testing methodology and don't use scripted benchmarks, but still, I don't think it would account for such a large difference.



Right... you must have missed the fiasco where AMD was caught using FP16 demotion in games to boost performance, among other things, which led to them being forced to implement a way to turn off these optimizations in the drivers, as no such option had existed beforehand.
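For context on what FP16 demotion actually does: the driver swaps the 64-bit FP16 render target a game requests for a packed 32-bit format, roughly halving render-target bandwidth. A back-of-the-envelope sketch in Python (the format sizes are real; the resolution and fps are just illustrative):

Code:
# Why FP16 demotion boosts performance: the driver substitutes a packed
# 32-bit format for the 64-bit FP16 render target the game asked for,
# halving the bytes written per pixel. Numbers below are illustrative.
BYTES_PER_PIXEL = {
    "R16G16B16A16_FLOAT (what the game requested)": 8,
    "R11G11B10_FLOAT (what the driver substitutes)": 4,
}

width, height, fps = 1920, 1080, 60  # assumed test scenario
pixels = width * height

for fmt, bpp in BYTES_PER_PIXEL.items():
    gb_per_s = pixels * bpp * fps / 1e9
    print(f"{fmt}: {gb_per_s:.2f} GB/s per full-screen write pass")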

There's a reason why some websites now use high quality settings for AMD's drivers, as opposed to the default settings.

But I'm sure you won't let these minor details sway your beliefs.

You ask a question, someone answers it, and then you tell them off. Classic. :rolleyes:

I suggest you take AnandThenMan's advice and read the review. They talk about some pretty big differences in performance from changing IQ settings (ambient occlusion and shadows, for example). Or are you implying, by bringing up the BS about AMD drivers, that [H] is exploiting some sort of driver cheats? Unless of course you have some proof that none of us know about? Don't bother linking to something unless it's with the same drivers [H] is using, if you want to claim some sort of driver/performance manipulation.
 

Carfax83

Diamond Member
Nov 1, 2010
You ask a question, someone answers it, and then you tell them off. Classic. :rolleyes:

You call that telling him off? :\

I suggest you take AnandThenMan's advice and read the review. They talk about some pretty big differences in performance from changing IQ settings (ambient occlusion and shadows, for example)

And? The point I was making is that compared to other review sites that conducted benchmarks on Dirt 3, HardOCP's results seem to be off by quite a margin...

And then you reply by saying the discrepancies are due to IQ settings? As far as I know, aside from the excessive AA, TechSpot used max quality settings in their review, so the settings alone shouldn't be able to account for the discrepancies.

And I'm talking about their APPLES TO APPLES performance, not highest playable, in case you didn't realize.

Or are you implying, by bringing up the BS about AMD drivers, that [H] is exploiting some sort of driver cheats?

I doubt HardOCP themselves would knowingly exploit driver cheats to make AMD look good, but it's a known fact that AMD's filtering optimizations can add as much as a 10% increase in performance in some situations, so I'm wondering if the reported performance differences between HardOCP and the other websites may be due to them using the default settings in the driver panel, which use the aggressive filtering optimizations.

Unless of course you have some proof that none of us know about? Don't bother linking to something unless it's with the same drivers [H] is using, if you want to claim some sort of driver/performance manipulation.

I don't care enough to look for, much less provide evidence that AMD is cheating in Dirt 3.

I'm merely postulating, based on known facts concerning AMD's use of filtering optimizations.
 

3DVagabond

Lifer
Aug 10, 2009
You call that telling him off? :\



And? The point I was making is that compared to other review sites that conducted benchmarks on Dirt 3, HardOCP's results seem to be off by quite a margin...

And then you reply by saying the discrepancies are due to IQ settings? As far as I know, aside from the excessive AA, TechSpot used max quality settings in their review, so the settings alone shouldn't be able to account for the discrepancies.

And I'm talking about their APPLES TO APPLES performance, not highest playable, in case you didn't realize.



I doubt HardOCP themselves would knowingly exploit driver cheats to make AMD look good, but it's a known fact that AMD's filtering optimizations can add as much as a 10% increase in performance in some situations, so I'm wondering if the reported performance differences between HardOCP and the other websites may be due to them using the default settings in the driver panel, which use the aggressive filtering optimizations.



I don't care enough to look for, much less provide evidence that AMD is cheating in Dirt 3.

I'm merely postulating, based on known facts concerning AMD's use of filtering optimizations.

Until you show there are in fact optimizations lowering quality in the drivers that were used, you're jumping to conclusions. Although I'm sure that if one site turned off game optimizations, that would make a difference for any card, regardless of vendor.
 

thilanliyan

Lifer
Jun 21, 2005
And? The point I was making is that compared to other review sites that conducted benchmarks on Dirt 3, HardOCP's results seem to be off by quite a margin...


Could it be because HardOCP is using their own custom game run-through, whereas other sites are using the built-in demo?
 

wirednuts

Diamond Member
Jan 26, 2007
I want to see a review of Dirt 3 using older hardware like the 4870 and 4850. What do you think my chances are? The old Powercolor Radeon HD 4870 PCS 1GB just eats Duke Nukem for breakfast lol ;)

the 4870 still eats almost any game for breakfast at 1080p. i just tried dirt3 on my i3 3ghz machine, with a 1gb 4870 and only 2gb of ram. runs perfectly at max settings at 1080p.
 

RussianSensation

Elite Member
Sep 5, 2003
the 4870 still eats almost any game for breakfast at 1080p. i just tried dirt3 on my i3 3ghz machine, with a 1gb 4870 and only 2gb of ram. runs perfectly at max settings at 1080p.

Right now most games with DX11 effects don't look that much better vs. their DX9/DX10 codepath, yet most have an unpleasant "DX11 performance hit". If you can live with somewhat harsher shadows in games, keep AA to 2-4x and forego Tessellation, then 4870/4890/GTX260/275 GPUs are plenty fast for 1080P. Obviously, if you like to use more advanced AA filters, high resolution texture packs/mods, then you'd want a faster card. There are some intensive games like Metro 2033, AvP, Stalker:COP, Dragon Age II and Witcher 2 that will push modern cards if you crank settings high enough.

To give you an idea of DX11 performance hit, when I switched from 4890 to a GTX470 to play Dirt 2, all that performance advantage was eroded by switching from DX9 to DX11 (but I could hardly notice a difference in actual graphics). I played Dirt 2 at 1920x1080 with everything maxed (with Post Processing to Medium) at 4AA/16AF on my HD4890. My GTX470/6970 can do 8AA/16AF with similar framerates in DX11.

Looks like Dirt 3 will follow a similar performance trend to its predecessor with regard to DX11 performance hit:

GTX275 DX9 ~ GTX570 DX11
HD4890 DX9 ~ HD6970 DX11
HD4870 DX9 ~ HD6950 DX11

^ So you get a 1 generation equivalent performance hit by going with DX11 in Dirt 3. Is the graphical improvement worth it? Probably not worth a $300 graphics card upgrade for this game.

[Dirt 3 benchmark charts]


Source: GameGPU.ru

Hindsight being 20/20: some on our forum doubted whether the HD5770 would be fast enough in DX11 by the time DX11 games arrived, and purchased an HD4870/4890/GTX260 instead. Looks like we can put those "doubts" to rest:

1080P
5770 = 28 fps
GTX260 = 46 fps
4870 = 51 fps
4890 = 56 fps (2x faster in DX9)

Pretty much if you want to game in DX11, you'd want a $200+ DX11 card, no less. Even HD6870/GTX470 cards can only achieve 45 fps in DX11....
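Putting the quoted 1080P figures side by side (a quick sanity check; the pre-DX11 cards are running the DX9 path, and these are the approximate numbers cited above):

Code:
# Sanity check on the 1080P figures quoted above (GameGPU.ru).
# GTX260/4870/4890 are DX10-class cards running the DX9 path;
# the HD5770 is running DX11.
fps_1080p = {
    "HD5770 (DX11)": 28,
    "GTX260 (DX9)": 46,
    "HD4870 (DX9)": 51,
    "HD4890 (DX9)": 56,
}

baseline = fps_1080p["HD5770 (DX11)"]
for card, fps in fps_1080p.items():
    print(f"{card}: {fps} fps = {fps / baseline:.2f}x the 5770, "
          f"{fps / 60:.0%} of a 60 fps target")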
 

Dribble

Platinum Member
Aug 9, 2005
Just for general information, [H] do not change the driver settings - so they do use the lower AMD quality options vs the higher Nvidia ones. It took them a while to come out and say it, but I do remember a thread a while back, in the middle of that IQ argument, where they finally admitted it.

This is part of the reason why they tend to get different scores from, say, xbitlabs, which always turns the optimisations off.

The other is [H]'s obsession with really high resolutions and AA levels (which they manage by being happy with comparatively low fps); it skews their figures towards cards with more memory and good handling of very high levels of AA (i.e. 8x or more).

In the real world, most people using the cards reviewed don't have such big monitors and do want a lot higher fps. Views like that aren't appreciated at [H], however.
 

GaiaHunter

Diamond Member
Jul 13, 2008
In the real world, most people using the cards reviewed don't have such big monitors and do want a lot higher fps. Views like that aren't appreciated at [H], however.

You could also say most people who play PC games don't even care about the >$150 cards being reviewed.

Additionally, those lower frame rates also come from stressful in-game areas, so overall game performance will be higher.
 

Carfax83

Diamond Member
Nov 1, 2010
Until you show there are in fact optimizations lowering quality in the drivers that were used, you're jumping to conclusions. Although I'm sure that if one site turned off game optimizations, that would make a difference for any card, regardless of vendor.

Well, like I said above, I was merely postulating. But I will say that if HardOCP uses the standard settings in the driver panel, then the optimizations are indeed lowering quality in favor of performance, because that's what the optimizations do in this case, i.e. they lower texture filtering quality.

Now whether they are noticeable to the naked eye is another matter entirely, but they should be detectable with testing software.

Anyway, it's not so much the image quality but the potential performance increase that matters. Nvidia's default settings do not use texture filtering optimizations like AMD's, so benchmarking with AMD's optimizations turned on isn't exactly fair to Nvidia... especially since Nvidia does have a texture filtering optimization setting, but it's DISABLED by default.

That's what the commotion was about, really: Nvidia can gain up to 12% in benchmarks with the same optimization, but since it's disabled by default, it makes AMD look better than they really are.
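To put rough numbers on why that matters in close benchmarks, here's an illustrative sketch - the fps values are invented; only the ~10-12% figure comes from the filtering discussion above:

Code:
# Illustrative only: how a ~10% default-on filtering optimization can
# reorder a close benchmark. These fps values are made up, not measured.
nvidia_default = 60.0  # hypothetical score; Nvidia's optimization is off by default
amd_opts_off = 57.0    # hypothetical AMD score with optimizations disabled
gain = 0.10            # ~10% gain attributed to AMD's default filtering optimizations

amd_default = amd_opts_off * (1 + gain)  # what a default-settings review records
print(f"AMD default: {amd_default:.1f} fps vs Nvidia default: {nvidia_default:.1f} fps")
# A 5% deficit with optimizations off becomes a ~4.5% lead at defaults -
# enough to flip the ordering on a chart.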

Could it be because HardOCP is using their own custom game run-through, whereas other sites are using the built-in demo?

I'm sure this has an impact as well, but I don't think it explains the discrepancy entirely.

Look at the GameGPU benches for instance. Nvidia is ahead by a considerable margin, and I don't think Nvidia has even released optimized drivers for Dirt 3 yet.

Just for general information, [H] do not change the driver settings - so they do use the lower AMD quality options vs the higher Nvidia ones. It took them a while to come out and say it, but I do remember a thread a while back, in the middle of that IQ argument, where they finally admitted it.

This is part of the reason why they tend to get different scores from, say, xbitlabs, which always turns the optimisations off.

The other is [H]'s obsession with really high resolutions and AA levels (which they manage by being happy with comparatively low fps); it skews their figures towards cards with more memory and good handling of very high levels of AA (i.e. 8x or more).

In the real world, most people using the cards reviewed don't have such big monitors and do want a lot higher fps. Views like that aren't appreciated at [H], however.

Ah, thanks for confirming what I was thinking...
 

Dribble

Platinum Member
Aug 9, 2005
You could also say most people who play PC games don't even care about the >$150 cards being reviewed.

Additionally, those lower frame rates also come from stressful in-game areas, so overall game performance will be higher.
It's more the difference between playing a game just because it looks pretty vs playing a game because you want to win. [H] go very much for pretty. Particularly for online multiplayer games, people go much more for the win. Hence it's not that people won't turn the settings up if they can, but they'd prefer lower settings if that gives them significantly higher fps and hence more chance of hitting that guy in the distance.

While none of us wants to play ugly-looking games (i.e. settings on low), we can live without AA, occlusion mapping, or the highest level of shadows if, by doing that, our K:D ratio improves and our names are knocked several places up the score charts.
 

WMD

Senior member
Apr 13, 2011
Right now most games with DX11 effects don't look that much better vs. their DX9/DX10 codepath, yet most have an unpleasant "DX11 performance hit". If you can live with somewhat harsher shadows in games, keep AA to 2-4x and forego Tessellation, then 4870/4890/GTX260/275 GPUs are plenty fast for 1080P. Obviously, if you like to use more advanced AA filters, high resolution texture packs/mods, then you'd want a faster card. There are some intensive games like Metro 2033, AvP, Stalker:COP, Dragon Age II and Witcher 2 that will push modern cards if you crank settings high enough.

To give you an idea of DX11 performance hit, when I switched from 4890 to a GTX470 to play Dirt 2, all that performance advantage was eroded by switching from DX9 to DX11 (but I could hardly notice a difference in actual graphics). I played Dirt 2 at 1920x1080 with everything maxed (with Post Processing to Medium) at 4AA/16AF on my HD4890. My GTX470/6970 can do 8AA/16AF with similar framerates in DX11.

Looks like Dirt 3 will follow a similar performance trend to its predecessor with regard to DX11 performance hit:

GTX275 DX9 ~ GTX570 DX11
HD4890 DX9 ~ HD6970 DX11
HD4870 DX9 ~ HD6950 DX11

^ So you get a 1 generation equivalent performance hit by going with DX11 in Dirt 3. Is the graphical improvement worth it? Probably not worth a $300 graphics card upgrade for this game.

[Dirt 3 benchmark charts]


Source: GameGPU.ru

Hindsight being 20/20: some on our forum doubted whether the HD5770 would be fast enough in DX11 by the time DX11 games arrived, and purchased an HD4870/4890/GTX260 instead. Looks like we can put those "doubts" to rest:

1080P
5770 = 28 fps
GTX260 = 46 fps
4870 = 51 fps
4890 = 56 fps (2x faster in DX9)

Pretty much if you want to game in DX11, you'd want a $200+ DX11 card, no less. Even HD6870/GTX470 cards can only achieve 45 fps in DX11....

Agree. DX9 to DX11 is hardly noticeable in most games unless they deliberately disable stuff that DX9 is capable of, like ambient occlusion or anti-aliasing, to make DX11 stand out.

However, that chart doesn't look too accurate, with the 5870 beating the 6950 and the 5850 beating the 6870.
 

GaiaHunter

Diamond Member
Jul 13, 2008
It's more the difference between playing a game just because it looks pretty vs playing a game because you want to win. [H] go very much for pretty. Particularly for online multiplayer games, people go much more for the win. Hence it's not that people won't turn the settings up if they can, but they'd prefer lower settings if that gives them significantly higher fps and hence more chance of hitting that guy in the distance.

While none of us wants to play ugly-looking games (i.e. settings on low), we can live without AA, occlusion mapping, or the highest level of shadows if, by doing that, our K:D ratio improves and our names are knocked several places up the score charts.

That depends on the game, doesn't it? And on the person (for example, I wasn't bothered one bit when AMD didn't have AA for SC2, while others were).

What is the point of having all the reviews look alike?

If [H]'s benchmarking procedure doesn't satisfy a reader, there are other sites that will, so I don't see the obsession people have against sites that don't follow the standard. As long as the procedure is transparent and the results aren't being doctored (and I doubt anyone in their right mind believes [H] is falsifying results), the results are valid for the parameters tested.

Also, let's not forget [H] actually does IQ comparisons between AMD and NVIDIA for the games they use to bench.

For example, in the game at hand, Dirt 3, other than at the 2560x1600 resolution, the cards have minimums higher than 44 FPS.

Drop a setting here or there and all the cards will be around 50~60 fps average.
 

zebrax2

Senior member
Nov 18, 2007
Right now most games with DX11 effects don't look that much better vs. their DX9/DX10 codepath, yet most have an unpleasant "DX11 performance hit". If you can live with somewhat harsher shadows in games, keep AA to 2-4x and forego Tessellation, then 4870/4890/GTX260/275 GPUs are plenty fast for 1080P. Obviously, if you like to use more advanced AA filters, high resolution texture packs/mods, then you'd want a faster card. There are some intensive games like Metro 2033, AvP, Stalker:COP, Dragon Age II and Witcher 2 that will push modern cards if you crank settings high enough.

To give you an idea of DX11 performance hit, when I switched from 4890 to a GTX470 to play Dirt 2, all that performance advantage was eroded by switching from DX9 to DX11 (but I could hardly notice a difference in actual graphics). I played Dirt 2 at 1920x1080 with everything maxed (with Post Processing to Medium) at 4AA/16AF on my HD4890. My GTX470/6970 can do 8AA/16AF with similar framerates in DX11.

Looks like Dirt 3 will follow a similar performance trend to its predecessor with regard to DX11 performance hit:

GTX275 DX9 ~ GTX570 DX11
HD4890 DX9 ~ HD6970 DX11
HD4870 DX9 ~ HD6950 DX11

^ So you get a 1 generation equivalent performance hit by going with DX11 in Dirt 3. Is the graphical improvement worth it? Probably not worth a $300 graphics card upgrade for this game.

[Dirt 3 benchmark charts]


Source: GameGPU.ru

Hindsight being 20/20: some on our forum doubted whether the HD5770 would be fast enough in DX11 by the time DX11 games arrived, and purchased an HD4870/4890/GTX260 instead. Looks like we can put those "doubts" to rest:

1080P
5770 = 28 fps
GTX260 = 46 fps
4870 = 51 fps
4890 = 56 fps (2x faster in DX9)

Pretty much if you want to game in DX11, you'd want a $200+ DX11 card, no less. Even HD6870/GTX470 cards can only achieve 45 fps in DX11....

Why are there 5 different types of processor in there, as well as 3 different motherboards?
 

Dribble

Platinum Member
Aug 9, 2005
For example, in the game at hand, Dirt 3, other than at the 2560x1600 resolution, the cards have minimums higher than 44 FPS.

Drop a setting here or there and all the cards will be around 50~60 fps average.

The 6950 has a min of 37 and an average of 40 at 1920p, and that's only that high because the game is maxed out. If they could have applied 48xAA, you can bet [H] would have, and that min would have been hovering around 30 fps.

Even at that higher-than-normal fps for [H], the problem is that an average of 40 fps just isn't enough. Sure, it's perfectly playable, but I have trouble understanding how you can be so sensitive to visuals that you're complaining Nvidia's 8xAA just isn't good enough, yet seem unable to perceive the lack of smoothness of 40 fps vs 60 or 120, or be aware of the gaming advantage high fps gives you when trying to play the game seriously.
 

GaiaHunter

Diamond Member
Jul 13, 2008
The 6950 has a min of 37 and an average of 40 at 1920p, and that's only that high because the game is maxed out. If they could have applied 48xAA, you can bet [H] would have, and that min would have been hovering around 30 fps.

Check the apples to apples section.

[HardOCP apples-to-apples chart for Dirt 3]


Even at that higher-than-normal fps for [H], the problem is that an average of 40 fps just isn't enough. Sure, it's perfectly playable, but I have trouble understanding how you can be so sensitive to visuals that you're complaining Nvidia's 8xAA just isn't good enough, yet seem unable to perceive the lack of smoothness of 40 fps vs 60 or 120, or be aware of the gaming advantage high fps gives you when trying to play the game seriously.

Maybe because [H] is a hardware enthusiast site instead of an e-sports enthusiast site?

For serious online gamers, the last thing that matters is IQ - they will just play at the lowest settings possible to remove any distractions. Actually, many will still be playing old games like SC:BW and CS 1.6 (instead of the better-looking CS:S). And have you seen the settings pros play SC2 at? It looks worse than Brood War.

Additionally, [H] doesn't say 8xAA is bad - they just show the highest settings a given card can reach while still feeling playable to them.

Is it subjective? Yes it is, and they say so.

If people don't like their method they are free to ignore it.

So while I don't endorse the rudeness in some replies I see from [H] staff, I don't understand why people go there complaining about their method just because it is a different method or because they prefer higher fps at the expense of IQ.

It is like going to McDonald's and complaining that their Big Tasty isn't a Whopper.
 

RussianSensation

Elite Member
Sep 5, 2003
Why are there 5 different type of processor in there as well as 3 different motherboard?

This website pairs slower GPUs with slower CPUs and faster GPUs with faster CPUs. Their view is that if you have a mid-range GPU, you probably aren't gaming on a fast Core i5/i7. While I don't entirely agree with their methodology, it shouldn't matter that much, as all their CPUs are quad-core and overclocked to the max, which virtually guarantees a GPU-limited scenario.

So in this case, low- to mid-range GPUs such as GTS450 and HD4870 were tested on Q9550 @ 4.25ghz and on the Phenom X4 940 @ 3.7ghz. Faster GPUs such as the 6950 were tested on Core i5 760 @ 4.2ghz and i7 @ 4.2ghz systems.

Even at that higher-than-normal fps for [H], the problem is that an average of 40 fps just isn't enough. Sure, it's perfectly playable, but I have trouble understanding how you can be so sensitive to visuals that you're complaining Nvidia's 8xAA just isn't good enough, yet seem unable to perceive the lack of smoothness of 40 fps vs 60 or 120, or be aware of the gaming advantage high fps gives you when trying to play the game seriously.

That has been my main problem with [H]. While they try to do a proper real world gaming comparison, Kyle fails to understand that a lot of people want 60 fps in racing and FPS games, yet he cranks everything to the max and tests BF:BC2 at 35 fps at 2560x1600 with 4AA. His reviews are great if you want to see how your card performs once pushed to the absolute max (i.e., Do you really need 2GB of ram at 2560x1600 4AA, or will a GTX580 provide a much better gaming experience than an HD6970 with everything maxed?), but for real world gaming, I don't pay much attention to his reviews since his definition of playable framerates isn't aligned with mine. It's good to have more data points though.
 

GaiaHunter

Diamond Member
Jul 13, 2008
That has been my main problem with [H]. While they try to do a proper real world gaming comparison, Kyle fails to understand that a lot of people want 60 fps in racing and FPS games, yet he cranks everything to the max and tests BF:BC2 at 35 fps at 2560x1600 with 4AA.

Or maybe he just doesn't care?

It isn't like there are no other review sites - I estimate about 124231423534 sites that review 100 resolutions with no more than 4xAA.
 

3DVagabond

Lifer
Aug 10, 2009
Or maybe he just doesn't care?

It isn't like there are no other review sites - I estimate about 124231423534 sites that review 100 resolutions with no more than 4xAA.

^^This.

How many sites do we really need reviewing cards at 1080p w/4xAA on an O/C'd i7 system before we know the performance of that setup? [H] gives us another perspective. A perspective that, for the most part, is unique. Additional info above and beyond what the mass of other sites give us.

FYI: There are sites that use AM3 too. Just for those people who might have them at home and would like to see how different cards might perform on their systems. ;)
 

zod96

Platinum Member
May 28, 2007
I love to see the 5870 in there. That card still kicks some serious butt, and it's about 1.5 years old...
 

SlowSpyder

Lifer
Jan 12, 2005
the 4870 still eats almost any game for breakfast at 1080p. i just tried dirt3 on my i3 3ghz machine, with a 1gb 4870 and only 2gb of ram. runs perfectly at max settings at 1080p.


From what I've seen, Radeons tend to age better than their Nvidia counterparts. I would be curious to see how a GTX260/275/280 performs in this game. Maybe they'll match the 4870/4890, but from the reviews I see, in general, Radeons seem to perform better than Nvidia parts once past their generation.