[ht4u.net] 55 DirectX 11 graphics cards in the test

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
55 graphics cards, 18 game titles and four GPGPU benchmarks – those are the key figures of today's mega-test.
From an aging GeForce GTX 460 to a brand-new GeForce GTX 760, practically everything is represented in this comparison, and the price range of the competing models runs from roughly €50 up to €900. Which card wins where is answered by our huge roundup.

About 20 games tested plus a whole suite of GPGPU tests, well worth checking out.

For those who just want the final results:

[Image: V7fVjtl.jpg]

[Image: 4bvb.png]


Performance index GPGPU computing
[Image: eg9c.png]


http://www.microsofttranslator.com/...views/2013/55_directx11_grafikkarten_im_test/

http://translate.google.com/transla...views/2013/55_directx11_grafikkarten_im_test/

Game benchmarks (OpenGL)
BRINK

Game benchmarks (DirectX 9)
Alan Wake
Risen 2: Dark Waters
The Elder Scrolls V: Skyrim
Serious Sam 3: BFE
The Witcher 2: Assassins of Kings

Game benchmarks (DirectX 11)
Anno 2070
Assassin's Creed III
Battlefield 3
BioShock Infinite
Crysis 3
Far Cry 3
DiRT Showdown
Hitman: Absolution

Max Payne 3
Metro: Last light
Sleeping dogs
Tomb Raider (2013)

GPGPU benchmarks (OpenCL / CUDA)
Adobe Photoshop CS6
CLBenchmark 1.1.3
LuxMark 2.0
oclHashcat-lite
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Results seem suspect; some of the tests have a bit too much variance.

Respectable effort, too much of an undertaking though.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
My new card crushes my old card :D Well worth skipping the HD 6K series.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
How on Earth is the 7970 GHz beating the Titan in Bioshock Infinite?

I'm wondering that too. The author of that article told me on another forum that this is not the ingame benchmark but one of the most demanding spots in the game itself. I still kinda doubt the results since every single benchmark on the web says BI runs way better on Nvidia. Question is - do they all bench the internal benchmark? Probably. Anyways, he also told me that this scene is not really representative for the vast majority of the game, so take it with a grain of salt.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
AMD GE?

Look at CoH2, runs horrid on NV GPUs.

[Image: 55171.png]


Other websites tell a different story. That's why it's important to have credible sources when discussing benchmarks. That's not to say that ht4u.net is not credible, though, as their benchmarks don't seem particularly fishy to me other than Bioshock Infinite. I'm sure there's a valid explanation for this.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
I'm wondering that too. The author of that article told me on another forum that this is not the ingame benchmark but one of the most demanding spots in the game itself. I still kinda doubt the results since every single benchmark on the web says BI runs way better on Nvidia. Question is - do they all bench the internal benchmark? Probably. Anyways, he also told me that this scene is not really representative for the vast majority of the game, so take it with a grain of salt.

I'd really like to know what scene he was referring to. Did he tell you by chance?
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
No, but I can ask. Just did.
That's what I like about PCGH: they not only document their methodology precisely, upload videos of their benchmark scenes and give instructions, but they also provide the respective savegames and files for download. This is unique afaik; I have never seen that anywhere else. Other sites should definitely copy this approach, too. What do I really know if an article just says "game xyz" was tested? Where? How? Is the scene very demanding compared to the rest of the game or not? Is it representative or rather rare?
 

DarkKnightDude

Senior member
Mar 10, 2011
981
44
91
Bioshock Infinite has some sort of background streaming that uses a lot of VRAM; that might explain the 7970's dominance, maybe? Dunno.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I still kinda doubt the results since every single benchmark on the web says BI runs way better on Nvidia.

Not really, unless you start comparing cards that don't compete with each other on price. The GTX 680/770/7970 GE are more or less tied in this title in the internal bench.

[Image: Bio.png]


[Image: bioshock_2560_1600.gif]
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
Not really, unless you start comparing cards that don't compete with each other on price.

[Image: bioshock_2560_1600.gif]

You (intentionally?) misunderstood. At HT4U the 7970 GE is faster than the Titan; at TPU the Titan is 25+% faster. This discussion has nothing to do with price but with discrepancies between benchmark results from different reviews involving the same cards. This can only be explained by

a) different scenes
b) different drivers
c) different approach to turbo benchmarking

HT4U fixes Nvidia cards at their average boost clock and skips AMD's boost as well. TPU does neither. And I guess a) also applies, since I doubt TPU uses savegames everywhere.
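To put a rough number on option c): Titan's advertised typical boost is 876 MHz, while a well-cooled sample can sit around 1006 MHz. The 876 MHz figure is Nvidia's spec, the linear fps-vs-clock scaling is my assumption, and I'm not claiming either site ran at exactly these clocks, but clock handling alone can already account for a double-digit gap in GPU-limited scenes:

```python
# Back-of-envelope only: assumes fps scales roughly linearly with core clock,
# which is only an approximation and only holds in GPU-limited scenes.
fixed_clock = 876    # MHz - Titan's advertised typical boost (a HT4U-style fixed clock)
free_boost  = 1006   # MHz - sustained boost of a well-cooled Titan sample
print(f"clock advantage: {(free_boost / fixed_clock - 1) * 100:.1f}%")  # ~14.8%
```

Add a different benchmark scene and a different driver on top of that, and a 25% swing between reviews stops looking mysterious.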
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
You (intentionally?) misunderstood. At HT4U the 7970 GE is faster than the Titan; at TPU the Titan is 25+% faster.

I was only responding to your statement that BI runs way faster on NV cards. It doesn't, unless you start comparing a $400 card to a $650 one. The point about a discrepancy in their scores vs. other sites is noted, but you already said the author replied to you explaining that he didn't use the internal benchmark, which means the HT4U scores in that game cannot be compared to any other site's unless the exact same scene was tested.

Remove that game as an outlier if you want. The HD 7970 GE is still an amazing value even though the 7970 will soon be almost two years old. It outperforms the 680 at 1440p by nearly 15%, and almost any 7970 could reach those clocks as early as January 2012.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
Yeah, and I obviously meant compared to the HT4U review, since that was the topic. Maybe observe the context next time.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
HT4U fixes Nvidia cards at their average boost clock and skips AMD's boost as well.

Why would they do that? That would significantly cripple the NVidia cards, as the NVidia cards rely much more on boost for extra performance than AMD cards.

How much boost do the AMD cards have? An extra 50 MHz? NVidia cards will boost by well over 100 MHz on average (with GPU Boost 2.0), so artificially limiting their boost capability would not be fair imo.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
Why would they do that? That would significantly cripple the NVidia cards, as the NVidia cards rely much more on boost for extra performance than AMD cards.

How much boost do the AMD cards have? An extra 50 MHz? NVidia cards will boost by well over 100 MHz on average (with GPU Boost 2.0), so artificially limiting their boost capability would not be fair imo.

Because with the default temperature target of 80°C and the relatively slow spinning fan, Titan and to a lesser extent the 780 cannot keep up boost clocks under extended load. At least not in every case/ambient temp/cooling configuration. Short benchmarks allow the cards to cool down in between, thus higher boost will be observed. Thus, at default, Nvidia cards might perform worse during gaming than during benchmarking.

I don't think it's exactly cheating since the downside of a higher boost is significantly higher power consumption and noise. And after all, Nvidia advertises the power and temperature target sliders as adjustable. The whole situation has gotten pretty complicated with the introduction of temperature-dependent boost...results may vary by as much as 20% for Titan (1006 MHz vs 836 MHz) depending on ambient circumstances and boost settings (temp target at default vs temp target at 94°C).
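To make that temperature-target behaviour a bit more tangible, here's a toy simulation (illustration only: the thermal constants are invented and this is not Nvidia's actual Boost 2.0 control loop). The card starts at its top boost bin and steps down towards base clock whenever the simulated GPU temperature sits above the target, which is also why short benchmark runs with cool-down pauses in between can report higher clocks than an hour of continuous gaming:

```python
# Toy model of temperature-dependent boost (illustration only; the constants are
# invented and this is NOT Nvidia's real Boost 2.0 algorithm).

BASE_CLOCK = 836    # MHz, roughly Titan's base clock
MAX_BOOST  = 1006   # MHz, top boost bin of a good sample
BIN_STEP   = 13     # MHz per boost bin
AMBIENT    = 25.0   # deg C room temperature

def settle(temp_target=80.0, heat_per_mhz=0.005, cooling=0.08, seconds=3600):
    """Step the clock down one bin whenever the simulated temperature exceeds the
    target, step it back up when there is clear headroom; return the final state."""
    clock, temp = MAX_BOOST, AMBIENT
    for _ in range(seconds):
        # crude thermal model: heating scales with clock, cooling with delta to ambient
        temp += heat_per_mhz * clock - cooling * (temp - AMBIENT)
        if temp > temp_target and clock > BASE_CLOCK:
            clock -= BIN_STEP          # throttle one bin
        elif temp < temp_target - 2.0 and clock < MAX_BOOST:
            clock += BIN_STEP          # recover one bin
    return clock, temp

for target in (80.0, 94.0):
    clock, temp = settle(temp_target=target)
    print(f"temp target {target:.0f} C -> settles around {clock} MHz at {temp:.1f} C")
```

With the default 80°C target the toy card ends up hovering well below its top bin, while the 94°C target lets it hold maximum boost; the exact numbers are meaningless, but the direction is the same thing I saw on my own Titans.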

I had my Titans running with the stock cooler and settings for a couple of days. Clocks dropped often below 900 MHz, and that was in an open case! Only with watercooling was I able to sustain 1006 MHz on both cards indefinitely. Not even maximizing the fan and setting the temperature target to 94°C could do that, although I believe that was a bug with the fan control since the second the fan sped up, the clocks went down. That's why many custom 780 models outperform the Titan - their cooling is better, keeping the GPU well below 80°C, thus allowing for maximum boost all the time.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Because with the default temperature target of 80°C and the relatively slow spinning fan, Titan and to a lesser extent the 780 cannot keep up boost clocks under extended load. At least not in every case/ambient temp/cooling configuration. Short benchmarks allow the cards to cool down in between, thus higher boost will be observed. Thus, at default, Nvidia cards might perform worse during gaming than during benchmarking.

It's the reviewer's responsibility to make sure that every video card has adequate cooling. I can't imagine it would be difficult to do that given that they were testing in single card configuration. As long as you have proper case ventilation and you're in a decently cooled environment, sustaining boost clocks should be easy with single graphics cards.

Heck, my GTX 770 4GB cards are in SLI, and I have wires everywhere in my case because I have 9 case fans, but my cards are kept cool enough to hit a 1241 MHz boost clock and sustain it without messing with any settings.

I understand how frustrating the boost feature must be for reviewers if they want to make sure everything is fair, but I don't think cutting it out is the solution, because it is an important performance feature for NVidia cards.

I had my Titans running with the stock cooler and settings for a couple of days. Clocks dropped often below 900 MHz, and that was in an open case! Only with watercooling was I able to sustain 1006 MHz on both cards indefinitely. Not even maximizing the fan and setting the temperature target to 94°C could do that, although I believe that was a bug with the fan control since the second the fan sped up, the clocks went down. That's why many custom 780 models outperform the Titan - their cooling is better, keeping the GPU well below 80°C, thus allowing for maximum boost all the time.

Titan has a lower power envelope than the GTX 780 as well, doesn't it?
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
It's the reviewer's responsibility to make sure that every video card has adequate cooling. I can't imagine it would be difficult to do that given that they were testing in single card configuration. As long as you have proper case ventilation, sustaining boost clocks should be easy with single graphics cards.

I disagree. HT4U observed base clocks under high sustained GPU load in a closed case. Computerbase also observed a 10% average difference between default and max power/temp target, as does PCGH. I don't know if that is realistic for an average system or not, but it's certainly much easier to hit 80°C in hot countries or during the summer, which would adversely affect performance.

If we continue this path, does the reviewer have to cater to every weakness of the cards involved? I think both settings should be tested, default and max targets, both under realistic conditions. That way every possible case is covered.

Heck, my GTX 770 4GB cards are in SLI, and I have wires everywhere in my case because I have 9 case fans, but my cards are kept cool enough to hit a 1241 MHz boost clock and sustain it without messing with any settings.

Well, not everyone has 9 case fans. Then it's a question of whether they are temperature-controlled or not, whether your case is insulated or not, and whether you have lots of other heat sources (a highly overclocked CPU).

I understand how frustrating the boost feature must be for reviewers if they want to make sure everything is fair, but I don't think cutting it out is the solution, because it is an important performance feature for NVidia cards.

How does Anandtech benchmark? Open bench table? How long does the average benchmark last and how long until the next one is started? Time to cool down in between. There certainly can be a discrepancy between benchmarking and gaming on some systems when it comes to Boost (2.0). Most reviewers don't consider this at all! I blame Nvidia for setting their temperature target too low and/or for having such a conservative fan curve and/or a crappy cooler.

I wouldn't cut Boost completely out of the picture, but I would test both - default and max target - in a realistic scenario. And that means heat this baby up, so that realistic temperatures apply just like they would during normal gaming.
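For anyone who wants to see how their own card behaves under that kind of sustained load, a little logger along these lines does the job (just a sketch; it assumes nvidia-smi is on the PATH and that your driver exposes the temperature.gpu and clocks.sm query fields). Run it in a second window while a game or benchmark keeps the GPU busy:

```python
#!/usr/bin/env python3
"""Log GPU temperature and SM clock once per second via nvidia-smi, so you can
see whether boost clocks are actually sustained during a long gaming session.
Assumes nvidia-smi is installed and supports --query-gpu (recent drivers do)."""
import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=timestamp,temperature.gpu,clocks.sm",
         "--format=csv,noheader,nounits"]

def sample():
    # One CSV line per GPU, e.g. "2013/08/01 20:15:03.123, 79, 888"
    return subprocess.check_output(QUERY, universal_newlines=True).strip().splitlines()

if __name__ == "__main__":
    with open("boost_log.csv", "w") as log:
        log.write("timestamp, temperature_c, sm_clock_mhz\n")
        try:
            while True:
                for line in sample():
                    log.write(line + "\n")
                log.flush()
                time.sleep(1)
        except KeyboardInterrupt:
            pass  # stop logging with Ctrl+C
```

If the logged SM clock has sunk to or near base clock after 20-30 minutes, the card is temperature-throttling exactly as described above.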

Titan has a lower power envelope than the GTX 780 as well, doesn't it?

I believe so, yes. 250W for both at 100% and 265W at 106%.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I'm wondering that too. The author of that article told me on another forum that this is not the ingame benchmark but one of the most demanding spots in the game itself. I still kinda doubt the results since every single benchmark on the web says BI runs way better on Nvidia. Question is - do they all bench the internal benchmark? Probably. Anyways, he also told me that this scene is not really representative for the vast majority of the game, so take it with a grain of salt.

Why would any reviewer do a review that wasn't indicative of the actual performance, and then tell you to take his review with a grain of salt? Basically saying to ignore his review?