[ht4u.net] 55 DirectX 11 graphics cards in the test


RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Yeah and I obviously meant compared to the HT4U review, since that was the topic. Maybe observe the context next time.

Before being condescending, you might want to re-read your own statements and my responses to the statement you made. That's context.

I still kinda doubt the results since every single benchmark on the web says BI runs way better on Nvidia.

The statement you made does not strictly refer to HT4U's testing of BI. While it is true that HT4U's results for this title are an outlier from the rest of the net, my reply was specifically to the incorrect statement you made.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
Why would any reviewer do a review that wasn't indicative of the actual performance, and then tell you to take his review with a grain of salt? Basically saying to ignore his review?

I agree insofar as demanding scenes matter more than less demanding ones. If a card performs fast enough in a demanding scene, it will perform well enough in the whole game. IIRC, he said this scene is representative of 15-20% of the game.
Many people think highly of minimum fps. That's exactly the same concept.

@RS:
I know of no benchmark except HT4U where the 7970 GE is faster than Titan. Not one. You just wanted to take the opportunity to slip a little price/perf discussion in here, but that is OT and you know it. That is why I was condescending, because I hate you bringing your agenda in every thread at every opportunity.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
That's why many custom 780 models outperform the Titan - their cooling is better, keeping the GPU well below 80°C, thus allowing for maximum boost all the time.

Exactly. Everyone who was super impressed by the Titan's beautiful cooler forgot that Boost is impacted by GPU temperature. This is where after-market dual-slot air-cooling solutions are superior to blower-fan designs.

It's the reviewer's responsibility to make sure that every video card has adequate cooling.

No, it is the responsibility of NV/AMD and their AIBs. If the GTX 780/Titan's stock cooler cannot cope with the demands of Boost 2.0, then reviews should report those findings. Reviewers often report that while the cards can sustain a 1GHz boost at first, over extended gaming they overheat and the 780/Titan cannot hold those same levels over time.

This is why people buy after-market 780s or go water.
http://www.computerbase.de/artikel/...it-geforce-gtx-780-super-jetstream-im-test/2/
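To put rough numbers on the throttling, here's a toy model in Python (a minimal sketch; the clocks, thresholds and linear ramp are my own illustrative assumptions, not NVIDIA's actual Boost 2.0 algorithm):

# Toy model of temperature-driven Boost throttling (illustrative only;
# not NVIDIA's actual Boost 2.0 algorithm).

BASE_CLOCK = 863     # MHz, GTX 780 base clock
MAX_BOOST = 1006     # MHz, a typical observed boost clock
TEMP_TARGET = 80     # deg C, Boost 2.0 default temperature target
RAMP_START = 70      # deg C, assumed point where clock bins start dropping

def sustained_clock(gpu_temp_c):
    """Return the sustained clock (MHz) at a given GPU temperature."""
    if gpu_temp_c <= RAMP_START:
        return MAX_BOOST                 # cool card holds maximum boost
    if gpu_temp_c >= TEMP_TARGET:
        return BASE_CLOCK                # at the target, back to base clock
    frac = (TEMP_TARGET - gpu_temp_c) / (TEMP_TARGET - RAMP_START)
    return BASE_CLOCK + frac * (MAX_BOOST - BASE_CLOCK)

for temp in (65, 72, 76, 79, 82):
    print(f"{temp} C -> {sustained_clock(temp):.0f} MHz")

An after-market cooler that keeps the GPU in the 60s stays on the flat part of that curve; a blower that lets it sit at 80°C lives at base clock.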

That is why I was condescending, because I hate you bringing your agenda in every thread at every opportunity.

I simply pointed out that the HD 7970 series has held up really well. The results of this review obviously differ from the rest of the net, since they average SSAA and 8xAA results together with no-AA. You seem very defensive whenever NV cards aren't shown in the best light in gaming benches, or when the Titan barely beats the 7970GE.
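To illustrate how that averaging can flip a ranking (a quick sketch; every fps number below is invented for the example, not taken from HT4U):

# How averaging across AA modes can reshuffle a ranking.
# Every fps number here is invented for illustration.

results = {
    #            no-AA  8xAA  SSAA
    "Card A": [   95,   70,   30],
    "Card B": [  100,   60,   20],
}

for card, fps in results.items():
    avg = sum(fps) / len(fps)
    print(f"{card}: {fps[0]} fps at no-AA, {avg:.1f} fps averaged")

# Card B leads at no-AA but loses the averaged ranking, because the
# average weighs SSAA as heavily as the settings people actually use.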

There are many other titles HT4U could have included where the 680 is not pulling its weight. So many people defended the 680 vs. the 7970GE over the last 12 months, and after the 770 came out the focus shifted to 770 vs. 7970GE. Looking back at 680 vs. 7970GE, when the 7970GE outperforms the 680, it does so by large margins; it's rare for the 680 to outperform the 7970GE by 30-35%.

 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
I simply pointed out that the HD 7970 series has held up really well. The results of this review obviously differ from the rest of the net, since they average SSAA and 8xAA results together with no-AA. You seem to have a problem whenever NV cards aren't shown in the best light in gaming benches.

Thanks for editing your post and striking parts before I can respond...
You did in fact bring price/performance into this discussion:

I was only responding to your statement that BI runs way faster on NV cards. It doesn't, unless you start comparing a $400 card to a $650 one.

I have no problem with NV not being shown in the best light. I have a problem with people who threadcrap with OT comments like you did. And when this leads to complaints, you indirectly label the posters fanboys. I will have none of that. Stay on topic. And btw... if I had the problem you speak of, would I have pointed out the things I did? Certainly not.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Why would any reviewer do a review that wasn't indicative of the actual performance, and then tell you to take his review with a grain of salt? Basically saying to ignore his review?

There are more problems here too, like 1GB cards and their 2GB brothers not having consistent results across the board.

Like I said, I respect the effort but the results just aren't consistent enough to take seriously.

I applaud them for the effort, but if you're going to run non-canned benchmarks, you need to be capable of reproducing the runs exactly each and every time.

It doesn't seem like that was the case here.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
There are more problems here too, like 1GB cards and their 2GB brothers not having consistent results across the board.

Like I said, I respect the effort but the results just aren't consistent enough to take seriously.

I applaud them for the effort, but if you're going to run non-canned benchmarks, you need to be capable of reproducing the runs exactly each and every time.

It doesn't seem like that was the case here.

Does your experience stay the same throughout normal gameplay? You can make arguments for all types of benchmarking, but runs like this are far more indicative of true performance than in-game benchmarks, where both vendors optimize the crap out of their drivers to paint their GPUs in the best light.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
In the many SP tests they should be making it a point to produce mirror runs.

Just look at the 7950: 800MHz reference vs. the reference boost model at 850MHz base, 925MHz boost.

A 16% overclock on the 7950 results in a 1% performance gain, and it is often slower than the 800MHz card...
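As a quick sanity check on that arithmetic (treating the clock ratio as the upper bound on scaling):

# Sanity check: 7950 clock advantage vs. the gain HT4U observed.
ref_clock = 800     # MHz, reference 7950
boost_peak = 925    # MHz, boost model (850MHz base / 925MHz boost)

clock_advantage = boost_peak / ref_clock - 1
print(f"Clock advantage: {clock_advantage:.1%}")  # ~15.6%, the "16%"

observed_gain = 0.01                              # ~1% in these results
print(f"Observed gain: {observed_gain:.0%}")
# Anything close to clock-bound scaling should land well above 1%;
# matching or trailing the 800MHz card points to throttling or
# non-reproducible runs.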

No credibility, sorry.
 

Makaveli

Diamond Member
Feb 8, 2002
4,960
1,556
136
Bioshock Infinite has some sort of background streaming that uses a lot of VRAM; that might explain the 7970's dominance, maybe? Dunno.

Only problem with this is that the Titan has more VRAM than the GHz Edition.

6GB vs 3GB.

They may have just tested in an area of the game where the Radeon is faster; it happens. It doesn't mean the Radeon is faster overall in the game.
 
Feb 19, 2009
10,457
10
76
In the many SP tests they should be making it a point to produce mirror runs.

Just look at the 7950: 800MHz reference vs. the reference boost model at 850MHz base, 925MHz boost.

A 16% overclock on the 7950 results in a 1% performance gain, and it is often slower than the 800MHz card...

No credibility, sorry.

Balla, if you were around when AMD released the boost BIOS for the reference 7950, you would realize it did squat because the TDP skyrocketed due to the 1.25V vcore, meaning these boost cards actually throttled. It was stupid, requiring users to raise the power limit in CCC.

It wasn't until their partners released custom boost cards that performance between ref and boost models became differentiated.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Even with those cards, I never saw a review where they were slower than the reference 800MHz cards, until this one.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
The 5870 beat the GTX480 in compute? Wow. The 5870 really was a great card. And compared to the more costly options, the 7970 holds its own.
 

thilanliyan

Lifer
Jun 21, 2005
12,039
2,251
126
Thanks for editing your post and striking parts before I can respond...
You did in fact bring price/performance into this discussion:

While the statement about $400 and $650 cards could be construed as trying to go off topic, to me his response was directed at your "I still kinda doubt the results since every single benchmark on the web says BI runs way better on Nvidia." Your statement is incorrect based on the benchmarks linked in this thread, assuming you are talking about cards that actually compete with each other (i.e. obviously a GTX 770 will be faster than an AMD 7850, etc.), which is what he was trying to get at, I think.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
While the statement about $400 and $650 cards could be construed as trying to go off topic, to me his response was directed at your "I still kinda doubt the results since every single benchmark on the web says BI runs way better on Nvidia." Your statement is incorrect based on the benchmarks linked in this thread, assuming you are talking about cards that actually compete with each other (i.e. obviously a GTX 770 will be faster than an AMD 7850, etc.), which is what he was trying to get at, I think.

Well, then you guys assume wrong ;)
I made a general statement, thus involving the whole product spectrum of both IHVs. Why would I try to compare specific competitors when pricing depends on country, special deals, bundles, memory amount, etc.? Next time I'll be more specific in my wording, if I remember, so Russian has no pretext to go off topic again. Still, in the context of the posted results from HT4U, it was clear what I meant.
 

Peter HT4U

Junior Member
Jul 19, 2013
1
0
0
I'd really like to know what scene he was referring to. Did he tell you by chance?

Sorry, that's no secret - you can read it in the BI chapter of the article ;)

It's the Finkton Proper chapter, and the scene directly in the elevator after level loading. A screenshot is shown in the article too.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
I disagree. HT4U observed base clocks under high sustained GPU load in a closed case. Computerbase also observed a 10% average difference between default and max power/temp target, as does PCGH. I don't know if that is realistic for an average system or not, but certainly it's much easier to hit 80°C in hot countries or during the summer which would adversely affect performance.

Here's the Anandtech GTX 780 review:

Click here

It shows the average boost clock speeds for both the GTX 780 and the Titan. I'm seeing a maximum difference of 3.8% for the 780 and 2.7% for the Titan between the highest and lowest clock speeds.

If PCGH is seeing 10% differences, then they need to rethink their test environment, methodology, or configuration.

If we continue down this path, does the reviewer have to cater to every weakness of the cards involved? I think both settings should be tested, default and max targets, both under realistic conditions. That way every possible case is covered.

This would certainly be better than cutting boost completely out of the picture.

How does Anandtech benchmark? Open bench table? How long does the average benchmark last, and how long until the next one starts - is there time to cool down in between? There certainly can be a discrepancy between benchmarking and gaming on some systems when it comes to Boost (2.0). Most reviewers don't consider this at all! I blame Nvidia for setting their temperature target too low and/or for having such a conservative fan curve and/or a crappy cooler.

I have no idea how Anandtech benchmarks, but given that they aren't seeing anywhere near a 10% deviation in average clock speeds for the Titan and GTX 780, there must be some salient differences in their test methods compared to HT4U and PCGH.

I wouldn't cut Boost completely out of the picture, but I would test both - default and max target - in a realistic scenario. And that means heat this baby up, so that realistic temperatures apply just like they would during normal gaming.

We are all aware though that electrical components such as video cards perform and run better in cooler environments. That's why we spend lots of money on various cooling solutions.

Anyone who can afford a GTX 780 or Titan should know not to run the card in a cramped, warm environment with little ventilation.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
No, it is the responsibility of NV/AMD and their AIBs. If the GTX 780/Titan's stock cooler cannot cope with the demands of Boost 2.0, then reviews should report those findings. Reviewers often report that while the cards can sustain a 1GHz boost at first, over extended gaming they overheat and the 780/Titan cannot hold those same levels over time.

Here is the Anandtech review of the GTX 780 with the Titan for comparison:

Click here

Look at the average clock speeds. There isn't nearly as much difference as what boxleitnerb was reporting, which to me indicates that Anandtech is running the benchmarks and tests in a cool environment, as they should.

There are many other titles HT4U could have included where the 680 is not pulling its weight. So many people defended the 680 vs. the 7970GE over the last 12 months, and after the 770 came out the focus shifted to 770 vs. 7970GE. Looking back at 680 vs. 7970GE, when the 7970GE outperforms the 680, it does so by large margins; it's rare for the 680 to outperform the 7970GE by 30-35%.

It's obvious from looking at that benchmark that NVidia hasn't released optimized drivers for that particular game, as SLI isn't working.

I agree though that AMD seems to be better overall when it comes to optimizing their drivers over a wider segment of games. NVidia seems to focus mostly on the games that draw the most attention, or that most PC gamers will play.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
Until we know how Anandtech tests, this discussion is a little pointless. From first-hand experience with stock Titans in an open case and watercooling for the CPU (external rad), I didn't reach full clocks all the time during gaming (not benchmarking!). So it doesn't take a worst-case scenario for the Titan to "throttle".
In most if not all reviews, the Titan reaches exactly 80°C under load, which brings me to the conclusion that there is less boosting going on in those cases. So are they all doing something wrong?

Point is, benchmarks are relatively short and it takes time for the GPU to heat up and cool off. If it starts at 30°C from idle, clocks are high and once it hits 80°C, clocks will go down. But by then, the benchmark might already be over or partially over. Thus you get higher clocks than you would get in "real" gaming sessions for extended periods of time. Of course with aggressive cooling you might keep this from happening, but why should the reviewer test a best case?
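A warm-up phase before measuring would control for exactly that. A minimal sketch of such a harness, with run_benchmark() as a hypothetical stand-in for launching the test scene (the 10-minute heat-soak is my assumption, not any reviewer's documented procedure):

import random
import time

def run_benchmark():
    """Hypothetical stand-in for one benchmark pass; returns average fps.

    A real harness would launch the game scene here; this dummy value
    just makes the sketch runnable.
    """
    return 60.0 + random.uniform(-1.0, 1.0)

WARMUP_SECONDS = 600  # ~10 min of load so the GPU reaches steady-state temps

def heat_soaked_fps(passes=3):
    """Throw away results until clocks/temps stabilize, then measure."""
    deadline = time.monotonic() + WARMUP_SECONDS
    while time.monotonic() < deadline:
        run_benchmark()                     # warm-up passes, discarded
    scores = [run_benchmark() for _ in range(passes)]
    return sum(scores) / len(scores)

print(f"Steady-state average: {heat_soaked_fps():.1f} fps")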

When Nvidia did badly at 8xMSAA, it was still tested. When AMD did badly at tessellation, it was still tested. And now that Nvidia does badly at higher temperatures (which can occur), that should be tested as well. We should not let IHVs dictate testing conditions so that their products come out favorably.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Sorry, that's no secret - you can read it in the BI chapter of the article ;)

It's the Finkton Proper chapter, and the scene directly in the elevator after level loading. A screenshot is shown in the article too.

OK thanks. I don't see how an elevator scene would be representative of actual gameplay though, seeing as you spend a small amount of time in elevators in the game.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I have no problem with NV not being shown in the best light. I have a problem with people who threadcrap with OT comments like you did. And when this leads to complaints, you indirectly label the posters fanboys. I will have none of that. Stay on topic. And btw... if I had the problem you speak of, would I have pointed out the things I did? Certainly not.

You are still not following my point. You continue to see bias. You made a statement that BI runs faster on NV cards in every benchmark on the web. I responded clearly that this is false unless you start comparing cards from different price classes. If you do that, then yes, your statement is true. In other words, if you compare the $400 GTX 770 to the HD 7970 GE, they are roughly equal; your statement only holds once we start comparing cards in different price categories, for example the $400 7970 GE to, say, the $650 GTX 780. Instead of reading my comments and following the context of the erroneous statement you made, you continue to focus on price/performance and make up things out of thin air, such as me thread-crapping, without realizing that the statement you made was 100% wrong. The only reason I brought up prices is that at similar price levels, NV's cards aren't faster in BI.

Generally speaking, it's a good idea to look at 5-6 reviews that benchmark a particular title; if they are all using the latest drivers, you can spot a trend. Comparing numbers directly is not always possible, because some reviewers may enable DOF in BI and some will not, etc. It's often difficult to find extensive reviews with so many video cards; this review at least lets those with older cards see how much faster the new generation is. I think Computerbase does an excellent job of including many cards in their reviews. You can often just use a previous-gen card to estimate where your old card would end up (say, take the GTX 580's result and remove 15% if you have a 480).
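As a worked example of that rule of thumb (the 15% figure is from above; the fps value is purely illustrative):

# Rough estimate of an untested card from a tested relative.
gtx580_fps = 50.0                           # whatever the review measured
gtx480_estimate = gtx580_fps * (1 - 0.15)   # the "remove 15%" rule of thumb
print(f"Estimated GTX 480: ~{gtx480_estimate:.0f} fps")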
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
In the context of the HT4U review and their BI benchmark being an outlier, my statement is right. I never mentioned anything about similar price levels or price/perf. Therefore, when I said "Nvidia cards" I meant in general, all cards: product stack vs. product stack, fastest vs. fastest. Thus my statement was not erroneous; you just didn't understand it correctly because you ignored the context in which it was made and meant.

Let's quit this bickering now, no good will come of it. Just stay on topic.
 

The Alias

Senior member
Aug 22, 2012
646
58
91
In the context of the HT4U review and their BI benchmark being an outlier, my statement is right. I never mentioned anything about similar price levels or price/perf. Therefore, when I said "Nvidia cards" I meant in general, all cards: product stack vs. product stack, fastest vs. fastest. Thus my statement was not erroneous; you just didn't understand it correctly because you ignored the context in which it was made and meant.

Let's quit this bickering now, no good will come of it. Just stay on topic.

What's the point in comparing fastest vs. fastest when that's not even relevant to most consumers? You should compare cards in their own respective price brackets.
 

omeds

Senior member
Dec 14, 2011
646
13
81
Erm, of course the faster cards will cost more... :whistle:

Product stack vs product stack is perfectly valid.
 

The Alias

Senior member
Aug 22, 2012
646
58
91
Erm, of course the faster cards will cost more... :whistle:

Product stack vs product stack is perfectly valid.

But you can't compare a 7970 vs. a 780 and say Nvidia is faster in this game, because when you say that, most people (me included) take it as Nvidia being better in all price brackets. Which may or may not be true, BUT comparing two cards in two separate price brackets is NOT the way to find that out.