[H] HD 7970 Dual-X Review

Page 6

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
Not sure. I just used Precision, and other than removing the brief period of voltage fluctuation you can see in my Afterburner graphs above, it still only gave me two states. Idle was the normal 0.987V (different from 1.1V idle with Afterburner open), with load at 1.175V.

Also, Precision got updated to 3.0.2, and the voltage tuning app did too. It doesn't go to 1.2xV anymore, tops out at 1.175V now.
 

Elfear

Diamond Member
May 30, 2004
7,169
829
126
Misleading. Anyone who buys a 680 is guaranteed a base clock of 1006MHz for a reference model. There isn't any guarantee of how much boost there is for any given application though. You are guaranteed a minimum and that is all that's warranted. Anything else is a bonus.
By what I am reading, and by these standards, the GTX680 can never really be benched according to some here...

It's not misleading, Keys, because you're misunderstanding my point. I don't think users should have some kind of recourse with Nvidia if their cards don't boost as high as someone else's. Nvidia doesn't guarantee anything beyond 1006MHz, as you've said. I'm saying it makes benchmarking difficult because stock 680s are now YMMV. Out-of-the-box performance is variable. Just like overclocking...

GPU Boost is a nice feature but it would be nice if you could disable it.

I get your point of view. The 680 is YMMV above 1058MHz, but all of them should boost to at least 1058MHz. I think it's reasonable to expect some 680s to get to 1084-1110MHz (or whatever the proper 13MHz increments are). Some might not. I don't think it's fair to say that reference 680s will go to 1200MHz frequently. That seems like an outlier in HardOCP's review.
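Those "proper 13MHz increments" can be sketched out. This is just an illustrative snippet assuming the roughly 13 MHz bin size reported for GPU Boost; `boost_bins` is a made-up helper name, not anything from NVIDIA's tools:

```python
# Hypothetical sketch of GPU Boost bins: clocks step up from the
# 1006 MHz base in ~13 MHz increments. How many bins a given card
# reaches under load is the YMMV part.
BASE_MHZ = 1006
BIN_MHZ = 13

def boost_bins(max_bins=10):
    """Return the ladder of clock speeds a card can step through."""
    return [BASE_MHZ + n * BIN_MHZ for n in range(max_bins + 1)]

print(boost_bins())
# 1058 (the advertised boost clock) sits 4 bins up; 1084, 1097,
# and 1110 are 6, 7, and 8 bins above base.
```

That makes the numbers in this thread line up: 1084-1110 are plausible bins, while 1200 would be almost 15 bins above base.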

Look at ComputerBase. Their reference card hovered between 1033 and 1097MHz.

I think you're right that a majority of cards will boost somewhere less than 1200Mhz (probably closer to 1100) but it still makes comparisons difficult. I like the GPU Boost feature but I wish it could be disabled for more meaningful benchmarks. That way reviewers could show baseline at 1006Mhz, stock with GPU Boost to show what most users will experience, and oced to the max.

If you actually look at the benchmarks, for the games you personally play, it will become clear that in games where the HD7970 is faster, 1097MHz isn't enough to catch up to the 7970, while in games where the GTX680 is tangibly faster, it would require a 20% overclock on the 7970 just to match the 680. So even despite dynamic OCing, the conclusion doesn't really change. In games where the HD7970 is faster (Metro 2033, Crysis 1/Warhead, Anno 2070), an overclocked 680 won't ever beat an overclocked 7970. In games where the 680 is faster, an overclocked 7970 won't beat an overclocked 680 either.
Problem is that if some reviews have benchmarks of a 680 running at a max boost of 1058MHz and others at 1200MHz (it seems only [H]'s went that high of the 5-6 reviews I read), then results could vary by 10-13%. The games where the 680 had a sizable lead could shrink to a few percentage points depending on the card reviewed. Same goes for the games the 7970 was leading in. If it was benched against a card that only boosted to 1058MHz, it might not accurately portray the performance the average end-user would see.

Because the GTX680's performance varies more, we can just take more reviews into consideration to reduce the margin of error.
That is probably the best way to get an accurate picture. If you can find enough reviews that test the same games at the same settings, you should be able to figure out how the two cards compare.
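The "average many reviews" idea can be sketched in a few lines. The numbers below are entirely made up for illustration (per-review ratios of 680 fps to 7970 fps in one game); nothing here comes from actual review data:

```python
# Made-up per-review performance ratios (GTX680 fps / HD7970 fps).
# Averaging across reviews dampens golden-sample outliers; the spread
# shows how much GPU Boost variance moves results review to review.
ratios = [1.08, 1.12, 1.05, 1.15, 1.09]

mean = sum(ratios) / len(ratios)    # central estimate across reviews
spread = max(ratios) - min(ratios)  # rough review-to-review variation

print(f"mean ratio: {mean:.3f}, spread: {spread:.2f}")
```

With more reviews in the sample, a single high-boosting card like [H]'s moves the mean less and less.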

In your case, this shouldn't even be an issue since you have a huge overclock on your 7970 that a 680 won't match on a reference cooler. It'll be a long time before anything much faster than an HD7970 @ 1375MHz comes to market. You might even be able to skip the HD8970. Think about it: if AMD increases SPs to 2304 and raises clocks to 1200MHz, that would only match your card right now. :D
Ya, I'm not too worried about switching now. I'll wait for the next gen to come along and see what both teams have to offer. Hopefully it's a nice increase over the current gen.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
I think you're right that a majority of cards will boost somewhere less than 1200Mhz (probably closer to 1100) but it still makes comparisons difficult. I like the GPU Boost feature but I wish it could be disabled for more meaningful benchmarks.

In theory it could, but it wouldn't represent stock performance anymore because they all boost to at minimum the published boost clocks it seems.

All you have to do is find your max default boost (1097/1098 for mine for example) and set your boost offset to the difference between that and 1006. (so -91 or -92 for mine). That should lock it down to 1006 Mhz, but no card is going to perform like that and you've in effect underclocked it by doing that. Looks like you'll be shaving off around 10% of the average clock speed if you do that.
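That offset arithmetic can be written out explicitly. A minimal sketch of the lock-to-base trick described above, using the example values from this post (`lock_offset` is just an illustrative name, not a real tool's API):

```python
# Sketch of the lock-to-base trick: set the boost offset to the
# difference between the base clock and the card's max observed boost,
# which should pin the card at 1006 MHz under load.
BASE_MHZ = 1006

def lock_offset(max_boost_mhz):
    """Negative offset that should hold the card at its base clock."""
    return BASE_MHZ - max_boost_mhz

print(lock_offset(1097))  # -91, for a card that boosts to 1097 MHz
print(lock_offset(1098))  # -92, for a card that boosts to 1098 MHz
```

As noted above, doing this effectively underclocks the card by about 10% relative to its out-of-the-box behavior.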
 

Elfear

Diamond Member
May 30, 2004
7,169
829
126
In theory it could, but it wouldn't represent stock performance anymore because they all boost to at minimum the published boost clocks it seems.

All you have to do is find your max default boost (1097/1098 for mine for example) and set your boost offset to the difference between that and 1006. (so -91 or -92 for mine). That should lock it down to 1006 Mhz, but no card is going to perform like that and you've in effect underclocked it by doing that. Looks like you'll be shaving off around 10% of the average clock speed if you do that.

I wouldn't want GPU Boost to be disabled for the whole review as that wouldn't show performance as the end-user would see it. I think separate benchmarks showing baseline and then with GPU Boost would be handy along with some graphs of average boost levels while gaming. It would give a clearer picture of the average card's performance.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
I wouldn't want GPU Boost to be disabled for the whole review as that wouldn't show performance as the end-user would see it. I think separate benchmarks showing baseline and then with GPU Boost would be handy along with some graphs of average boost levels while gaming. It would give a clearer picture of the average card's performance.


When I get home this afternoon I'll test the theory and see if that locks the card to the minimum clock, just to see.
 

Zargon

Lifer
Nov 3, 2009
12,218
2
76
All benchmarks have a variation of 1-3% as a result of margin of error. NV's GPUs boost out of the box; that is how the consumer will get those cards. Therefore disabling GPU Boost is counter to how 100% of GTX680 owners will use their cards.

On the other hand manual overclocking on an HD7970 or GTX680 is not something each user will perform. The fair comparison would entail:

1) Out of the box GTX680 vs. out of the box HD7970 with no manual overclocking at all

OR

2) Manual overclocking on GTX680 vs. manual overclocking on HD7970.

The baseline performance for any product is the performance the user gets after taking that component out of the box and putting it in his/her system. Any speed above that which results from manual adjustments is overclocking.


GPU boost is a random variable being added to a benchmark that makes it unreliable. it's not going to be the same for all cards, let alone in all games, so it simply isn't acceptable to include in a baseline

A baseline is a line that is a base for measurement or for construction; see datum (calculations or comparisons) or point of reference (engineering or science).

a variable you know will cause random changes in performance should not be included in a baseline.

for 680 benches they should be running it with boost disabled, and with boost enabled, and clearly label it. that is all I am saying. nothing more. any resistance to that idea is either ignorance of stats, general ignorance, or fanboyism for the current Nvidia cards.


what ANY USER is going to get out of the box is the stock clock with no boost, as boost speed isn't guaranteed.

Just wait until AMD implements the same feature and then all these arguments will be put to rest. Dynamic overclocking is the future because it allows for a better balance between performance and power consumption.

no it won't, because there will be even MORE variation, leading to less useful benchmarks. it won't fix the 'problem', it will make things even more clouded for the consumer, and seeing how benchers are responding by soaking in the koolaid on this one, it's going to get EVEN WORSE




that being said, I will state the obvious AGAIN, like I did in the 680 review thread.

the 680 is hands down king. comparable/better performance, at a lower heat and power threshold. the tables have turned on AMD after the last 2 gens

since perf/$ with heat/power in consideration is how I buy cards.

but since I am knee deep in 69xx cards right now, I am going nowhere
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
GPU boost is a random variable being added to a benchmark that makes it unreliable. it's not going to be the same for all cards, let alone in all games, so it simply isn't acceptable to include in a baseline

Of course it can and real simple:

Default vs Default

Over-clocked vs Over-clocked.
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
I guess we'll chalk it up as the downside of having a vastly superior design when compared to powertune.

Do you have any idea what you, or we, are talking about? :confused:

amd fan bois reaching

How about adding to the discussion, troll.

When I get home this afternoon I'll test the theory and see if that locks the card to the minimum clock, just to see.
Please do. I'd be curious to see what you find.

I'm going to try this too, I'm curious.

Edit: -100 offset locked me to 1006 for almost an entire 3DMark run, with what I guess is a boost to 1013.

boostdisabled.png



Battlefield 3 gaming was locked at 1006.

bf3boostdisabled.png
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
Of course it can and real simple:

Default vs Default

Over-clocked vs Over-clocked.

The 680 default will vary, is what they are discussing. AKA a HardOCP golden sample vs. a card that just hits the minimum boost requirement.

If people want good card information this is definitely adding more work onto reviewers.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
The 680 default will vary, is what they are discussing. AKA a HardOCP golden sample vs. a card that just hits the minimum boost requirement.

If people want good card information this is definitely adding more work onto reviewers.

Understood, but it can't be shut off. What are we discussing?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
for 680 benches they should be running it with boost disabled, and with boost enabled, and clearly label it. that is all I am saying. nothing more. any resistance to that idea is either ignorance to stats, general ignorance, or fanboyism for the current Nvidia cards.

OK look at some of the games where GTX680 leads. Does it look like a 3-4% variation will change the conclusion of how fast GTX680 is in those games?

45150.png


It doesn't change anything. Where the GTX680 leads, the lead is massive. Just like CPUs should be tested with CPU boost, GPUs should be tested with all the features turned on because that's what the average end user will get out of the box.

It's no different than how the Nissan GTR's engine is not exact out of the factory, since the engine is hand assembled. No 2 Nissan GTRs will ever produce the exact same horsepower. Does it change the conclusion that the Nissan GTR is insanely fast? No, it doesn't. If you bought a Nissan GTR and it produced 550hp but your buddy's GTR put down 540hp, and Nissan rated the GTR at 540hp, are you going to complain? NV rates the GTX680 at 1058MHz and everything after is pure bonus.

If the HD7970 could automatically scale from 925MHz to 1.2GHz, let it scale. Manual overclocking is a far bigger luck of the draw than GPU Boost is because the % variation among manually overclocked HD7970 cards appears to be far greater than the % variation in GPU Boost among GTX680s. So if anything, the GTX680's overclocking is more consistent. Some 7970s crap out at 1125-1175MHz, while almost all 680s will do 1200MHz, easy.

There was almost never a baseline performance to begin with. So that statement is a fallacy. Our systems are not identical to those used in reviews. At best we can get an idea of how particular hardware will perform in a particular system, but we cannot expect identical results since there is still some variation. The variation in GTX680 reviews is not enough to change the conclusion whatsoever.

That's why we often look at average performance to minimize outliers. By looking at a large sample size of reviews, we minimize outliers in reviews.

Do you have statistical data to show that the variation in the GTX680's GPU Boost materially impacts the conclusion of how the GTX680 performs against the HD7970? It would only be imperative to disable this feature if there were a material impact on the actual user experience that GTX680 cards bring. However, reviews consistently show that the GTX680 easily outperforms a stock HD7970 and that an overclocked HD7970 cannot outperform an overclocked 680.

If you look at 20-30 reviews, nothing changes about the final conclusion. If you are concerned with exact testing methodology, using the scientific method by keeping all variables constant, then you'd have a problem with the GTX680. We aren't doing a lab experiment here to achieve perfection or isolate variables to look at cause and effect relationships or derive a formula with some relationships. If someone asked you what card you'd recommend right now at $500, what would you say HD7970?

Videocard testing is itself subjective since the testing methodologies, particular parts of the games tested, the hardware used in each review themselves vary among reviewers. At best we can extrapolate how cards stack up to each other in that particular review and then compare those results across many other review samples to derive a broader conclusion.

If anything, the GTX680 signals that reviews should do more manual game benchmarking and rely less on canned benchmarks. So, in effect, more than ever the only way to tell how well a GPU runs is by actually running 2-10 minute benchmark sequences.

Any slight GPU boost variations will be negated over a prolonged testing sequence.

Given that across 20 or so reviews the conclusion is very much still the same regarding GTX680 vs. HD7970 or OCed 680 vs. OCed 7970, it appears you are more upset about the consistency of benchmarking being YMMV.

I don't think we sit there and scrutinize reviews and say, "Oh, in this review Crysis 2 got 67 fps and in this review the card got 63 fps." No one expects exact performance. We look to reviews for guidance, not mathematical certainty and proof. This isn't the Large Hadron Collider, where the accuracy of results is paramount.

It's already assumed that all reviews on the Internet are +/-5% within a margin of error.

Your dilemma is 2-fold:

- You are bothered by Dynamic OCing of GTX680 since not every GTX680 will Boost the same way

- But at the same time, not every HD7970 can operate at 1280mhz with overclocking either. Since manual overclocking is not guaranteed either, you are still left with a variation in what you'll get as an end user.

Therefore, it's unrealistic to expect the utmost accuracy in these types of reviews. Just use your judgement, read a lot of reviews, and see if in general the card in question is "faster" or "slower".
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
The 680 default will vary, is what they are discussing. AKA a HardOCP golden sample vs. a card that just hits the minimum boost requirement.

If people want good card information this is definitely adding more work onto reviewers.

I believe good information is out there and the market will decide.
 

thilanliyan

Lifer
Jun 21, 2005
12,085
2,281
126
There isn't any guarantee of how much boost there is for any given application though. You are guaranteed a minimum and that is all that's warranted. Anything else is a bonus.

Going by that, the boost IS overclocking then, so WHY NOT compare an OCed 7970 to a boosting 680?

If you want to look at the 680 as just downclocking itself rather than upclocking, why aren't all cards rated at the same maximum? Instead we have the same MINIMUM clock which is then raised, which means it is overclocking.

Just sayin.

I personally think the 680 is the better card, especially if it is cheaper...but this boost thing has really muddied the waters, and has given us lots more to argue about. :D

EDIT: Just a thought about the variance in 7970 overclocks. What is the general average on stock volts? 1100MHz? 1150MHz? If there is something like a consistent average OC for 7970s, that value should be included in benchmarks too.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Going by that, the boost IS overclocking then, so WHY NOT compare an OCed 7970 to a boosting 680?

Out of the box is out of the box and there is NO getting around that fact, AND the fact that both 7970 and 680 can be o/c'd on top of their out of the box settings. That is all that can be said fairly IMHO. When AMD does the same thing as Nvidia did with the 680, I'll say the same thing.
 

Zargon

Lifer
Nov 3, 2009
12,218
2
76
Of course it can and real simple:

Default vs Default

Over-clocked vs Over-clocked.

nvidia lists the base clock at 1006, so that's stock then, right? :)

It doesn't change anything. Where the GTX680 leads, the lead is massive. Just like CPUs should be tested with CPU boost, GPUs should be tested with all the features turned on because that's what the average end user will get out of the box.

it's already been pointed out how CPU turbo boost and GPU boost are different, i.e. locked vs. variable

It's no different than how the Nissan GTR's engine is not exact out of the factory, since the engine is hand assembled. No 2 Nissan GTRs will ever produce the exact same horsepower. Does it change the conclusion that the Nissan GTR is insanely fast? No, it doesn't. If you bought a Nissan GTR and it produced 550hp but your buddy's GTR put down 540hp, and Nissan rated the GTR at 540hp, are you going to complain? NV rates the GTX680 at 1058MHz and everything after is pure bonus.

no, but if it's supposed to have 540whp, then the test should be run with the 540hp one that most people are likely to get

car makers were famous for sending 'factory freaks' to tests, and consumers got pissed when 90% of the cars on the lot wouldn't produce the same times in the 1320. so they largely quit.

if you take [H]'s card for example, it was significantly faster than the others being tested. so if they cherry-pick that review and push it everywhere, you are likely not buying what is being advertised.

If the HD7970 could automatically scale from 925MHz to 1.2GHz, let it scale. Manual overclocking is a far bigger luck of the draw than GPU Boost is because the % variation among manually overclocked HD7970 cards appears to be far greater than the % variation in GPU Boost among GTX680s. So if anything, the GTX680's overclocking is more consistent. Some 7970s crap out at 1125-1175MHz, while almost all 680s will do 1200MHz, easy.

MOST 680s do 1200 now?

Your dilemma is 2-fold:

- You are bothered by Dynamic OCing of GTX680 since not every GTX680 will Boost the same way

- But at the same time, not every HD7970 can operate at 1280mhz with overclocking either. Since manual overclocking is not guaranteed either, you are still left with a variation in what you'll get as an end user.

Therefore, it's unrealistic to expect the utmost accuracy in these types of reviews. Just use your judgement, read a lot of reviews, and see if in general the card in question is "faster" or "slower".

sorry, i yanked a ton and won't respond due to time.

for the 2-fold dilemma, there is another factor: when seeing an OC review for the 7970, it's clear it's OC'd.

too many people are taking #'s from outliers like the 1200 @ [H] and calling that a normal stock card, even though it's 5-10% faster than other reviews out there.

yes, it's just 10%, but where do we draw the line? hopefully while we can still see where it should be.

of course, real manual benchmarks vs. canned ones are a whole OTHER slice of pie


I believe good information is out there and the market will decide.

yeah, that doesn't work most places; why would it work here?
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
Please do. I'd be curious to see what you find.

I had to play with it to find the right offset, but on mine -90 makes it idle at 993MHz, then sit at 1006MHz the entire time it's under load. -92 made it act a little weird and it seemed stuck at 980MHz the whole time, but it does appear you can lock it to 1006 under load if you want. The auto-voltage seemed a little confused and was giving it more voltage than I would expect (1.162 or 1.175V), but the clocks were locked down.
 

thilanliyan

Lifer
Jun 21, 2005
12,085
2,281
126
Out of the box is out of the box and there is NO getting around that fact,

Fair enough, but out of the box is YMMV it seems for the 680, although probably not a drastic difference between the best and worst.

So how do you make a fair comparison with YMMV muddying it all? There really should be more control of the boost feature.