Sapphire 7970GE Toxic Review


Zanovar

Diamond Member
Jan 21, 2011
3,446
232
106
We need to wait for sites that do a better job with real-world results to find out about the power usage and noise. 3DMark with the mic 2cm from the top of the fan, measuring power draw at the wall, isn't the best way to do it. TechPowerUp will measure the noise at 1m while gaming. They will also measure the card's own power usage: min, max, max gaming, etc. Then we'll have a better idea.

I'm pretty certain that the 7970GE @1200MHz will not sip power under load. I'd be real surprised though if the Dual-X cooler isn't actually quite quiet in real world usage.
Yeah, I hear ya. Best cooler I've messed about with; shockingly quiet under load.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
KitGuru has also done a review. Updated the OP.


From the review:
<snip>
Until today, the incredible KFA2 GTX680 Limited OC Edition claimed the ultimate single GPU performance spot, however in the majority of the real world game testing, the Sapphire HD7970 6GB Toxic Edition managed to outperform the overclocked GTX680. The reference clocked GTX680 doesn’t even factor into making a viable challenge.

The card does use A LOT of power. If they had tried to make this card on 40nm it would have been ugly (uglier?). It is nowhere near as loud as in TT's review. Measuring at 1m instead of 2cm makes a world of difference, and is much more realistic.

Anyway, this back and forth competition for the top spot (single GPU, of course) is entertaining. I wonder if nVidia will have a response? Or will it simply be to point out they are more efficient? Which is always a good thing. Ask AMD for the last couple of years. :p
 

Bobisuruncle54

Senior member
Oct 19, 2011
333
0
0
KitGuru has also done a review. Updated the OP.


From the review:
<snip>


The card does use A LOT of power. If they had tried to make this card on 40nm it would have been ugly (uglier?). It is nowhere near as loud as in TT's review. Measuring at 1m instead of 2cm makes a world of difference, and is much more realistic.

Anyway, this back and forth competition for the top spot (single GPU, of course) is entertaining. I wonder if nVidia will have a response? Or will it simply be to point out they are more efficient? Which is always a good thing. Ask AMD for the last couple of years. :p

I agree, but the quality of the case you have could make quite a bit of difference as to how much noise you hear from the card at that distance.

Doesn't Nvidia technically retain the crown here though? Or am I mistaken that third party overclocked cards don't count? It is one hell of a card though, if I was thinking of buying a new high end card then it would probably have to be this one.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Doesn't Nvidia technically retain the crown here though? Or am I mistaken that third party overclocked cards don't count? It is one hell of a card though, if I was thinking of buying a new high end card then it would probably have to be this one.

If you want to exclude third party OC'ed models, the stock ref AMD 7970 GE is still faster than the stock ref GTX 680 by a hair.

AMD took that crown; all nVidia has to do is release a new stock "OC'ed" ref model or increase the GPU Boost ceiling through a software patch.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
The difference between the 680 and the 7970GE / 7970OC is ridiculously small; it's great that the competition is so close. I don't think it's been this close in terms of performance in a long time...

Nvidia has definitely done a better job at winning consumer mind share though; they did a great job marketing the cards and getting the word out in every way possible.
 

The_Golden_Man

Senior member
Apr 7, 2012
816
1
0
Looking at KitGuru's review, I'm yet again struck with fear at what programs like Furmark can draw out of a card. I think Furmark hammered the last nail into the coffin of my dead GTX 570.

I would not recommend anyone run this kind of stability test on their card. OCCT's GPU test is almost as bad as Furmark, if not just as bad.

Instead, use demanding games for stability testing graphics cards.

[Image: power-consumption6.png]
 

Dark Shroud

Golden Member
Mar 26, 2010
1,576
1
0
If you want to exclude third party OC'ed models, the stock ref AMD 7970 GE is still faster than the stock ref GTX 680 by a hair.

AMD took that crown; all nVidia has to do is release a new stock "OC'ed" ref model or increase the GPU Boost ceiling through a software patch.

The bigger hurdle is the GTX 680's thermal limit. It doesn't matter how high the card can auto-clock when the AMD card can go higher because it can run hotter.

I don't mind higher heat & power use when I get performance from it.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
The difference between the 680 and the 7970GE / 7970OC is ridiculously small; it's great that the competition is so close. I don't think it's been this close in terms of performance in a long time...

Nvidia has definitely done a better job at winning consumer mind share though; they did a great job marketing the cards and getting the word out in every way possible.

I wouldn't say nVidia did anything better in terms of marketing compared to their previous cards. If anything, I'd argue they've done less. Outside of forum gossip (where you never know who's being paid), the tech news push from nVidia for the GTX 680 wasn't as big. They let the product speak for itself.

AMD's own incompetence basically gave them the limelight. nVidia didn't even have to accelerate to a jog to win this race; AMD fell on their face halfway through.

The bigger hurdle is the GTX 680's thermal limit. It doesn't matter how high the card can auto-clock when the AMD card can go higher because it can run hotter.

I don't mind higher heat & power use when I get performance from it.

That's a point to make. A GTX 680 in a stuffy box will suffocate and throttle. The HD 7970 will just take off with its jet-engine cooler.

Trade-offs; people should go for what they prefer.
 

The_Golden_Man

Senior member
Apr 7, 2012
816
1
0
The bigger hurdle is the GTX 680's thermal limit. It doesn't matter how high the card can auto-clock when the AMD card can go higher because it can run hotter.

I don't mind higher heat & power use when I get performance from it.

You can increase the Power Limit on the GTX 670/680 so it does not downclock when passing the 100% power target.

When they reach 70°C they will also downclock. However, it downclocks by a tiny amount, and this can easily be fixed using a custom fan profile. These cards run cool and don't need to be noisy, even with a custom fan profile.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Railven, I disagree completely. Nvidia really hit up social networking very hard with the 680 launch. As an example, I watch various pro gaming YouTube casts from well-known people (Swifty, and some others), and they were all given 680s by Nvidia to hawk during their air time. Nvidia also gave cards to several well-known Starcraft 2 celebrities and other pro gamer personalities. You are very wrong; they made a very strong marketing push, especially with social marketing and well-known YouTube personalities. You can see they got their wares from Nvidia because they have the GTX 680 box that was only given to reviewers, with the Nvidia logo on it (if you've seen it, the reviewers' 680 comes in a very distinctive box that isn't available for sale), and they were paid to mention the cards during air time. I've even asked most of them, and they confirmed the cards were donated by Nvidia marketing.

I don't participate in pro gaming, obviously, but I've seen at least 10 well-known YouTube gamers, Swifty being the most well known, who were given GTX 680s and were paid to mention that they switched to them on their channels. Stuff like that definitely affects the mind share of viewers.

Furthermore, ads for the 680 have appeared on front pages everywhere; hell, I think I've seen ads even on non-gaming, non-PC websites. I saw a 690 ad on MSNBC of all places... They have done a great job, and you are wrong that they let the hardware speak for itself. It was a big effort on their part. AMD has done nothing like this.

Now, all of this said, I'm not saying that Kepler is all marketing with no substance. It is a fantastic card and I really enjoy my 680s in SLI; they are great. However, the combination of AMD messing up in some key aspects and Nvidia making a big marketing push gave Nvidia a clear mind share win.
 
Last edited:

Dark Shroud

Golden Member
Mar 26, 2010
1,576
1
0
You can increase the Power Limit on the GTX 670/680 so it does not downclock when passing the 100% power target.

When they reach 70°C they will also downclock. However, it downclocks by a tiny amount, and this can easily be fixed using a custom fan profile. These cards run cool and don't need to be noisy, even with a custom fan profile.

I'm aware of the power limit tweaks, which is why I only mentioned the thermal limit; I feel that is the real obstacle. I live in Northern Illinois; it's currently 91°F outside, and despite my AC being on inside, my PC case is still in the 80s.

If I ran an Nvidia card it would take a big performance hit trying to stay cool here. I'd have to put a water block on the thing.

I've also seen Nvidia fanboys recommend GTX 670s to people in India, even after the person explained they're from India and heat is a problem for their computing & gaming.
 

The_Golden_Man

Senior member
Apr 7, 2012
816
1
0
I'm aware of the power limit tweaks, which is why I only mentioned the thermal limit; I feel that is the real obstacle. I live in Northern Illinois; it's currently 91°F outside, and despite my AC being on inside, my PC case is still in the 80s.

If I ran an Nvidia card it would take a big performance hit trying to stay cool here. I'd have to put a water block on the thing.

I've also seen Nvidia fanboys recommend GTX 670s to people in India, even after the person explained they're from India and heat is a problem for their computing & gaming.

First of all, a Kepler card can be rated at a 980MHz boost clock, but in most (if not all) cases the actual boost in games will vary from about 1084MHz and up, many close to 1200MHz. So they're already performing way over spec.

Secondly, when it downclocks, it is by a tiny amount. So no matter how you twist and turn this, it will perform better than advertised.

Edit: My two ASUS GTX 670 DirectCU II cards are rated at 980MHz boost. In games one card boosts to 1084MHz and the other to 1124MHz.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Yeah, I hear ya. Best cooler I've messed about with; shockingly quiet under load.

:thumbsup:

Assuming the chip can do 1280-1300MHz, the Dual-X/Toxic cooler can easily handle it. I have lots of room to spare in terms of GPU temps, but my card can't overclock to those speeds.

[Image: sapphiredualxhd7970115g.jpg]


Looks like AMD has the fastest single GPU in the world now:

"The performance results are unquestionably impressive. In 7 out of 11 tests, The Sapphire HD7970 6GB Toxic Edition outperformed the KFA2 GTX680 Limited OC Edition."
 
Last edited:

Bobisuruncle54

Senior member
Oct 19, 2011
333
0
0
If you want to exclude third party OC'ed models, the stock ref AMD 7970 GE is still faster than the stock ref GTX 680 by a hair.

AMD took that crown; all nVidia has to do is release a new stock "OC'ed" ref model or increase the GPU Boost ceiling through a software patch.

OK, but I take it the Toxic was tested at its 1200MHz OC setting, right? That's a 20% boost over the 7970 GE... which I thought lost slightly more than it won when it went head to head with the GTX 680? I guess the reviews I read didn't provide a clear enough picture of overall performance.

I honestly thought that a 75MHz boost over the 7970's original clock speed wasn't enough to make it faster than the GTX 680. I thought it went along the lines of:

GTX 680 faster at stock
GTX 680 slightly faster at the same clock speeds (e.g. both at 1.1GHz)
The 7970 has more overclocking headroom and ultimately wins out, particularly at higher resolutions.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
OK, but I take it the Toxic was tested at its 1200MHz OC setting, right? That's a 20% boost over the 7970 GE... which I thought lost slightly more than it won when it went head to head with the GTX 680? I guess the reviews I read didn't provide a clear enough picture of overall performance.

I honestly thought that a 75MHz boost over the 7970's original clock speed wasn't enough to make it faster than the GTX 680. I thought it went along the lines of:

GTX 680 faster at stock
GTX 680 slightly faster at the same clock speeds (e.g. both at 1.1GHz)
The 7970 has more overclocking headroom and ultimately wins out, particularly at higher resolutions.
The 680 is NOT faster than the 7970 at the same clocks.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
OK, but I take it the Toxic was tested at its 1200MHz OC setting, right? That's a 20% boost over the 7970 GE... which I thought lost slightly more than it won when it went head to head with the GTX 680? I guess the reviews I read didn't provide a clear enough picture of overall performance.

I honestly thought that a 75MHz boost over the 7970's original clock speed wasn't enough to make it faster than the GTX 680. I thought it went along the lines of:

GTX 680 faster at stock
GTX 680 slightly faster at the same clock speeds (e.g. both at 1.1GHz)
The 7970 has more overclocking headroom and ultimately wins out, particularly at higher resolutions.

Yeah, but AMD snuck in their own GPU boost, which I believe gets it to 1050MHz?

The 7970GE won more than it lost (factor in the driver update), but it is so marginal that all nVidia has to do is give a little more stock Boost or release a performance driver.

It really is semantics, but if you're going to draw a line in the sand about excluding products, then you'll have to look at where the line is currently drawn.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Railven, I disagree completely. Nvidia really hit up social networking very hard with the 680 launch. As an example, I watch various pro gaming YouTube casts from well-known people (Swifty, and some others), and they were all given 680s by Nvidia to hawk during their air time. Nvidia also gave cards to several well-known Starcraft 2 celebrities and other pro gamer personalities. You are very wrong; they made a very strong marketing push, especially with social marketing and well-known YouTube personalities. You can see they got their wares from Nvidia because they have the GTX 680 box that was only given to reviewers, with the Nvidia logo on it (if you've seen it, the reviewers' 680 comes in a very distinctive box that isn't available for sale), and they were paid to mention the cards during air time. I've even asked most of them, and they confirmed the cards were donated by Nvidia marketing.

I don't participate in pro gaming, obviously, but I've seen at least 10 well-known YouTube gamers, Swifty being the most well known, who were given GTX 680s and were paid to mention that they switched to them on their channels. Stuff like that definitely affects the mind share of viewers.

Furthermore, ads for the 680 have appeared on front pages everywhere; hell, I think I've seen ads even on non-gaming, non-PC websites. I saw a 690 ad on MSNBC of all places... They have done a great job, and you are wrong that they let the hardware speak for itself. It was a big effort on their part. AMD has done nothing like this.

Now, all of this said, I'm not saying that Kepler is all marketing with no substance. It is a fantastic card and I really enjoy my 680s in SLI; they are great. However, the combination of AMD messing up in some key aspects and Nvidia making a big marketing push gave Nvidia a clear mind share win.

Not saying that I think you're wrong, but from what I've seen, the GTX 680 hasn't really been any MORE present than the GTX 580 was. The GTX 680 wasn't even the focus of one of nVidia's own hosted LANs, while the GTX 580 launched on the heels of the new Call of Duty (come on now, that alone is huge).

If Diablo 3 was plastered with GTX 680 stickers I'd agree with you, but it wasn't. The biggest PC title release to date, and not a word from either vendor jumping on that pony.

It's possibly also the time frame, March vs. November (before Christmas). I'd still say the GTX 580 had far more advertising than the GTX 680 did, again outside forums (where it seems the GTX 680 really succeeded, but throw in the whole "who's paid to post" thing, since we all know nVidia lets their employees post online too).
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
Lets them post?

But Nvidia is not a caliphate! 1st Amendment? LOL

The only ones posting on forums, quoting every possible leak no matter how obviously fake, and later complaining about Kepler hype, are kooks like us :D
 

Dark Shroud

Golden Member
Mar 26, 2010
1,576
1
0
LOL 6GB . . . that's not gonna get used up anytime soon.

Skyrim, The Witcher 2, Battlefield 3, and Max Payne 3 all already use over 2GB at their max settings. Start adding in texture packs and other mods and yes, it will jump higher, especially for those of us with higher-than-1080p monitors.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
LOL 6GB . . . that's not gonna get used up anytime soon.
It's not that you would need 6GB, but that you may need more than 3GB. For some people running mods at insane multi-monitor resolutions, 3GB may not be quite enough in some cases.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
OK, but I take it the Toxic was tested at its 1200MHz OC setting, right? That's a 20% boost over the 7970 GE... which I thought lost slightly more than it won when it went head to head with the GTX 680? I guess the reviews I read didn't provide a clear enough picture of overall performance.

I honestly thought that a 75MHz boost over the 7970's original clock speed wasn't enough to make it faster than the GTX 680. I thought it went along the lines of:

GTX 680 faster at stock
GTX 680 slightly faster at the same clock speeds (e.g. both at 1.1GHz)
The 7970 has more overclocking headroom and ultimately wins out, particularly at higher resolutions.

The "stock" GE model boosts to 1050MHz and is a little faster overall than the standard 680s the reviewers had. This one is faster than the fastest 680, which boosted to over 1200MHz in KitGuru's review. Again, by a bit.

Obviously, there's no current limiting or shenanigans like sensing benchmark .exe's and throttling going on with this card. It can use, and handle, massive amounts of power!
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
I'm aware of the power limit tweaks which is why I only said the thermal limit because I feel that is the real obstacle. I live in Northern Illinois, it's currently 91F out side and despite my AC being on inside my PC case is still in the 80s.

If I ran an Nvidia card it would take a big performance hit trying to stay cool here. I'd have to water block the thing.

I've also seen Nvidia fanyboys recommend GTX 670s to people in India. Even after the person explained they're from India and the heat is a problem for their computing & gaming.

I'm in FL; the other day it was 98°F. I have no GTX 670 heat problems to speak of.