HEXUS: GTX 680 vs. HD 7970 @ same clocks


LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81

That even though power consumption increases exponentially on the 7970 when overclocking, while it increases linearly on the 680, it still doesn't limit the 7970's ability to overclock to very similar frequencies (in absolute terms; as a percentage the 7970 overclocks higher). But while they overclock to similar frequencies, the 7970 gains more from it. I don't think what I wrote was difficult to understand if you were paying attention.
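For context on why overvolted overclocking gets expensive so fast: to first order, CMOS dynamic power scales with voltage squared times frequency, so a clock-only overclock (the voltage-locked 680) raises power roughly linearly, while an overclock that also raises voltage (the 7970) pushes power up far more steeply - closer to cubic in frequency than strictly exponential. A back-of-the-envelope sketch, not a measured model:

```latex
% First-order CMOS dynamic power (a rough approximation, not measured data):
P_{\mathrm{dyn}} \approx \alpha\, C\, V^{2} f
% clock-only OC (V fixed):        P_dyn grows roughly as f
% overvolted OC (V rises with f): P_dyn grows roughly as f^3
```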
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
Amen, too much fanboy mudslinging. Here's to a totalitarian crackdown where, unless you own either the GTX 680 or the HD 7970 (with proof, of course), you can't post about them. It'd stop a lot of this nonsense.

Regarding the article, there's nothing surprising. Games that perform better on one manufacturer's card will still perform better after these minor overclocks (8%). It's when you push the cards to their limits that you see the strengths of the architectures.

Seriously, this. People around here defending this Boost BS till their last dying breath, with so much "scientific analysis", and they don't even own the card, lol.

Can we get a separate subforum for 79xx/6xx owners, entry by proof of purchase only, for civilized discussion and comparison based on experience? Thanks. :cool:
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I like the idea of GPU boost -- in the context of TDP and acoustics. Why leave it on a forced setting like in the past, when the hardware and software can dynamically change things and offer more performance for the gamer while staying under the TDP and keeping the card cool and quiet?

Sounds like a no-brainer.
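Conceptually, the boost logic is just a feedback loop that trades spare power budget for extra clock bins. A minimal sketch in Python, using made-up GTX 680-style numbers (13 MHz bins, 195 W budget) rather than NVIDIA's actual GPU Boost algorithm:

```python
# Minimal sketch of a TDP-bounded boost loop -- hypothetical values,
# not NVIDIA's actual GPU Boost algorithm.

TDP_WATTS = 195.0               # power budget the loop must stay under
BASE_MHZ, MAX_MHZ = 1006, 1110  # floor and ceiling of the boost range
STEP_MHZ = 13                   # one "boost bin"

def boost_step(clock_mhz: int, measured_watts: float) -> int:
    """Return the next clock: step up under budget, step down over it."""
    if measured_watts < TDP_WATTS and clock_mhz + STEP_MHZ <= MAX_MHZ:
        return clock_mhz + STEP_MHZ   # headroom left: take another bin
    if measured_watts > TDP_WATTS and clock_mhz - STEP_MHZ >= BASE_MHZ:
        return clock_mhz - STEP_MHZ   # over budget: shed a bin
    return clock_mhz                  # at a limit, or right on budget

# e.g. drawing 170 W at the 1006 MHz base clock -> next tick offers 1019 MHz
print(boost_step(1006, 170.0))
```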
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
I like the idea of GPU boost -- in the context of TDP and acoustics. Why leave it on a forced setting like in the past, when the hardware and software can dynamically change things and offer more performance for the gamer while staying under the TDP and keeping the card cool and quiet?

Sounds like a no-brainer.

It may sound like a no-brainer, but in practice and experience it's not. Cards have downclocked at idle, where it matters most, for years and years now - so that is nothing new.

These cards are cool and quiet whether they are going full out or not. GPU boost is a PITA at times.

Never mind that you can't control voltages on these cards anymore. GPU boost/lack of voltage control is pretty meh all round unless you are a neophyte who just plugs their card in and goes.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
"Unless you are a neophyte"?
"It sounds like a no brainer, but in practice it's not"?
This isn't yet another "unfair" Nvidia advantage argument is it? Cause as the years progress, it seems Nvidia is piling up the "unfair" advantages over it's competition. I can't see that as a bad thing for Nvidia customers. Sorry.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
"Unless you are a neophyte"?
"It sounds like a no brainer, but in practice it's not"?
This isn't yet another "unfair" Nvidia advantage argument is it? Cause as the years progress, it seems Nvidia is piling up the "unfair" advantages over it's competition. I can't see that as a bad thing for Nvidia customers. Sorry.

I guess that would depend on the customer? A few enthusiasts being upset at the lack of total control over their overclocks/settings seems to be a common complaint. Validity is subjective, but hey - it's still a concern for some.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
"Unless you are a neophyte"?
"It sounds like a no brainer, but in practice it's not"?
This isn't yet another "unfair" Nvidia advantage argument is it? Cause as the years progress, it seems Nvidia is piling up the "unfair" advantages over it's competition. I can't see that as a bad thing for Nvidia customers. Sorry.

Well, let's see: voltage control has been on Nvidia cards since the 8800 GTX and before that. I'm 100% sure about the G80 series and every card since, but pretty sure it was there on the 7xxx and 6xxx series as well.

No voltage control on the $500 GTX 680 flagship. Yes, it's unfair - unfair for the enthusiast customers buying them who are no longer able to tweak the cards as they have in the past. It's not ideal on a card that is topped out when you can't give it a mV more to get it over the hump.

**Disclaimer** I like my GTX 680s; they are excellent cards and lay waste to Fermi in every way.

No need to get defensive. There are plenty of heads being scratched around the web among buyers of these cards who are realizing they can't adjust the voltage and are stuck. Over on the EVGA forums, Jacob has confirmed that all the cards are locked down to the 1.2V limit, even the custom non-reference Classified and Hydro Copper cards to come. :confused:

It could very well be the case that these cards are already up against the wall and more voltage will not push them over. They already perform amazingly well for the given die size. They could very well lack any headroom to get over that hump without insane modifications like adding extra power-phase boards and using liquid nitrogen to cool them.
 
May 13, 2009
12,333
612
126
What is the point of a non-reference GTX 680 if they are still voltage-locked? AFAIK the 680 stays plenty cool and quiet with the reference design. If the non-reference design doesn't add voltage unlocking, then it really isn't adding anything of value over the reference design.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
What is the point of a non-reference GTX 680 if they are still voltage-locked? AFAIK the 680 stays plenty cool and quiet with the reference design. If the non-reference design doesn't add voltage unlocking, then it really isn't adding anything of value over the reference design.

Crank the fan to +75% and tell me the reference card is still quiet. Some of the OCs I've read about run their fan profiles in the 90% range. I always manually crank the fan up to see how loud they are.

A GTX 680 fan at +75% is not quiet; I can see a custom cooler fixing that for those for whom noise is an issue.

EDIT: http://www.youtube.com/watch?v=gzTaR0s_30o
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
"Unless you are a neophyte"?
"It sounds like a no brainer, but in practice it's not"?
This isn't yet another "unfair" Nvidia advantage argument is it? Cause as the years progress, it seems Nvidia is piling up the "unfair" advantages over it's competition. I can't see that as a bad thing for Nvidia customers. Sorry.


I agree with you to a point. It is a feature out of the box, so it is an Nvidia advantage in some situations. I remember when Intel introduced Turbo Boost, and some people were saying it was unfair to bench an Intel processor vs. an AMD processor because the Intel processor got an 'overclock'. But that is a feature out of the box; that's how the chip works. Same thing here with Nvidia's new GPU.

But, with that being said, if the lack of voltage control is true, I think this is a big blow to Nvidia. No matter how much the Nvidia fanboys want to claim that no one overvolts enthusiast-level cards (a la the GTX 590 launch), this is a big negative check mark in my opinion. I did not buy my 7970 to run it at 925MHz. The GTX 680 obviously has headroom too, but if there is no ability to give it a bump in voltage, it will be relatively limited. As an enthusiast who likes to tinker as much as I like to game these days, this is a big negative to me. Certainly not to everyone - the card is fast as it is - but I like to squeeze as much free performance as I can out of a part... (well, that's not entirely true, my 7970 gets so ridiculously loud that I stopped going higher than 1.2GHz, I'll wait for aftermarket coolers to go higher). Just my $.02.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Well, let's see: voltage control has been on Nvidia cards since the 8800 GTX and before that. I'm 100% sure about the G80 series and every card since, but pretty sure it was there on the 7xxx and 6xxx series as well.


Not to go too far on a tangent, but I don't think the 6xxx and 7xxx series had software voltage control. I remember having to do pencil traces or buy a 'circuit writer' conductive pen (I still have mine in the package; I replaced my 7900 card before I ever got brave enough to try it. :) ) to up the voltage. I could be wrong here - maybe some models did have that ability - but from what I remember, to up the voltage you had to work on the board itself.

Anyway, back to the GTX680. :)
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
(well, that's not entirely true, my 7970 gets so ridiculously loud that I stopped going higher than 1.2GHz, I'll wait for aftermarket coolers to go higher). Just my $.02.

I'm in the same boat. After tinkering with my HD 7970, that fan gets insanely loud, haha. I haven't run into a game that requires that much more oomph yet, but the WoW beta is making me think twice.

However, someone spinning my comment about the GTX 680 fan being loud at +75% as some kind of viral marketing slam against the GTX 680? Because only people benchmarking overclock their cards and thus have to up the fan speed? Gotcha. Haha, watching him grasp at straws is HEE-LARRY-US!
 

moriz

Member
Mar 11, 2009
196
0
0
The reference Radeon HD 7970 has software voltage control. Pretty much all high-end reference Radeons have it.
 

Davste

Member
Jul 8, 2011
97
0
0
Considering one is actually overclocked, uses more power both overclocked and at default, has fewer features and less flexibility, and costs more money -- that doesn't really make them that close.

My beef is that the GTX 680's and HD 7970's overall performance gains on 28nm are more evolutionary and incremental, and that price/performance as a whole has suffered.

I'm sick of people saying this. It depends on where you live - it's SUBJECTIVE TO YOU.

Here, a GTX 680 costs €590.
A 7970, on the other hand, can be found for €560.

The prices are crazy for both cards, but the AMD is cheaper. At this stage, I have only one monitor, and really, the difference between these two cards is too tiny to consider.

What's more important to me is:
1. Silence - It's so hard to find dB-vs-dB comparisons between these cards. Does nobody really care? PowerColor or XFX 7970? EVGA or Palit 680? Which is quieter? From the information I have managed to scrape together, the XFX 7970 sounds like a jet engine while the EVGA 680 is relatively silent. But I want numbers, in dB. Also, either one of these cards is going to cost me big time, so there is no way I'm going to spend even another €50 on a custom cooler - let alone a €100 one.

2. Reliability - To be honest, I've had AMD for quite a while, and the drivers aren't as bad as people make them sound, although my video card crashes with a "display driver error" once every 2 or 3 months, and some games just won't work well at all - like Rage (which wasn't a very good game anyway, and is the only really unreliable example I can give).


All in all, silence matters the most. I hate having a graphics card that sounds like a jet while I'm playing. I have open earphones, which are great for determining enemy positions by sound - but not so great at soundproofing. I've had a 6850 PCS+ for a while, and to be honest, I can't hear it. But I want to be able to play Battlefield 3 smoothly, at full resolution, on ultra - and the 680 seems to be the beast that can cut it. I really hope it's not loud.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Prices are getting worse in Scandinavia -_-

GTX 680 all sold out... oh wait... one shop has some Zotac cards.

Let's check the price - certainly they wouldn't up the price just 'cause they're the only ones... would they? :O

[image: screenshot of the shop's Zotac GTX 680 price listing in DKK]


...which one of you will convert from Danish kroner (DKK) to USD first? ;)


Come on.
It's been like that for years, due to import taxes and regular taxes... not MSRP.

Ever since the... hmmm... Ti 4600, I have always just put a zero behind the price in $.

$200 = 2000 DKK
$500 = 5000 DKK

Even if the currency rate says $200 = 1,136.32 DKK and $500 = 2,840.80 DKK.

That is roughly 75% more than the exchange rate... and people from the US whine about prices... lol
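The premium is easy to sanity-check - a quick sketch, with the implied spot rate derived from the post's own figures:

```python
# Quick sanity check of the DKK street-price premium over the raw
# exchange rate, using the figures quoted in the post above.

DKK_PER_USD = 1136.32 / 200   # implied spot rate: about 5.68 DKK per USD

def premium(street_dkk: float, usd_price: float) -> float:
    """Percent premium of the local street price over a spot conversion."""
    spot_dkk = usd_price * DKK_PER_USD
    return (street_dkk / spot_dkk - 1.0) * 100.0

print(f"{premium(2000, 200):.0f}%")   # -> 76%
print(f"{premium(5000, 500):.0f}%")   # -> 76%
```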
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Come on.
It's been like that for years, due to import taxes and regular taxes... not MSRP.

Ever since the... hmmm... Ti 4600, I have always just put a zero behind the price in $.

$200 = 2000 DKK
$500 = 5000 DKK

Even if the currency rate says $200 = 1,136.32 DKK and $500 = 2,840.80 DKK.

That is roughly 75% more than the exchange rate... and people from the US whine about prices... lol

Woof, glad I live in 'Murica. First world problems.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Same clocks? Who cares, they are completely different architectures.

exactly

the better "equalization" comparison would be tweaking each so that they have the same power draw at load, but even that is silly as it would give the 680 a huge advantage because the 7970 would be neutered of its potential

but yeah, equalizing clock rates is completely silly, just have to look at Netburst
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
I'll be waiting for the next review that equalizes power draw, fan noise, price, ROPs, case stickers, and any other arbitrary metric.