GeForce GTX 680 Classified power consumption with overvoltage revealed, shocking.


RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
This power consumption thing really doesn't matter because people willing to do voltage adjustment know that they will draw more power.

:thumbsup:

I think blackened23's point has more to do with the fact that people who criticized power consumption on a competing product aren't bringing this up when an overvolted 680 is nearly as bad. I think it's more about him pointing out this double standard.
 
Last edited:

MrMuppet

Senior member
Jun 26, 2012
474
0
0
1. For starters, the reason I have no idea why you are even talking about HD7970 GE voltage...

2. Already linked 2 such reviews... GTX680 can't win. Also, why are you ignoring...
1. Because it drew 40W more @ stock in the chart in the OP than the overclocked 680C and the OP was arguing about performance per watt. Even the stock 7970 vanilla drew ever so slightly more than the overclocked 680C.

2. How is that relevant to anything I've written or performance per watt?

What on earth is your point in quoting me and writing all this? In my first post in this thread I made it quite clear:
"Kepler gives much better performance per Watt and generated heat at this point. Only you can determine whether that's important to you."

I don't care whether you or anyone else buys AMD or Nvidia, do you? If not, what's with the sales pitch?

(However I do prefer if people's decisions are informed, regardless of the outcome. And an honest matter-of-fact debate.)
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I have good news for you cmdrdredd:

Today, 10:13 | posts: 9,666 | Location: Taganrog, Russia
Some update and good news:

I decided to shift my vacation a bit and my Lightning finally arrived yesterday. Happily I managed to bypass the main problem with new I2C bus access on Keplers in less than two hours after installing the card. So now all 3 Lightning voltage controllers (core, memory and PLL) and thermal controllers are visible to software and programmable. There are still some things left to do; right now core voltage control disables dynamic core voltage adjustment, so changing voltage results in setting maximum fixed voltage in idle as well. But anyway, even now it is better than the "voltage control" offered on any 680 card by other vendors. I hope that I'll be able to implement alternate voltage control in offset form for dynamic Kepler voltage control.

Alexey Nicolaychuk aka Unwinder, RivaTuner creator

The GTX 680 Lightning is getting overvoltage control...via software, at no additional cost.
 
Last edited:

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
:thumbsup:

I think blackened23's point has more to do with the fact that people who criticized power consumption on a competing product aren't bringing this up when an overvolted 680 is nearly as bad. I think it's more about him pointing out this double standard.

Yeah maybe, but really we weren't talking about forcing voltage through the GPU, were we? I thought people said the GTX680/670 was much more efficient and less power hungry than the 7970 at stock settings? Either way, everyone knows that when you start raising voltage you increase power consumption.

It's great that Unwinder managed to get voltage adjustment working for MSI cards. Last I heard he went on vacation, the card from MSI hadn't arrived, and he was not very optimistic about getting it working. Cool, another reason to recommend the MSI cards (if someone is really interested in overclocking). I just hope he can get it to work in offset mode so you can scale back at idle.
 
Last edited:

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I looked at it as: an overclocked and sometimes overvolted HD 7970 offered similar performance to a GTX 680 or an overclocked GTX 680. The watt difference was significant.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Another thing: Nvidia says software overvoltage control is forbidden. What will happen to MSI if they release Afterburner with software voltage control thanks to Unwinder? Will they be dropped as a partner or something? If Nvidia did that, they would piss off a lot of people, and MSI is a pretty big company.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Another thing: Nvidia says software overvoltage control is forbidden. What will happen to MSI if they release Afterburner with software voltage control thanks to Unwinder? Will they be dropped as a partner or something? If Nvidia did that, they would piss off a lot of people, and MSI is a pretty big company.

Pretty sure there is no source indicating that is the case except for EVGA employees. MSI has had a good relationship with Nvidia for a long time; they are one of their biggest resellers in Asia and Europe. MSI in Europe/Asia is like EVGA in the States: one of their biggest and most respected AIB partners.

I think it's a bunch of BS. So you can't do it via software, but EVGA can sell EVBots for $99 and then it's perfectly okay? Whatever. IMO it's EVGA damage control because they want to sell their stupid EVBot garbage.
 
Last edited:

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Yeah, that's what I thought too. I understand the idea of Nvidia not wanting people to use the reference 670 PCB for overvolted and overclocked cards. It's a piece of crap lol. However, not allowing software overvolting at all? I'd think Nvidia would just tell the partner that they assume any liability from this and will be responsible for honoring the warranty etc. You can't hold Nvidia responsible if someone presses 1.6V in by accident and the GPU pops.

Really, now that I think about it, EVGA could probably have asked Unwinder to add it to PrecisionX, but they offer PrecisionX for free and they can sell the EVBot for $100 a pop.
 

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
234
106
In my first post in this thread I made it quite clear:
"Kepler gives much better performance per Watt and generated heat at this point. Only you can determine whether that's important to you."
You are largely exaggerating the performance-per-watt numbers, sir. Only 4%, according to this chart. Consider also the fact that the 7970 carries an extra gig of video RAM. And if undervolted, the 7970 will easily beat the 680 in this department.



 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I looked at it as: an overclocked and sometimes overvolted HD 7970 offered similar performance to a GTX 680 or an overclocked GTX 680. The watt difference was significant.

How did you conclude that? I haven't seen a single review that overclocked an HD7970 on stock voltage to 1.15GHz+ and did power measurements under those circumstances.

What reviewers tend to do is take a stock HD7970 and test that. Then they crank up the volts to 1.25-1.3V and go for max overclocks. Some of those 7970 cards can only reach 1175MHz, for example, but the reviewer put 1.25V into the card to do it. Essentially the power consumption measurements from such a review are not very relevant. Who is going to use 1.25V over stock for another 25MHz of overclock and 50W of extra power consumption?

Under 1.175V, my 1.15GHz 7970 uses 192W of power in HWiNFO64. If anyone has a GTX680 overclocked to 1.25GHz or so, they should post their power consumption in HWiNFO64 to compare. Maybe blackened23 can do it with his overclocked MSI Lightning 680 :)

I earlier noted that going from 1.175V to 1.256V, the 7970's power grows by 42-43W without any change to the clock speeds.
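A quick back-of-envelope sketch in Python of why that jump is bigger than the usual rule of thumb predicts (the 192W reading and the voltages are my figures above; the V^2 scaling is just the standard dynamic-power approximation, not a measured model):

```python
# Dynamic power scales roughly with V^2 at fixed clocks (P ~ C * f * V^2).
# The V^2 term alone explains only part of the measured 42-43W increase;
# leakage, which also rises with voltage and temperature, plausibly
# accounts for the rest.
base_power = 192.0        # W at 1.175 V (HWiNFO64 reading, my card)
v_old, v_new = 1.175, 1.256

scaled = base_power * (v_new / v_old) ** 2
print(f"V^2-only estimate: {scaled:.0f} W (+{scaled - base_power:.0f} W)")
# prints "V^2-only estimate: 219 W (+27 W)"
```

So V^2 alone predicts about +27W; the observed +42-43W suggests roughly 15W more coming from leakage.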

Also, what about the price difference between the 7970 and 680? Are those 40-50W of extra power worth spending $220+ more for the EVGA 680 + EVBot over, say, an MSI Lightning 7970? Why is this card so overpriced compared to the MSI GTX680 Lightning? The way I see it, the GTX670 is the most balanced card this generation @ $400, while the HD7970 gives full voltage control with overclocking for $450.

To get voltage control with an OK cooler, EVGA is charging $740. A $740 supposedly top of the line 680 which can't even beat a handful of aftermarket overclocked $450 7970s. That's almost a $300 price difference. For this much $, you can almost get 2x Giga GTX670s!

If you read the details of how EVBot works, it's sub-par; it appears to require redialling settings every time you reboot. In the context of 7970s and other 680s, the EVGA Classified 680 is a rip-off/hack job:

"if the card is fully powered down you’ll need to reset the desired voltages the next time the card is powered up. For anyone intending to use an overvolted card on a regular basis, this means you’ll need to keep an EVBot plugged in at all times so that you can reset the voltages."

Performance/watt, GTX670/680 cards win, with GTX670 OC after market cards being the standouts to me. However, when already discussing 300-350W power consumption of total systems, another 40-50W of power is irrelevant imo. Other things such as performance in the games you play, feature preferences, warranty, card/game bundles are more important imo.

And none of the modern cards consumes as much power as the GTX580:
[chart: Power.png]


The bottom line is the most expensive single-GPU card should be hands down the fastest and have a great cooler to boot. The Classified is 0/2 in both of those metrics.
 
Last edited:

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
You are largely exaggerating the performance-per-watt numbers, sir. Only 4%, according to this chart. Consider also the fact that the 7970 carries an extra gig of video RAM. And if undervolted, the 7970 will easily beat the 680 in this department.




Interesting!

Load power on the other hand looks very good. In fact it’s much better than we were expecting. Despite the additional memory chips and the factory overclock, under Metro power consumption only rises 10W at the wall. This is still less than the 7970, let alone the 7970GE.

http://www.anandtech.com/show/6096/evga-geforce-gtx-680-classified-review/7
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
To achieve GTX 680 performance by overclocking an HD 7970, the watt difference between a GTX 680 and an OC HD 7970 is significant.

Again you provided no evidence to support that claim. You keep linking AT's review where there is 57W difference between the 7970GE and the SC card. I linked TechSpot/LegionHardware review where a HD7970 GE with a conservative overvolted bios from the factory draws just 34W more than a system with a GTX680 (320W vs. 354W). However, you can overclock all HD7970 cards to 1050mhz on stock 1.175V, which would narrow the power consumption difference even more. So the 57W is almost irrelevant since HD7970 users won't be running their 1050mhz cards at 1.2-1.25V. That's not real world usage, especially since no 7970GE cards are even for sale.

TechReport has GTX680 system at 286W vs. 302W for the 7970 GE.

I guess if you think 35-40W extra power consumption on a system that draws ~ 300W to begin with is significant, then sure.

To me significant is like HD7850 OC with same performance as the GTX480 and drawing 160W less.

Interesting how after Fermi, 40-50W of power consumption is significant now but a $660 vs. $450 price for similar performance in overclocked states is conveniently omitted. Or is performance/watt suddenly the most important metric for $450+ GPUs?
 
Last edited:

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
Agreed. The apples to apples chart is not shocking.

I keep hearing about unnecessary voltage and/or too-low factory clock speeds being implemented.
Why assume Tahiti can be run more efficiently when it's not implemented by AMD? Maybe the AMD engineers know something?

No one can recommend this EVGA model at this price, but OC/overvolted power consumption isn't that surprising.
[chart: 48489.png]
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Notty, first, it's quite ridiculous to say that overvoltage is unnecessary. Voltage has been the bread and butter of overclocking for a long time, and it does give tangible performance increases if the architecture allows it. Unfortunately, the GTX 680 Classified does not have a special BIOS to disable thermal throttling, so anytime the card goes above 70C the clock speeds start rapidly going down. So the performance increase from overvoltage isn't fully realized with that card, but the Lightning should do much better in that respect. Note that the 680 Classified often hits > 80C while OC+OV, so it is hitting thermal thresholds and downclocking.

I assume you know this, but the chart you linked shows an overvolted 7970 compared to a stock-voltage GTX 680. Comparing apples to apples at stock, the 7970 is using 25-30W more, while at OC+OV settings the Kepler uses more than the OC+OV 7970. You're talking about people who spend $500-600 on cards and oftentimes combine them for SLI and CrossFire - these people do not care about efficiency, they want performance, and with that they want voltage.

Now with that said, my entire point of this thread was to point out that the efficiency everyone believed in with Kepler was mostly a side effect of the voltage lock - now that it is gone, we see that Kepler is quite the power guzzler. The side effects of power consumption are noise and heat, and we see that Kepler really fares no better than the 7970 in terms of thermal output, and oftentimes exceeds it - however, the 680 does have a much better reference cooler than the 7970. The 7970GE has a horrible reference cooler which, as toyota likes to call it, is a torture chamber - so this is something AMD should fix with their next outing.

In closing, I think the Kepler throttle and voltage lock are a bunch of rubbish. I like overclocking; I like doing what I please with my $500 purchase. While I love my 680 cards, I'm not really happy with how Nvidia handled OC/OV protection on the 680 -- I'm hoping this doesn't happen with GK110.
 
Last edited:

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
I don't understand why it throttles at 70C to begin with. What am I missing that makes it necessary for a GTX 680 or 670 to throttle at 70C?
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Again you provided no evidence to support that claim. You keep linking AT's review where there is 57W difference between the 7970GE and the SC card. I linked TechSpot/LegionHardware review where a HD7970 GE with a conservative overvolted bios from the factory draws just 34W more than a system with a GTX680 (320W vs. 354W). However, you can overclock all HD7970 cards to 1050mhz on stock 1.175V, which would narrow the power consumption difference even more. So the 57W is almost irrelevant since HD7970 users won't be running their 1050mhz cards at 1.2-1.25V. That's not real world usage, especially since no 7970GE cards are even for sale.

TechReport has GTX680 system at 286W vs. 302W for the 7970 GE.

I guess if you think 35-40W extra power consumption on a system that draws ~ 300W to begin with is significant, then sure.

To me significant is like HD7850 OC with same performance as the GTX480 and drawing 160W less.

Interesting how after Fermi, 40-50W of power consumption is significant now but a $660 vs. $450 price for similar performance in overclocked states is conveniently omitted. Or is performance/watt suddenly the most important metric for $450+ GPUs?


Interesting, maybe you missed this -- my context is HD 7970.

http://www.computerbase.de/artikel/grafikkarten/2012/test-amd-radeon-hd-7970-ghz-edition/12/
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
This is a really terrible card due to its price. Massive ripoff and the need for that EVBOT seals the deal.

Fail on EVGA's part. Kepler/GTX 680 is just not an ideal card for traditional voltage and overclocking. Even if you manage to get the core up high, you cannot get the memory fast enough to give the core the bandwidth it needs.

Hopefully GK110 does not continue the failure of not being able to be traditionally overclocked with software voltage controls and does not keep forcing GPU boost on us.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I don't understand why it throttles at 70C to begin with. What am I missing that makes it necessary for a GTX 680 or 670 to throttle at 70C?

Nvidia protecting us from ourselves, that's the only thing I can think of. This is fine for newbies, but for overclockers it is horrible.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Notty, first, it's quite ridiculous to say that overvoltage is unnecessary. Voltage has been the bread and butter of overclocking for a long time, and it does give tangible performance increases if the architecture allows it. Unfortunately, the GTX 680 Classified does not have a special BIOS to disable thermal throttling, so anytime the card goes above 70C the clock speeds start rapidly going down. So the performance increase from overvoltage isn't fully realized with that card, but the Lightning should do much better in that respect. Note that the 680 Classified often hits > 80C while OC+OV, so it is hitting thermal thresholds and downclocking.

I assume you know this, but the chart you linked shows an overvolted 7970 compared to a stock-voltage GTX 680. Comparing apples to apples at stock, the 7970 is using 25-30W more, while at OC+OV settings the Kepler uses more than the OC+OV 7970. You're talking about people who spend $500-600 on cards and oftentimes combine them for SLI and CrossFire - these people do not care about efficiency, they want performance, and with that they want voltage.

Now with that said, my entire point of this thread was to point out that the efficiency everyone believed in with Kepler was mostly a side effect of the voltage lock - now that it is gone, we see that Kepler is quite the power guzzler. The side effects of power consumption are noise and heat, and we see that Kepler really fares no better than the 7970 in terms of thermal output, and oftentimes exceeds it - however, the 680 does have a much better reference cooler than the 7970. The 7970GE has a horrible reference cooler which, as toyota likes to call it, is a torture chamber - so this is something AMD should fix with their next outing.

In closing, I think the Kepler throttle and voltage lock are a bunch of rubbish. I like overclocking; I like doing what I please with my $500 purchase. While I love my 680 cards, I'm not really happy with how Nvidia handled OC/OV protection on the 680 -- I'm hoping this doesn't happen with GK110.

The HD 7970 GHz Edition cores may be more efficient. It's the first thing I investigated.


http://www.computerbase.de/artikel/grafikkarten/2012/test-amd-radeon-hd-7970-ghz-edition/12/
 

MrMuppet

Senior member
Jun 26, 2012
474
0
0
You are largely exaggerating the performance-per-watt numbers, sir. Only 4%, according to this chart. Consider also the fact that the 7970 carries an extra gig of video RAM. And if undervolted, the 7970 will easily beat the 680 in this department.

http://img39.imageshack.us/img39/44/perfwatt.gif
What's up with people, including you, my good sir, not linking to the sources for their charts so others can scrutinize them? I didn't come for nor anticipate these "debates," and it's pretty tiring and futile to defend common-sense positions as it is, but whatever.

First of all, and most importantly, that chart assumes the GTX 680 2GB draws 3W more than the HD 7970 3GB (TPU calls it "typical gaming power consumption"); see this chart here (source). Are you happy with that figure? The stock GTX 680 drawing more than the stock HD 7970, that is?

In Metro 2033 (which pretty much everyone else uses, including AT) the 7970 draws ~30W more than the 680. Going by AT's numbers presented in the chart in the OP, the performance per watt advantage would increase to:

1.00 / (0.96 * 163W / (163W + 3W + 29W)) = 1.00 / (0.96 * 163W / 195W) = 1.2461...

As per that calculation, the GTX 680 provides 24.6% more performance per watt than the HD 7970. In my humble opinion, you and TPU are the ones understating the performance-per-watt difference. And for me it actually matters (the heat produced, not the electricity bill) because my closet has virtually no ventilation.
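For anyone who wants to check my arithmetic, here's the same calculation as a tiny Python sketch (all figures are the ones quoted above from the AT/TPU charts; nothing new is assumed):

```python
# Reproduce the performance-per-watt calculation: 163 W baseline for the
# GTX 680, +3 W per TPU's chart, +29 W delta under Metro 2033, and 0.96
# as the HD 7970's performance relative to the GTX 680.
gtx680_power = 163.0             # W, GTX 680 under load
hd7970_power = 163.0 + 3 + 29    # W, HD 7970 with the Metro 2033 numbers
hd7970_rel_perf = 0.96           # HD 7970 performance relative to GTX 680

# perf/W of the 680 relative to the 7970
ratio = 1.0 / (hd7970_rel_perf * gtx680_power / hd7970_power)
print(f"GTX 680 perf/W advantage: {(ratio - 1) * 100:.1f}%")
# prints "GTX 680 perf/W advantage: 24.6%"
```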


Less importantly, even the GTX 670 (74W less) and the HD 7970 are trading blows performance-wise (each being better in different games*), so it really comes down to game preferences and resolutions. The typical scenario for single cards is 1080p. You don't need more than 2GB for that today (3GB is good for multi-monitor setups, and 1.5GB would be danger close, so the next logical step at 384-bit is 3GB). Using the performance numbers at 1200p, the advantage rises to almost 26% in favor of the GTX 680.

* Blog time: I've actually put my money where my mouth is and bought both (for about the same price). I'm leaning towards the GTX 670 even though it will be more expensive than the 7970 Dual-X since I'll have to buy a cooler for it to reduce the noise. A 7970 reference would've been cheaper than the GTX 670 reference, but the Accelero Xtreme 7970 is twice as expensive as the Accelero Twin Turbo II.

[image: 670and7970smaller.jpg]


Sir, can I please go do something more worthwhile now? :)

edit: It appears my pic got resized anyway, sigh.
edit2: Accidentally used 166W (instead of 163W) as the base originally. Since I've already edited twice, I might as well fix that pic.

edit3: Forgot to address your last statement. Either provide support for statements like that or don't make them. Moreover, if you undervolt the one you have to undervolt the other and it wouldn't invalidate comparisons at stock anyway.
 
Last edited:

Ryan Smith

The New Boss
Staff member
Oct 22, 2005
537
117
116
www.anandtech.com
I'd love to see AT do a review on it [680 Lightning] once the next Afterburner is released (2.2.3 is adding support for the Lightning, but it is not out yet)
I don't have the lightning, but I do have the GTX 670 Power Edition. That will be our next review.:) Though unfortunately it's not looking like voltage control will be ready in time, and it's still nebulous at best whether NVIDIA is going to let MSI do software voltage control.

I don't understand why it throttles at 70C to begin with. What am I missing that makes it necessary for a GTX 680 or 670 to throttle at 70C?
Leakage increases with temperature, i.e. as temperature rises, so does power consumption. So NVIDIA reduces clockspeeds by one bin (13MHz) to offset that.
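To illustrate, here's a toy sketch of that one-bin offset (the 70C threshold and 13MHz bin size are from the description above; the function itself is purely illustrative, not NVIDIA's actual algorithm):

```python
# Illustrative model of dropping boost clock by whole 13 MHz bins once
# the GPU crosses the 70C threshold.
BIN_MHZ = 13        # size of one clock bin
TEMP_THRESHOLD = 70 # degrees C

def effective_clock(boost_clock_mhz: int, temp_c: float, bins_dropped: int = 1) -> int:
    """Return the clock after dropping `bins_dropped` bins above the threshold."""
    if temp_c > TEMP_THRESHOLD:
        return boost_clock_mhz - bins_dropped * BIN_MHZ
    return boost_clock_mhz

print(effective_clock(1110, 65))  # below threshold: 1110
print(effective_clock(1110, 75))  # one bin down: 1097
```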
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126

Sorry, but did you actually read my posts in detail?

I just told you twice now that HD7970 can do 1050mhz on stock voltage and as a response you linked me a review where the testers took a reference HD7970 and manually pumped the core voltage to 1.256 volts. It says it right there in the review.

1) Increasing voltage from 1.175V to 1.256V increases power on the 7970 by 40-42W without any increase in clock speed on my card. Why would any HD7970 owner do this if they just want to run the card at 1050MHz? Even getting to 1150MHz can often be done with 1.175V (1.2V in MSI Afterburner due to Vdroop on some cards).

2) What is the point of discussing the jet engine "torture chamber" HD7970 GE cooler when plenty of $450 after-market 7970s are for sale with great coolers?

And for me it actually matters (the heat produced, not the electricity bill) because my closet has virtually no ventilation.

The cooler on the Dual-X is very good even at 1150MHz clocks, never mind stock 950MHz. If you are not going to overclock your card, your stock Dual-X 7970 should run about 15-20°C cooler than the reference GTX670, and way quieter than either the reference 670 or the even louder Palit JetStream cards. Did you actually open up the 7970 card to test this in your closet? I don't understand the logic of getting a GTX670 and then buying an aftermarket cooler to make it quieter, making it more expensive than the already quiet and overclockable Dual-X card you bought. Not to mention it takes a GTX680 @ 1290MHz to match an HD7970 @ 1165MHz.

[image: sapphiredualxhd7970115g.jpg]


This is a really terrible card due to its price. Massive ripoff and the need for that EVBOT seals the deal. Hopefully GK110 does not continue the failure of not being able to be traditionally overclocked with software voltage controls and does not keep forcing GPU boost on us.

Agree. I think GPU Boost is here to stay though. Both AMD and NV are using it now. Hopefully NV can figure out a way to keep the GPU Boost but allow manual voltage control and the ability to keep GPU clocks fixed (say in manual/advanced mode). Part of me thinks they did this because their reference 670/680 cards are built to meet the bare minimum spec (4 VRM power phases / dinky GTX670 reference PCB). I think they built their cards a lot cheaper than GTX480/580 and probably were afraid of blown VRMs/power circuitry that was common with OCed 590s.

The reviews on the Egg for the Classified are pretty bad. A lot of users can't even get past 1300MHz on this $660 card. Considering I've seen plenty of $500-525 GTX680s hit 1240-1260MHz on air, this card getting 1270-1280MHz for $660 is a dud. $80 more to unlock voltage on top of that is just an insulting money grab, especially when the stock cooler lets the card exceed 70°C at that point, which throttles GPU Boost and diminishes most of the value of the overvolt.
 
Last edited: