GeForce GTX 680 Classified power consumption with overvoltage revealed, shocking.


SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
+1. That was exactly my first thought when reading this thread and seeing how much power an OC/OV 680 pulls. Nvidia's electrical engineering isn't that good at the best of times, but the GK104 reference designs are pretty under-engineered, and the voltage lock seems essential to stop those cards from destroying themselves. Plenty of margin on those cards, though, so they must be happy.

Imagine an under-engineered product winning so many metrics at launch -- performance/dollar -- performance/watt -- performance/mm² -- performance leadership with single- and dual-GPU SKUs -- while bringing innovation and new features to the consumer.

Seems to me it was engineering prowess that created such balance and won so many metrics.
 

Ryan Smith

The New Boss
Staff member
Oct 22, 2005
537
117
116
www.anandtech.com
Partners wishing to have a card with a base power target over 195W must use a custom PCB with suitable power circuitry. NVIDIA won’t allow partners to ship higher-power cards using the reference PCB.
I've edited that statement just a hair; I didn't realize quite how literally you guys would take it. Partners need a custom board; changing the VRMs would make it a non-reference board (though I don't know how satisfied NV would be with that).
 

The_Golden_Man

Senior member
Apr 7, 2012
816
1
0
I've edited that statement just a hair; I didn't realize quite how literally you guys would take it. Partners need a custom board; changing the VRMs would make it a non-reference board (though I don't know how satisfied NV would be with that).

I think I misread that statement. I thought Nvidia didn't allow board partners to change things on reference PCBs, and that they instead had to make their own custom PCBs if they were adding extra VRMs and such.

However, I now see that it has to do with the TDP: Nvidia isn't allowing board partners to use the clean reference PCB with the stock VRMs if the TDP goes over 195W.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I agree. I have no interest in pushing higher voltages on my two ASUS GTX 670 DirectCU II cards. The upper card is already reaching 70C with a custom fan profile in the most demanding games.

Voltage adjustments will only be suitable for watercooled cards.

Good for you bro. Just buy a hot, loud reference card and be done with it. Oh and have fun with thermal throttling on the reference board as well if you try to adjust clocks at all.

I prefer overclocking and having the option to tweak; I do this with my CPU and with my GPU as well. In fact, I enjoy overclocking -- this is something enthusiasts love to do. You can stop deflecting from the stupid decision-making by various companies by pretending that nvidia is doing this as a favor to us all... This stuff [the voltage lock] is fine for reference boards. For aftermarket boards it is, quite simply, stupid. Aftermarket boards are perfectly capable of handling the additional voltage.

Unwinder has voltage control working on the Lightning, and I am looking forward to using it without spending an arm and a leg on EVGA's crap EVBot. Please don't bother repeating the lines from EVGA PR / damage-control representatives; I do not buy their line that software voltage control is the spawn of satan but "HEY, GIVE US $99 FOR THE EVBOT AND IT'S OKAY!!". Give me a break. It's the same end result by different means; they clearly want to sell more accessories, IMHO. They pulled the same stunt with the EVGA 580 Classified, which also had an EVBot requirement, and ran a similar PR campaign to hide the fact that they wanted people buying their stupid EVBot. [Note that they did remove the 580 Classified's EVBot requirement after complaints.] Excuse me if I take anything their reps say with a grain of salt; I have better things to do than listen to salesmen pretending to do us a favor while actually trying to fill their coffers with EVBot sales.
 
Last edited:

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
GPU Boost is very welcome, but so would be some voltage-adjustment flexibility for enthusiast-class SKUs. It's quite odd not seeing this flexibility for overclocking enthusiasts overall.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
Imagine an under-engineered product winning so many metrics at launch -- performance/dollar -- performance/watt -- performance/mm² -- performance leadership with single- and dual-GPU SKUs -- while bringing innovation and new features to the consumer.

Seems to me it was engineering prowess that created such balance and won so many metrics.

I don't think anyone really doubts NV's engineering prowess :)

But AMD's engineers must have felt really silly when they saw what NV did with GPU Boost,
so they implemented their own boost in the GHz Edition.

However, AMD's PowerTune lacks real-time data; it relies on empirical data and preconfigured power states, so it cannot be fine-tuned the way NV's GPU Boost can.
In the end, the results were mixed.
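To illustrate the contrast, here's a toy Python sketch. The state table and step logic are simplified stand-ins of my own, though the 1006MHz base clock, 13MHz boost bins, and ~170W power target do match the GTX 680's published defaults:

```python
# Toy contrast between the two schemes (illustrative; not vendor firmware).

def powertune_clock(workload, states={"typical": 925, "heavy": 850, "furmark": 740}):
    """PowerTune-style: pick a preconfigured state from offline/empirical
    characterization of the workload; no live board-power measurement."""
    return states.get(workload, states["typical"])

def gpu_boost_clock(measured_watts, clock=1006, step=13, target_watts=170):
    """GPU Boost-style: read real-time power telemetry each tick and step
    the clock toward the power target one bin at a time."""
    if measured_watts < target_watts:
        return clock + step   # headroom left: boost one bin
    if measured_watts > target_watts:
        return clock - step   # over target: back off one bin
    return clock

print(powertune_clock("heavy"))   # 850  -- fixed, regardless of actual draw
print(gpu_boost_clock(150))       # 1019 -- boosts: 150W is under the 170W target
print(gpu_boost_clock(180))       # 993  -- throttles: 180W is over the target
```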

I really do feel that GPU Boost was the key to Nvidia's, dare I say, victory.

Q2 has passed, the whole lineup sans the 7990 is out,
and AMD can't make a buck against Nvidia's 40nm oldies and GK104, so that's a defeat by any standard.


IMHO the OC/OV part of this review is pretty much proof that NV has executed their boost superbly and made traditional overclocking largely obsolete.

Slight digression:
When I see people writing "over-engineered", I translate it as "crude", "unbalanced", and "wasteful".
The 570/580 had a larger bill of materials for power circuitry than the 6970/6950, yet no one called those cards over-engineered.
And the 570 wasn't exactly considered good at taking a beating, amirite?

I am pretty sure that, thanks to the new power scheme, Nvidia has fewer blown and RMA'd cards compared to the comparatively over-engineered Fermi.

So my conclusion is that GPU Boost is here to stay :hmm:
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
GPU Boost is very welcome, but so would be some voltage-adjustment flexibility for enthusiast-class SKUs. It's quite odd not seeing this flexibility for overclocking enthusiasts overall.

Exactly. GPU Boost with thermal/OCP throttling is fine on reference cards, since they're not nearly as capable in terms of power delivery and cooling capacity. However, boards like the ASUS DC2, the Lightning, etc. are all perfectly capable of handling this [additional power/voltage], and the board manufacturer can set acceptable voltage limits.

I can understand a voltage lock on reference cards, but as I said, it is ridiculous to apply that same methodology to aftermarket cards.
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
I read the article, but there is zero info about this anywhere else I've looked; I searched for 2 hours last night.

The EVGA forums have employees claiming MSI is cheating, and this article carries specific info that nobody else is reporting. I think EVGA is feeding everyone a load of BS.

Supposedly there's a chip on the GTX670/680 that doesn't allow access to voltage adjustments, and I hear MSI changed that chip. We will just have to wait and see, but honestly, when your card throttles at 70C anyway, there is absolutely no point in raising the voltage and overclocking beyond what you can achieve at stock.

Yeah, that's the Richtek RT8802A, the voltage controller all reference 680s use, along with the MSI 670 PE. It offers no software control; voltage is set directly in hardware. The Lightning, Galaxy SOC, and Asus DC II use a CHiL 8318, which offers full software control and monitoring. I'm hoping Afterburner enables something for my DC II too; even though I can already overvolt to anything I want, software is just easier.
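In other words, the difference looks roughly like this (a Python sketch; the bus address, register, and VID formula below are hypothetical placeholders for illustration, not values from the Richtek or CHiL datasheets):

```python
# Illustrative only: the I2C address, register, and VID encoding are
# hypothetical, not taken from the RT8802A or CHiL 8318 datasheets.

class HardwareStrappedVRM:
    """RT8802A-style: Vcore is fixed by resistor straps on the PCB,
    so software has no interface to change it."""
    def set_vcore(self, volts: float) -> None:
        raise NotImplementedError("voltage is set by PCB straps, not software")

class ProgrammableVRM:
    """CHiL 8318-style: a bus-programmable controller, so tools like
    Afterburner can rewrite the voltage code at runtime."""
    def __init__(self, i2c_write, address: int = 0x40):  # address: hypothetical
        self.i2c_write = i2c_write
        self.address = address

    def set_vcore(self, volts: float) -> None:
        vid = round((volts - 0.800) / 0.00625)   # hypothetical VID encoding
        self.i2c_write(self.address, 0x21, vid)  # 0x21: hypothetical VID register
```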
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I really do feel that GPU Boost was the key to Nvidia's, dare I say, victory. Q2 has passed, the whole lineup sans the 7990 is out,
and AMD can't make a buck against Nvidia's 40nm oldies and GK104, so that's a defeat by any standard.

NV's GPU Boost is more advanced than AMD's GHz Edition boost IMO, but I think it had almost nothing to do with NV's profitability. NV has simply been the better-run business for a decade, for a lot of other reasons I won't get into (though I'll touch on some of them).

Get this: AMD's GPU business had an operating income of $31 million this quarter, while the entire company's operating income was $77 million. You can just imagine how the GPU business is being starved of resources by the struggling CPU business. Management has to allocate resources across the whole company, which, unlike NV, isn't focused ~90% on graphics.

Considering how much money ATI was making before the buyout, it's clear AMD's problems are not about their GPU specifications but about mismanagement. The lineup is as competitive as ever; the factors in play are things like contract terms, supplier costs, supply chain, and poor marketing.

If you look at the HD7000 series, it's firing on all cylinders, no different from the 9700 Pro/X800/X1800 days when ATI was as fast as or faster than NV. Back then, ATI made millions of dollars and was worth $3.5-4 billion on its own, even before the 30% buyout premium. Hardware-wise, NV didn't kill AMD this round, not even close; they actually have nothing worth buying against the HD7750/7770/7850/7870 or the $300 7950 on the desktop. :hmm: And if you want to get into specifics, the GTX670/680/7970 GPUs barely matter for profitability, since they are low-volume parts. NV is killing AMD in the highly profitable professional markets (Tesla and Quadro), where they sell the same GK104/GF1xx chips for $2-5K, and they don't carry $2 billion in debt on which AMD pays interest.

In other words, for every $1 NV makes, it can reinvest in R&D, pay higher wages to attract top employees, budget more for marketing on GPUs alone, etc.

You also aren't taking into account the 300+ laptop design wins for Kepler with low-end GPUs such as the GT630/640M, etc. Those sell in the millions. We'll have to dig into the financial data once NV announces results, but I have a feeling they did really well on the mobile discrete GPU side. Out of the discrete GPU stack, only around 4% of sales come from GPUs costing more than $350. So really, those top cards are used more to market the lower-end products: once the average consumer hears that Kepler is a very advanced and power-efficient GPU, they want a laptop with Kepler in it. The real money is not in GTX670/680s or HD7970s.

Also, you are forgetting that AMD's GPUs are not just good for games now. It's like buying three cards in one (games / $ on the side / GPGPU compute). The problem is that desktop apps still don't take advantage of compute for the most part, so in essence it's a gamble that they will in the future; otherwise it's a lot of wasted transistor space.
 
Last edited:

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I really agree with your sentiment. Having owned 7970s previously, I really did enjoy them; I thought, and still think, it is a great card. With unlocked voltage and a hefty overclock it can easily rock the 680 in certain benchmarks. But in hindsight, if I'm completely honest, AMD made a lot of critical errors early on that really hurt them and are still hurting them. First, as you said, the company is woefully mismanaged and doesn't have the cash reserves it had in prior years. Second, while I didn't have any issues with drivers, it was mind-boggling reading stories from users who were having odd issues with recovering from sleep and various other things.

I think the critical difference this generation is that nvidia really put a lot of effort into the little software touches that make a GPU a good experience, and AMD didn't quite match that. All that said, the hardware is very good, and from what I understand the software issues are now fixed.

It really is frustrating; the lack of voltage control on the 680s is very, very annoying. So while the 680 has won this round, I do hope that AMD steps it up next time and that nvidia doesn't implement this stupid voltage limitation on GK110.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
@RussianSensation

Imagine Kepler at a flat 1100MHz, which is pretty much the maximum NV could have shipped, considering some samples won't go over 1100MHz and they were/are already wafer-constrained.

Such a GTX 680 would easily perform 15% lower than reviewed.
It would be even slower than a 670.

Of course GPU Boost had almost nothing to do with AMD's (bad) business,
but it was absolutely crucial to the press coverage, and consequently the mindshare win, this gen.

The Kepler architecture was the foundation, but Boost was the little touch that forced AMD to drop prices TWICE.

Oh, and you guys are wrong if you think the high end is an absolute niche within the discrete offering.
$200 is the bread and butter, but the $400-500 market is nothing to laugh at.
Same with the claim that the pro business is THAT profitable: R&D, for one thing, is funded entirely by the GeForce business.
 
Last edited:

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
@RussianSensation

Imagine Kepler at a flat 1100MHz, which is pretty much the maximum NV could have shipped, considering some samples won't go over 1100MHz and they were/are already wafer-constrained.

Such a GTX 680 would easily perform 15% lower than reviewed.
It would be even slower than a 670.
Er, do you mean 1000MHz? Because 1100MHz is roughly (if not a bit higher than) what reference GTX 680 cards boost to today, so an 1100MHz card would be faster, not 15% slower.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Er, do you mean 1000MHz? Because 1100MHz is roughly (if not a bit higher than) what reference GTX 680 cards boost to today, so an 1100MHz card would be faster, not 15% slower.

I think he meant base clock, not necessarily boost.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Oh, and you guys are wrong if you think the high end is an absolute niche within the discrete offering.

Grooveriding linked info not long ago that showed hard data for units sold on the NV/AMD side. Sales of discrete desktop GPUs in the >$350 price range were around 4% of the entire desktop discrete GPU market. I don't have the link off-hand, but this number isn't something we pulled from thin air. Even during the Fermi days, GPUs over $199 comprised just 14% of NV's desktop discrete GPU sales. So you can imagine how small the fraction of GPUs sold at $400-500 is (i.e., GTX670/680/690s).

[Chart: NV desktop discrete GPU sales split at the $200 price point]

Source

$200 is the bread and butter, but the $400-500 market is nothing to laugh at. Same with the claim that the pro business is THAT profitable: R&D, for one thing, is funded entirely by the GeForce business.

I agree that you can look at it as the GeForce business being necessary to support some of the company's costs. While consumer graphics has much higher revenue, its profit margins are much smaller. Once you account for cash flows/profitability, the professional graphics division makes almost as much as desktop and mobile combined. I suppose that can be interpreted differently depending on how you allocate expenditures to each division, but purely on gross margin (revenue minus cost of goods sold), the professional graphics division kills consumer GPUs.

Professional Graphics are very profitable for NV:

Professional Graphics = 30% of stock price
Desktop Discrete GPU = 18%
Mobile Discrete GPU = 15%

Considering $400-500 GTX670/680s are likely to make up just 4% of the desktop discrete segment based on historical trends, that means 4% of 18%, or just 0.72% of the entire stock price. In other words, the Kepler architecture is very important for many of NV's GPU products, but the GTX670/680/690 serve more as a marketing exercise for the Kepler image/brand than as money-makers. If more gamers start buying $400-500 GPUs, it may be different in the future.
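Spelling out that math (same rough shares as quoted above, nothing more precise):

```python
# Rough share of NV's stock price attributable to $400-500 desktop cards,
# using the estimates quoted above (not audited figures).
high_end_share = 0.04   # $400-500 cards within desktop discrete sales
desktop_share  = 0.18   # desktop discrete GPU share of the stock price
print(f"{high_end_share * desktop_share:.2%}")   # -> 0.72%
```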
 
Last edited:

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
The percentage is lower, but the revenue and margin potential per card is higher. The key question is how much revenue and margin the enthusiast-class products actually bring in for nVidia.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
When over-clocking, the HD 7970s will use significant power, as many of the factory-OC SKUs with voltage adjustments have shown.

I finally found a review that overclocked HD7970s using stock voltage! I can't believe I missed that detail before. ;)

Here is the Asus DirectCU II 670 TOP OC:

201W max


Same website, Asus DirectCU II 7970:

1000 / 1400 MHz @ 1.050V : 163W
1000 / 1400 MHz @ 1.174V : 208W (default)
1150 / 1775 MHz @ 1.174V : 237W

That's a 36W difference between an 1150MHz HD7970 at stock voltage and an overclocked GTX670 at stock voltage. I don't think that's a big deal.

The MSI Lightning HD7970 did even better:

1070 / 1400 MHz @ 1.150V : 202W
1070 / 1400 MHz @ 1.174V : 211W (default)
1100 / 1775 MHz @ 1.174V : 225W (still faster than a 7970 GE)
1150 / 1775 MHz @ 1.225V : 256W

It's really when the voltage is increased from 1.174/1.175V to 1.256V+ (the official GE BIOS) that the HD7970 becomes very inefficient. Running an HD7970 @ 1.256V just to get 1100-1150MHz is not practical.

Of course, stock 670/680 cards are still a lot more efficient. Overvolting those cards would be no different from overvolting any other card, with power consumption climbing steeply (dynamic power scales roughly with the square of voltage).
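To put rough numbers on that scaling: dynamic power goes roughly as frequency times voltage squared. A back-of-the-envelope check against the Lightning figures above (this simple model ignores static/leakage power, which is why the measured 256W comes out a bit higher):

```python
# Rough dynamic-power model: P ~ f * V^2. Ignores static/leakage power,
# so it underestimates draw at higher voltages.

def scaled_power(p0_watts, f0_mhz, v0, f1_mhz, v1):
    return p0_watts * (f1_mhz / f0_mhz) * (v1 / v0) ** 2

# MSI Lightning HD7970 baseline: 1070 MHz @ 1.150 V measured at 202 W.
estimate = scaled_power(202, 1070, 1.150, 1150, 1.225)
print(f"{estimate:.0f} W")  # ~246 W estimated vs. 256 W measured; leakage covers the gap
```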
 
Last edited:

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Indeed! However, one may achieve higher clocks with added volts, which may let it reach and actually surpass an OC GTX 680. Adding volts certainly adds significant wattage, but the HD 7970 has this flexibility, and the end-user can decide whether it's worth the trade-off. To me, the ability to add volts is a strength of the SKU.

Power requirements, to me, sit near the bottom of the totem pole of what matters in an enthusiast-class product.
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
Of course, stock 670/680 cards are still a lot more efficient. Overvolting those cards would be no different from overvolting any other card, with power consumption climbing steeply (dynamic power scales roughly with the square of voltage).

Which is why I bought a stock 680, to run at stock. Power draw, heat, and noise are all very good at stock, and it's fast enough that I have no reason to OC.

For this generation you need to OC the Radeons to match the performance of the nvidias, and their power usage is bad even at stock.

My Radeon 6850 was a great card, but this generation the Radeons are as power-mad as the first-generation GTX 5xx cards. Hopefully there will be refinements to improve on that, like nvidia managed for the GTX 560.
 

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
234
106
My Radeon 6850 was a great card, but this generation the Radeons are as power-mad as the first-generation GTX 5xx cards. Hopefully there will be refinements to improve on that, like nvidia managed for the GTX 560.
Power mad? You.. guys are maaaad :p
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Power-mad video cards (Dave) and torture-chamber coolers (toyota) are now my new favorite terms :D Maybe Ryan Smith can incorporate these terms into his reviews as well.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
I think he meant base clock, not necessarily boost.
Yes. An 1100MHz base clock would be as high as most GTX 680s boost to today, so a card with an 1100MHz base would be faster than today's GTX 680s, not 15% slower.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Power-mad video cards (Dave) and torture-chamber coolers (toyota) are now my new favorite terms :D Maybe Ryan Smith can incorporate these terms into his reviews as well.

Now a 225W GPU is "power mad" and 40W of extra power consumption is "significant", but CPU overclocking/overvolting is perfectly fine. :hmm:

What's next, getting a $500 GPU and dropping factory clocks 50% to save 50W?

[Chart: whole-system power consumption with various GPUs]

I guess no one has kids and uses a laundry dryer (1800-5000W) 2-3x a week, or has a wife who uses a blow dryer (1200-1875W) every morning?

but this generation the Radeons are as power-mad as the first-generation GTX 5xx cards.

Only 56W separates the GTX670/680/HD6970/7970/7970GE/GTX560Ti/GTX570/580 in a typical modern system that already draws about 310-320W.

In other words, there is an 18% power consumption difference between a GTX670 system and a GTX580 system (the most power-hungry single-GPU system in that chart). Seriously, is that a big deal in 2012? BTW, there is only a 35-40W difference between the 7970 GE (with its 1.256V overvolted BIOS, which we already discussed as irrelevant) and a stock GTX680. :D
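For anyone checking that math (the ~310W baseline is approximate, read off the same chart):

```python
# Whole-system framing: ~56 W separates the hungriest and leanest cards
# in the chart, on a system that already draws ~310 W with a GTX 670.
base_system_watts = 310
delta_watts = 56
print(f"{delta_watts / base_system_watts:.0%}")  # -> 18%
```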
 
Last edited:

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Now a 225W GPU is "power mad" and 40W of extra power consumption is "significant", but CPU overclocking/overvolting is perfectly fine. :hmm:

What's next, getting a $500 GPU and dropping factory clocks 50% to save 50W?

[Chart: whole-system power consumption with various GPUs]

I guess no one has kids and uses a laundry dryer (1800-5000W) 2-3x a week, or has a wife who uses a blow dryer (1200-1875W) every morning?

Only 56W separates the GTX670/680/HD6970/7970/7970GE/GTX560Ti/GTX570/580 in a typical modern system that already draws about 310-320W.

In other words, there is an 18% power consumption difference between a GTX670 system and a GTX580 system (the most power-hungry single-GPU system in that chart). Seriously, is that a big deal in 2012? BTW, there is only a 35-40W difference between the 7970 GE (with its 1.256V overvolted BIOS, which we already discussed as irrelevant) and a stock GTX680. :D

The context in which I called it significant was a comparison with the performance of an OC GTX 680. To achieve similar performance to an OC GTX 680, over-volting may indeed be needed, and the difference in watts there is significant.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
The context in which I called it significant was a comparison with the performance of an OC GTX 680. To achieve similar performance to an OC GTX 680, over-volting may indeed be needed, and the difference in watts there is significant.

Yes, I understand that; it's been addressed already. An HD7970 @ 1165MHz trades blows with a GTX680 @ 1290MHz (Xbitlabs review).

An HD7970 @ 1150MHz and 1.175V consumes 237W. OK, now find evidence that a GTX680 @ 1290MHz consumes less than 200W, given that I showed you the overclocked Asus GTX670 TOP consuming 201W. So again we are back to a 35-40W power difference, OCed vs. OCed. Just because most reviewers lazily chase max overclocks on a 7970 at 1.25-1.3V doesn't mean real gamers are running their 1150-1200MHz 7970s at 1.25V.

It is true that at 1.25-1.3V an overvolted 7970 starts to consume a lot more power. However, at that point you have to start talking about 1250-1280MHz overclocks, not 1150-1165MHz ones. At those clocks it's no longer apples-to-apples, since it beats an overclocked 680 (as shown by KitGuru).

Sounds to me like the power penalty of a stock-voltage overclocked HD7970 that matches an overclocked 680 is around ~40W.
 
Last edited: