GeForce GTX 680 Classified power consumption with overvoltage revealed, shocking.


RussianSensation

Elite Member
Sep 5, 2003
50W of power consumption is 50W less heat being created, 50W less heat needing to be moved, and at times 50W+ less A/C needing to be applied to balance out the heat. It does make a difference, which is why overvolting a video card for a few more MHz has always struck me as not being worth the trouble.

I totally understand what you are saying. But then wouldn't the same gamers who care about 50W of extra power run their CPUs at stock voltage/stock settings too? That's where the inconsistency comes in. Some gamers with heavily overclocked i5/i7 systems continue to bring the power consumption differences of GPUs into play, which is odd to say the least. I am sure that if NV released a hypothetical 250-275W Big Kepler with 50% more performance, a lot of us wouldn't even care about its power consumption. Also, the GTX 680 does cost more money, and unless you game 8 hours a day, it'll take a long time to break even on that 50W power consumption difference in A/C costs (well, actually impossible, since an HD 7970 makes $60-70 a month in bitcoin mining after electricity costs).

I am on the same page with you. I try to overclock on stock volts or with a minimal voltage increase, since that additional 50-60MHz usually costs 40-50W more power.

P.S. At the end of the day, with how cheap electricity costs are in North America, with bitcoin mining on the side, upgrading AMD cards has become nearly free. So really, now it's more like paying $400-500 for an NV card vs. almost nothing for a new $500 AMD card. For the first time since the X1950 series, the AMD card is not any slower either. At the beginning it may not be a big deal, but 2-3 upgrades later you are suddenly looking at > $1000 in savings from GPU upgrades that can be spent on games. As long as bitcoin mining stays profitable, this strategy pretty much guarantees minimal GPU upgrade costs on the AMD side. It's very, very hard to buy a competing $500 product when the alternative is nearly free!
 

Subyman

Moderator <br> VC&G Forum
Mar 18, 2005
It has never been. Anytime you push more than +5% over stock voltage on any piece of silicon, perf/w goes right out the window.

I'm talking only from a performance standpoint. An extra 0.1V only gave them 100MHz more. My old 470 gained 250-300MHz with a voltage increase. Looking at the percentages is even more striking: ~8% with the 680 vs. ~30-45% with the 470. You are very correct, though: voltage increases have always come with much more heat and power usage, not to mention the stability problems. It's also not surprising that the 680 uses a lot of power when pushed beyond its intended voltage range...
 

SirPauly

Diamond Member
Apr 28, 2009
I totally understand what you are saying. But then wouldn't the same gamers who care about 50W of extra power run their CPUs at stock voltage/stock settings too? That's where the inconsistency comes in. Some gamers with heavily overclocked i5/i7 systems continue to bring the power consumption differences of GPUs into play, which is odd to say the least. I am sure that if NV released a hypothetical 250-275W Big Kepler with 50% more performance, a lot of us wouldn't even care about its power consumption. Also, the GTX 680 does cost more money, and unless you game 8 hours a day, it'll take a long time to break even on that 50W power consumption difference in A/C costs (well, actually impossible, since an HD 7970 makes $60-70 a month in bitcoin mining after electricity costs).

I am on the same page with you. I try to overclock on stock volts or with a minimal voltage increase, since that additional 50-60MHz usually costs 40-50W more power.

P.S. At the end of the day, with how cheap electricity costs are in North America, with bitcoin mining on the side, upgrading AMD cards has become nearly free. So really, now it's more like paying $400-500 for an NV card vs. almost nothing for a new $500 AMD card. For the first time since the X1950 series, the AMD card is not any slower either. At the beginning it may not be a big deal, but 2-3 upgrades later you are suddenly looking at > $1000 in savings from GPU upgrades that can be spent on games. As long as bitcoin mining stays profitable, this strategy pretty much guarantees minimal GPU upgrade costs on the AMD side. It's very, very hard to buy a competing $500 product when the alternative is nearly free!

Who are these gamers you're clamoring about? Too much generalization.
 

blackened23

Diamond Member
Jul 26, 2011
Who are these gamers you're clamoring about? Too much generalization.

Really? I dare say most people in this very forum overclock their CPUs, and an overclocked CPU certainly adds as much power draw and heat as an overclocked GPU, or more. Further, even at most gaming websites I frequent, most of the users are somewhat savvy and also slightly overclock their CPUs/GPUs - it is pretty commonplace. Overclocking a CPU isn't free, just like overclocking a GPU isn't free.

Nvidia implementing these voltage-lock mechanisms is a step backwards. If you don't like overclocking, that is fine, but adding these limitations to factory-overclocked boards such as the Lightning or Classified is quite stupid. Many people don't care about power consumption and will happily take another 100MHz for 100-200 more mV, even if it adds 30-40W to their power consumption. Who cares. I don't.
 

SirPauly

Diamond Member
Apr 28, 2009
Where are the gamers that are inconsistent, though? Power efficiency for an enthusiast-class GPU is not the most important aspect to me -- it's near the bottom of the totem pole -- but 50 watts is not a non-issue either on the same node.

It still is important overall, and it may be higher on someone else's list than on mine. Higher wattage creates more heat; cooler GPUs, lower wattage, and lower acoustics mean more balance, which is extremely compelling.
 

chimaxi83

Diamond Member
May 18, 2003
Who are these gamers you're clamoring about? Too much generalization.

Gamers with tri SLI, dual power supply, highly overclocked processor rigs who whine all over VC&G about power usage when it slightly benefits their home team :rolleyes:

That's just an example, of course :sneaky:
 

JAG87

Diamond Member
Jan 3, 2006
I'm talking only from a performance standpoint. An extra 0.1V only gave them 100MHz more. My old 470 gained 250-300MHz with a voltage increase. Looking at the percentages is even more striking: ~8% with the 680 vs. ~30-45% with the 470. You are very correct, though: voltage increases have always come with much more heat and power usage, not to mention the stability problems. It's also not surprising that the 680 uses a lot of power when pushed beyond its intended voltage range...

True, but the GTX 470 is a bad example; it was a seriously gimped GF100 chip. A GTX 480 easily ran at 800MHz at stock voltage, and the 470, being the exact same chip with a cluster disabled, could do even a bit more.

The reason Kepler is on such tight clock margins is that GK104 is already squeezed like crazy (and I don't see this ending well in the long term with regard to failure rates). It was never meant to be clocked the way it is; it was meant to be sold as a GTX 660/670 with performance equal to the GTX 580. There are even early GPU-Z screenshots of samples running at around a 700MHz base clock.

After Nvidia saw the performance of Tahiti, they just decided to milk the cow. And why shouldn't they, since their main objective is profit? Selling GK104 as a premium GeForce and saving GK100 for the four-digit-priced HPC cards (while giving themselves more time to revise it into GK110) is the smartest move the company could have made.
 

SirPauly

Diamond Member
Apr 28, 2009
Gamers with tri SLI, dual power supply, highly overclocked processor rigs who whine all over VC&G about power usage when it slightly benefits their home team :rolleyes:

That's just an example, of course :sneaky:

Would you care to find some links for this -- some examples? I'd like to see how one defines this and what counts as whining.
 

SirPauly

Diamond Member
Apr 28, 2009
Efficiency makes a lot of sense for multi-GPU -- less heat, less noise, fewer watts -- it's the reason why nVidia could create the GTX 690.

One of the impressive aspects of AMD's HD 58xx series was its balance and efficiency, and in a lot of ways this Kepler generation reminds me of that.
 

blackened23

Diamond Member
Jul 26, 2011
I'm kinda with you halfway on this, SirPauly. If Nvidia makes a super efficient chip at stock settings, hey, that's great. Obviously, while power consumption doesn't matter for a discrete GPU (to me anyway), noise and heat do matter to most people, so they are indirectly related.

What I don't like is that this has come at the expense of enthusiast overclocking features; perhaps some of the efficiency was artificially generated by eliminating over-volting features that were commonplace on prior-architecture cards. Maybe Nvidia can create different SKUs that differentiate between OC/OV-friendly and non-overclock-friendly parts (sorta like the -K processors) or, better yet, get rid of voltage locks in future products.
 

blackened23

Diamond Member
Jul 26, 2011
OC vs OC with unlocked voltage

  • Power: +53W
  • Noise: +4.8dB
  • Temp: +7C
  • Performance gain: 2.66% (without Shogun fluke; 1.51% otherwise)

I'm not sure what your emphasis is with this statement; I don't care about the additional power draw. As for the additional noise, the Classified 680 uses a blower fan, so you can't expect miracles there. That is specific to that product and not indicative of what other aftermarket coolers can do. And if you're emphasizing the slight performance increase while OC/OV'd:

This has already been explained. The performance gain from overclocking with the Classified isn't fully realized because, while you can overvolt the card with the EVBot, the card still has Kepler throttle and overcurrent protection. So what happens is, when the temperature goes over the threshold (70C), the card will rapidly start throttling the core clock in 13MHz increments until it is back at 70C. As you can see from the review, it does hit over 80C while OC+OV'd, so performance is being crippled by Kepler throttle.

The MSI Lightning card does not do this; MSI wisely implemented an unlocked BIOS that removes Kepler throttle and also removes overcurrent protection. IMHO, EVGA did not design the Classified as well as they could have, especially since it costs $660. So this is entirely specific to the Classified card.
 

f1sherman

Platinum Member
Apr 5, 2011
I see...thx.
Then just add a bit on everything listed above :)

For example:

  • Power: +80W
  • Noise: +5.5dB
  • Temp: +9C
  • Performance gain: 6%
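For what it's worth, here is a quick back-of-the-envelope comparison of the two delta lists above, looking at how many extra watts you pay per percent of performance gained. The figures are simply taken as given from this thread (the earlier list, apparently from the review, and the hypothetical one just above); nothing here is newly measured:

[CODE=python]
# Back-of-the-envelope: extra watts paid per 1% of extra performance,
# using the delta figures quoted in this thread (not new measurements).

deltas = {
    "680 Classified, OC -> OC+OV (review)": {"watts": 53, "perf_pct": 2.66},
    "hypothetical card from the post above": {"watts": 80, "perf_pct": 6.0},
}

for label, d in deltas.items():
    watts_per_pct = d["watts"] / d["perf_pct"]
    print(f"{label}: {watts_per_pct:.1f} W per 1% gained")

# Approximate output:
#   680 Classified, OC -> OC+OV (review): 19.9 W per 1% gained
#   hypothetical card from the post above: 13.3 W per 1% gained
[/CODE]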
 

utahraptor

Golden Member
Apr 26, 2004
And to make things even more fun, I believe they limited the fan speed to 35% to ensure throttling would occur.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
This has already been explained. The performance gain from overclocking with the Classified isn't fully realized because, while you can overvolt the card with the EVBot, the card still has Kepler throttle and overcurrent protection. So what happens is, when the temperature goes over the threshold (70C), the card will rapidly start throttling the core clock in 13MHz increments until it is back at 70C. As you can see from the review, it does hit over 80C while OC+OV'd, so performance is being crippled by Kepler throttle.
Blackened, you may want to double-check your info. GK104 does not "throttle the core clock in 13MHz increments until it is back at 70C". There is a one-bin (13MHz) drop at 70C to compensate for increased leakage at higher temperatures. It doesn't drop any further than that due to temperature. Any further pullbacks would be due to the card exceeding its power target.
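To make the described behavior concrete, here is a rough sketch of that logic with invented clock and power numbers. It only illustrates the behavior as described in this thread, not Nvidia's actual GPU Boost algorithm, and the linear power-vs-clock assumption is a simplification:

[CODE=python]
# Toy model of the GK104 behavior described above (illustration only;
# the numbers are invented and this is not Nvidia's real algorithm).

BIN_MHZ = 13  # one boost bin

def effective_clock(boost_clock_mhz, temp_c, board_power_w, power_target_w):
    clock = boost_clock_mhz
    # A single one-bin drop at 70C, compensating for increased leakage.
    if temp_c >= 70:
        clock -= BIN_MHZ
    # Any further pullback comes from exceeding the power target;
    # crudely assume board power scales linearly with clock.
    while board_power_w > power_target_w and clock > BIN_MHZ:
        board_power_w *= (clock - BIN_MHZ) / clock
        clock -= BIN_MHZ
    return clock

print(effective_clock(1202, 65, 180, 195))  # cool, under power target -> 1202
print(effective_clock(1202, 80, 180, 195))  # hot, under power target  -> 1189 (one bin)
print(effective_clock(1202, 80, 230, 195))  # hot, over power target   -> steps down further
[/CODE]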
 

blackened23

Diamond Member
Jul 26, 2011
Whatever the case, the OC/OV GTX 680 Classified is utterly worthless with additional voltage, because the clock speeds scale back with overvoltage - there are users at various forums reporting that 1400MHz can't be obtained reliably (on the Classified) because of a combination of overcurrent protection and thermal throttling. One user noted that while 1375MHz can be obtained easily, 100% load will cause clock speeds to drop nearly 60-80MHz due to overcurrent/thermal throttling.

These features are great for reference boards where users don't overclock, but they are utterly worthless on aftermarket cards (which, I remind you, are SPECIFICALLY DESIGNED FOR OVERCLOCKING). I applaud MSI for removing these draconian features on their Lightning product, and I hope Nvidia doesn't make such boneheaded design decisions in the future. Enthusiasts enjoy overclocking, and Nvidia is taking extra steps to make life difficult.

What's even more hilarious is that users putting their 680s under water CAN'T get better overclocks... how stupid is that? What's the point? I get it, stock performance on the 680s is great, but for those who like overclocking there should be a separate SKU specifically for overclocking (like Intel's -K processors), OR this crap should be removed from aftermarket-cooled cards. Intel recognizes that users enjoy overclocking, and I'm sure Nvidia is aware that enthusiasts enjoy overclocking as well. And that is why voltage lock / Kepler throttle HAS TO GO.
 

3DVagabond

Lifer
Aug 10, 2009
I totally understand what you are saying. But then wouldn't the same gamers who care about 50W of extra power run their CPUs at stock voltage/stock settings too? That's where the inconsistency comes in. Some gamers with heavily overclocked i5/i7 systems continue to bring the power consumption differences of GPUs into play, which is odd to say the least. I am sure that if NV released a hypothetical 250-275W Big Kepler with 50% more performance, a lot of us wouldn't even care about its power consumption. Also, the GTX 680 does cost more money, and unless you game 8 hours a day, it'll take a long time to break even on that 50W power consumption difference in A/C costs (well, actually impossible, since an HD 7970 makes $60-70 a month in bitcoin mining after electricity costs).

I am on the same page with you. I try to overclock on stock volts or with a minimal voltage increase, since that additional 50-60MHz usually costs 40-50W more power.

P.S. At the end of the day, with how cheap electricity costs are in North America, with bitcoin mining on the side, upgrading AMD cards has become nearly free. So really, now it's more like paying $400-500 for an NV card vs. almost nothing for a new $500 AMD card. For the first time since the X1950 series, the AMD card is not any slower either. At the beginning it may not be a big deal, but 2-3 upgrades later you are suddenly looking at > $1000 in savings from GPU upgrades that can be spent on games. As long as bitcoin mining stays profitable, this strategy pretty much guarantees minimal GPU upgrade costs on the AMD side. It's very, very hard to buy a competing $500 product when the alternative is nearly free!

I'm not sure if it's CPUs in general or just Intel, but they are held to an entirely different standard. "10% more performance for the same or higher price? Awesome!" Whereas with GPUs it's "30% more for the same price? Or the same performance for 40% less cost? Bloody rip-off artists! I want 2x the performance for the same money or I'm going postal on them!"

Also, in most cases that O/C'd CPU isn't going to improve your gaming experience. An O/C'd 2500 vs. a stock one typically yields only a single-digit improvement in gaming performance.
 

RussianSensation

Elite Member
Sep 5, 2003
That pie chart was compiled pre-GTX 460 launch from Steam data, and my guess is that it deals with volume, not value.
Margins... Nvidia was making $60 on each 550Ti sold, but $270 on GTX 580.
Also GTX 470/480 were not huge sellers, but 670/680 are topping Steam charts right now.

I don't think that's how it works. Maybe someone with more insight on this can comment. IIRC, NV sells the actual chips to AIBs and then provides a suggested MSRP for guidance. The AIBs then assemble the cards (cooling, memory, PCB, etc.) based on NV's reference design. Later on, NV may allow them to release non-reference designs so that they can differentiate from one another (EVGA Classified, MSI Lightning, etc.). NV doesn't actually make the GTX 580 or GTX 680 and then sell the assembled card for $270 to EVGA, with EVGA adding a $230 mark-up on top of that. If I were to guess, NV likely sells the GK104 chip for, say, $90-120 to the AIBs, and from that point on all the packaging, PCB, cooling, and memory costs, retailer/advertising/social media costs, and whatever profits the AIBs intend to make are why the cards end up at $500 in retail.

I do not believe that adding an additional memory controller to a chip is trivial. I know of no existing architecture or chip where that kind of overhaul has taken place.

Sorry if my comment was confusing. I wasn't saying that NV has to start with GK104 to make a 2000SP/320-384-bit card. It was just a general point that, because GK104 is only 294mm^2, they have room to release a much larger chip, say one based on the successor of GF110. AMD, on the other hand, already has a pretty large chip, and they don't really like to make 400+mm^2 dies. I think from that point of view NV has a ton of room left if they want it, while AMD has a much tougher task. I am not saying that NV will for sure make a 500mm^2 28nm Kepler chip for the consumer market, but they could if they wanted to. AMD, on the other hand, is not going to pull something like that off. Given Kepler's excellent power efficiency per transistor and great gaming performance with just a 192-bit memory bus, I think they don't need to sweat much for the HD 8000 vs. GTX 700 series. AMD, on the other hand, will have to pull off some serious magic to rebalance the Tahiti XT chip.

Bitcoin, F@H, OpenCL-accelerated blah blah, etc. -- all that stuff that everyday users consider "compute" isn't really "compute" IMHO, since it's barely using the resources available at hand (it'd be like driving a fast car to grab your groceries 2 blocks down). I can tell you from first-hand experience that it's very challenging to take advantage of the available hardware when programming for general-purpose tasks, because the tools/information/interface between the software and the hardware are scarce/rudimentary, etc.

Bitcoin pegs GPU usage at 99%. In MilkyWay@Home, an HD 7970 gets 235,000 BOINC points per day, and with an overclock it exceeds 350,000. That blows away the inefficient coding of Folding@Home, which is still using outdated programming; I think a GTX 680 would be lucky to get 25,000-30,000 BOINC points per day in F@H. Sure, this might not matter to 99% of users, but to say that compute barely uses the resources available is not entirely true. There are programs that already take advantage of compute capability; they are just not for the mainstream market (i.e., you can't really use DirectCompute yet to accelerate LAME MP3 encoding, for example, or to convert an MKV movie to H.264 for your iPad 3 a lot quicker).

Bitcoin isn't difficult to set up, btw, and it works. So again, the strongest argument for compute right now for us "mainstream" users is bitcoin, since it actually makes money on the side that can be funneled to PayPal or converted to Amazon/Newegg gift cards. That's a win-win for gamers. With GCN, you get a gaming card that lets you do bitcoin hashing calculations, which in turn pay for the next GPU upgrade, which in turn increases gaming performance, etc.

WinZip 16.5 has OpenCL extensions, and when it is paired with an HD 7900 series card, performance is around 2x faster than a $1000 Intel 3960X:

[Image: WinZip 16.5 OpenCL benchmark chart (winzip.png)]


GPU Compute may or may not take off on the desktop but it's a start.

A Gigabyte 670 Windforce would be 3410 after rebate (same), but PCI-E 2.0 x8 SLI would hamper performance (if PCI-E 3.0 indeed doesn't work).

PCIe 2.0 vs. 3.0. The performance difference is very minor even with dual-GPUs. It'll be there, but won't be enough to actually impact settings in games.

HardOCP just investigated this in detail.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
I totally understand what you are saying. But then wouldn't the same gamers who care about 50W of extra power run their CPUs at stock voltage/stock settings too? That's where the inconsistency comes in. Some gamers with heavily overclocked i5/i7 systems continue to bring the power consumption differences of GPUs into play, which is odd to say the least. I am sure that if NV released a hypothetical 250-275W Big Kepler with 50% more performance, a lot of us wouldn't even care about its power consumption. Also, the GTX 680 does cost more money, and unless you game 8 hours a day, it'll take a long time to break even on that 50W power consumption difference in A/C costs (well, actually impossible, since an HD 7970 makes $60-70 a month in bitcoin mining after electricity costs).
Good point. At least in my mind, the gains from overclocking a high-end CPU are far better than overclocking a high-end GPU. A 3.3GHz SNB can regularly hit 4.5GHz with a bit more voltage and better cooling. Even after factoring out Turbo, that's a huge performance increase. And yet because TDP was only 95W in the first place, power consumption hasn't necessarily hit the roof and it's still practical to move that much heat quietly.

Compare that to high-end GPUs, which regularly have TDPs in the 200-300W range. GPUs (specifically those that aren't 2nd-tier parts like the 670/7950) don't overclock nearly as well as a CPU, and if you overvolt, it's easy to add a lot of power consumption in the process. The gains aren't nearly as great for the extra effort incurred. I'll chase more performance, but the heat generated means I do have a limit.
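Putting rough numbers on that comparison, using only figures already quoted in this thread (3.3GHz to 4.5GHz for Sandy Bridge, ~8% for the voltage-unlocked 680 and ~30-45% for the overvolted 470, per Subyman's post):

[CODE=python]
# Rough overclocking-headroom comparison using figures quoted earlier in the thread.

cpu_gain = (4.5 - 3.3) / 3.3                         # Sandy Bridge 3.3GHz -> 4.5GHz
print(f"SNB overclock headroom: ~{cpu_gain:.0%}")    # ~36%

gpu_gains = {
    "GTX 680 (OC+OV, per Subyman)": 0.08,
    "GTX 470 (OC+OV, per Subyman)": (0.30, 0.45),
}
for card, gain in gpu_gains.items():
    if isinstance(gain, tuple):
        print(f"{card}: ~{gain[0]:.0%}-{gain[1]:.0%}")
    else:
        print(f"{card}: ~{gain:.0%}")
[/CODE]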

P.S. At the end of the day, with how cheap electricity costs are in North America
Indeed. I'm only paying something like $0.07/kWh, which is incredibly cheap. So at least here the problem with power consumption isn't the cost of the electricity (or even running the A/C), but rather the overall heat and acoustics. Even though I have central A/C, I still have to run a secondary A/C in my office on hot days because of my equipment. Portable A/Cs are neither quiet nor cheap, so while the latter is a sunk cost, if I can keep heat generation down and avoid listening to the A/C, I'm all for it. :p

I'm not sure if it's CPUs in general or just Intel, but they are held to an entirely different standard. "10% more performance for the same or higher price? Awesome!" Whereas with GPUs it's "30% more for the same price? Or the same performance for 40% less cost? Bloody rip-off artists! I want 2x the performance for the same money or I'm going postal on them!"

Also, in most cases that O/C'd CPU isn't going to improve your gaming experience. An O/C'd 2500 vs. a stock one typically yields only a single-digit improvement in gaming performance.
It's the difference between graphics and serial workloads. Rendering is embarrassingly parallel - add more functional units (shaders, ROPs, etc) and performance will increase in a fairly linear fashion. So every node shrink should bring with it around a 50% performance improvement for the same die/power.

Compare that to serial workloads, which can't be distributed across additional functional units like that. The only solution is higher IPC and higher clockspeeds, and we've reached a point where both are difficult to significantly increase right now.
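As a toy illustration of this scaling argument (all numbers invented purely for the example): rendering throughput grows with the number of functional units, while a serial workload only benefits from clock and IPC gains.

[CODE=python]
# Toy model of the scaling argument above (illustrative numbers only).

def render_throughput(units, clock_ghz):
    # Embarrassingly parallel: work spreads across all functional units.
    return units * clock_ghz

def serial_throughput(clock_ghz, ipc):
    # Serial workload: only clock speed and IPC help.
    return clock_ghz * ipc

# Hypothetical node shrink: ~50% more units in the same die/power budget.
old_gpu = render_throughput(units=1536, clock_ghz=1.0)
new_gpu = render_throughput(units=2304, clock_ghz=1.0)
print(f"GPU generational gain:    {new_gpu / old_gpu:.2f}x")   # 1.50x

# The same shrink helps a serial workload only through clock and IPC.
old_cpu = serial_throughput(clock_ghz=3.3, ipc=1.0)
new_cpu = serial_throughput(clock_ghz=3.5, ipc=1.1)
print(f"Serial generational gain: {new_cpu / old_cpu:.2f}x")   # ~1.17x
[/CODE]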
 

RussianSensation

Elite Member
Sep 5, 2003
What kind of revenue can one derive with Bitcoin when one counts the added costs of electricity?

Edit: Found a nice article here:

http://www.pcper.com/reviews/Graphi...Update-Power-Usage-Costs-Across-United-States

I am going to read that article in detail, but there is a MUCH simpler calculator in place.

http://bitcoinx.com/profit/

For example: an HD 7970 @ 1150MHz gives a hash rate of 680 MHash/s; let's assume 250W of power (it's actually less, since the memory isn't used for this task and can be downclocked).

Electricity cost: $0.15 per kWh

Revenue per month: $94.83
Cost of hardware: $0 from bitcoins generated by the previous AMD card + resale value of the old card
Less Electricity cost: $27.39
Net profit per 1 month: $67.44

Let's say you game 4 hours a day ==> $67 * 20/24 ≈ $56. In 6 months that's roughly $335 towards your next GPU upgrade, plus the resale value of your current AMD card. That's a free upgrade towards an HD 8970*, etc.
*Assuming Butterfly Labs' new ASIC products don't destroy GPU mining.
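Spelling that arithmetic out (using the post's assumed power draw, revenue figure, and electricity price; actual revenue moves with Bitcoin difficulty and exchange rates, so treat it as a snapshot):

[CODE=python]
# The profit arithmetic from the post above, spelled out.
# All inputs are the post's assumptions, not fresh measurements.

power_w           = 250      # assumed draw for an HD 7970 @ 1150MHz while mining
electricity_price = 0.15     # $ per kWh
revenue_per_month = 94.83    # $ per month at ~680 MHash/s (from the calculator)
hours_gaming      = 4        # hours per day spent gaming instead of mining

days_per_month   = 365 / 12
electricity_cost = power_w / 1000 * 24 * days_per_month * electricity_price  # ~$27.38
net_profit       = revenue_per_month - electricity_cost                      # ~$67.46

# Scale by the fraction of the day actually spent mining.
mining_fraction        = (24 - hours_gaming) / 24
profit_while_gaming_4h = net_profit * mining_fraction                        # ~$56.21

print(f"Electricity: ${electricity_cost:.2f}/month")
print(f"Net profit (24/7 mining): ${net_profit:.2f}/month")
print(f"Net profit (mining 20h/day): ${profit_while_gaming_4h:.2f}/month")
print(f"Six months toward the next GPU: ${6 * profit_while_gaming_4h:.2f}")
[/CODE]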
 

f1sherman

Platinum Member
Apr 5, 2011
Or I could sell

[image: sp_0501_04_m4.jpg]


for $1.8/day and use my PC for whatever the heck I want :)

@Russian

Nvidia builds some of its GPUs themselves. But that, and how the money flows between AIBs and NV, was not my point (not that I know it anyway).
What I wanted to say is that the discrete high-end market is not as tiny as usually perceived.
High-end margins are HUGE, and the lower you go - the slimmer they get.

Have a look:

[Chart: desktop discrete GPU unit shipments and manufacturer price/cost by price segment (q2msa.gif)]
 

RussianSensation

Elite Member
Sep 5, 2003
Nvidia builds some of its GPUs themselves. But that, and how the money flows between AIBs and NV, was not my point (not that I know it anyway).

Well, from the chart for the GTX 580, I see now how you got that $270: Total Manufacturer Price - Total Cost = $482 - $210 = $272.

However, NV sells the $120 GPU and that's it. For example, it might cost them $80 to manufacture the GPU (plus selling/marketing costs to AIBs), and they make roughly a 50% markup by selling the chips to AIBs for $120. They don't build a $210 GTX 580 and then sell it for $480.

What I wanted to say is that the discrete high-end market is not as tiny as usually perceived.

Your own data supports that it's tiny. Using your data:

Total desktop GPUs for that quarter = 6,550 ATI + 9,520 NV = 16,070 (000s)
$300+ desktop GPUs for that quarter = 213 ATI + 578 NV = 791 (000s)

So $300+ desktop GPUs for that quarter are just 791 / 16,070 = 4.9% of the entire desktop discrete GPU market.

Since we are talking about $400-500 GTX 670/680/7970 cards, the real share is even smaller than that, because it would also exclude GPUs in the $300-399 range. So really we are back to that 3.5-4%. That's rather small.
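As a quick sanity check on that arithmetic (unit figures are in thousands, exactly as quoted from the chart):

[CODE=python]
# Quick check of the market-share arithmetic above.
# Unit figures are in thousands, taken from the chart as quoted in the post.

total_units   = {"ATI": 6_550, "NV": 9_520}   # all desktop discrete GPUs that quarter
premium_units = {"ATI": 213,   "NV": 578}     # $300+ desktop GPUs, same quarter

total   = sum(total_units.values())           # 16,070
premium = sum(premium_units.values())         # 791

share = premium / total
print(f"$300+ cards: {premium}k of {total}k -> {share:.1%} of the desktop discrete market")
# -> $300+ cards: 791k of 16070k -> 4.9% of the desktop discrete market
[/CODE]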

High-end margins are HUGE, and the lower you go - the slimmer they get.

Right, but compared to the professional graphics segment it's still nothing. Also, when you are talking about how much cash flow a company generates, you have to look at the volume of units sold and not just the margins. In the grand scheme of the entire company, the GTX 670/680 and even the HD 7970 don't make much money, since their volume is so small.

Think about it: even if NV sells 400,000 GTX 670s this quarter, they have already taken 150,000 K10/K20 Tesla pre-orders (and those cards go for $2-5K, which means NV probably makes several times as much on each).
 

Homeles

Platinum Member
Dec 9, 2011
I find it quite bizarre that the 570 apparently has more expensive power circuitry than the 580, despite having less of it and being notorious for blowing up. I'm sure it's just a typo, but it's somewhat humorous.

Also, assuming that chart is correct, it's crazy how cheap they were dumping the 68xx series for.
 