GeForce GTX 680 Classified power consumption with overvoltage revealed, shocking.


MrMuppet

Senior member
Jun 26, 2012
474
0
0
The cooler on the Dual-X is very good even at 1150MHz, never mind the stock 950MHz. If you are not going to overclock your card, your stock DX 7970 should run about 15-20°C cooler than the reference GTX 670, and way quieter than either the reference 670 or the even louder Palit JetStream cards. Did you actually open the 7970 card to test this out in your closet? I don't understand the logic of getting a GTX 670 and then buying an aftermarket cooler to make it quieter than the already quiet and overclockable Dual-X card you bought.
Unfortunately I can't return the 7970 DX if I break the seal. As you say, I'm sure the 7970 DX would be quiet enough in the closet. I had hoped the GTX 670 ref would be, but alas it's not (it's bearable at idle with the door closed, but not at load, unfortunately - I'm hypersensitive).

Here followeth the logic: :)
A. The 670 performs better in the games I already have and know I'll play (BF3 and Skyrim) and will probably buy (SC2).
B. With Kepler, 2-way SLI suddenly became an option (due to negligible micro-stuttering).
C. The 670 will dump less heat in the closet (especially in SLI compared to CF).
D. The difference in cost (after deductions) is insignificant enough that I don't really care. (If I were a bit more cost-sensitive I would've gone for a 7950; if I were much less, 670 SLI.)

A+B+C+D = Leaning towards the 670. However, it's not a clear-cut decision.
 
Last edited:

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Sorry, but did you actually read my posts in detail?

I just told you twice now that the HD 7970 can do 1050MHz on stock voltage, and in response you linked me a review where the testers took a reference HD 7970 and manually pumped the core voltage to 1.256V. It says so right there in the review.

1) Increasing the voltage from 1.175V to 1.256V increases power draw on the 7970 by 40-42W without any increase in clock speed on my card (rough math below). Why would any HD 7970 owner do this if they just want to run the card at 1050MHz? Even 1150MHz can often be reached at 1.175V (1.2V in MSI Afterburner due to Vdroop on some cards).

2) What is the point of discussing the jet-engine "torture chamber" HD 7970 GE cooler when plenty of $450 aftermarket 7970s are for sale with great coolers?

[Image: Sapphire Dual-X HD 7970]
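
To put a number on point 1: dynamic power scales roughly with f·V², so with clocks unchanged the voltage bump alone predicts most of that 40-42W. A minimal back-of-the-envelope sketch in Python (the board-power figures are my assumptions, not measurements from the review):

Code:
# Rough check of the 1.175V -> 1.256V jump discussed in point 1 above.
# Dynamic power ~ f * V^2; the clock is fixed here, so only V^2 moves.

def dynamic_power_scale(v_old: float, v_new: float) -> float:
    """Ratio of dynamic power at v_new vs. v_old at the same clock."""
    return (v_new / v_old) ** 2

scale = dynamic_power_scale(1.175, 1.256)  # ~1.14x
for board_power in (200.0, 250.0):         # assumed HD 7970 gaming load, watts
    extra = board_power * (scale - 1.0)
    print(f"{board_power:.0f}W load -> about +{extra:.0f}W from V^2 alone")

V² alone gives roughly +29W to +36W, and static (leakage) power also rises with voltage, so the 40-42W measured in the review is entirely plausible.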


BTW, the reviews on the Egg for the Classified are pretty bad. A lot of users can't even get past 1300MHz on this $660 card. Considering a lot of $500 GTX 680s can hit 1240-1260MHz on air, this card getting 1270-1280MHz for $660 is a dud. $80 more to unlock voltage is just an insulting money grab, especially when the stock cooler lets the card exceed 70°C at that point and GPU Boost starts throttling.
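
For what it's worth, here's the rough value math behind calling it a dud (a sketch only; the prices and attainable clocks are assumptions pulled loosely from the paragraph above):

Code:
# $/MHz of the Classified premium vs. a good $500 GTX 680.
# All figures are assumptions from the discussion, not benchmarks.
cards = {
    "good aftermarket GTX 680": (500, 1250),  # price ($), typical max MHz on air
    "GTX 680 Classified":       (660, 1275),
    "Classified + EVBot":       (740, 1300),  # +$80 dongle to unlock voltage
}
base_price, base_clock = cards["good aftermarket GTX 680"]
for name, (price, clock) in cards.items():
    if clock == base_clock:
        continue
    extra_cost, extra_mhz = price - base_price, clock - base_clock
    print(f"{name}: ${extra_cost} buys +{extra_mhz}MHz (~${extra_cost / extra_mhz:.1f}/MHz)")

That works out to roughly $6.4 per extra MHz for the Classified and $4.8 with the EVBot - for clocks a $500 card nearly reaches for free.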

Indeed! It proves the GHz Edition cores are more efficient, based on ComputerBase's findings, which is why I added them to the discussion since you raised them. When overclocking, HD 7970s will use significant power, as many of the factory-OC SKUs have shown once voltages are adjusted. With a GTX 680 at 1240-1260MHz, it may take the HD 7970 higher clocks and voltage adjustments to match performance.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
However, it's not a clear-cut decision.

It's pretty easy for me based on the info you provided. I'd pick SLI over CF based on everything I've read and on user feedback. Also, Blizzard games and BF3 run really well on Kepler cards. Sounds like GTX 670 SLI is the way to go. Maybe you can just sell the 670, return the DX 7970, and get a Gigabyte Windforce / Asus DCUII 670 to start, then add a second one later. Both are very quiet, especially the Asus. That sounds like it hits all the points you want - quiet, power efficient, and lower micro-stutter later on should you go SLI.

Indeed! It proves the GHz Edition cores are more efficient, based on ComputerBase's findings, which is why I added them to the discussion since you raised them. When overclocking, HD 7970s will use significant power, as many of the factory-OC SKUs have shown once voltages are adjusted. With a GTX 680 at 1240-1260MHz, it may take the HD 7970 higher clocks and voltage adjustments to match performance.

OK, but you don't need to adjust voltage on aftermarket 7970s to get 1150MHz. Also, there is a $50-70 price difference when comparing a quiet 680 vs. a quiet 7970. As far as I am concerned, after the 670 was released, the GTX 680 and reference 7970s were made redundant. I am sure many people who got a 680 didn't think the 670 was going to be that good, and if they could go back in time they would probably choose to save $80-100 and grab an aftermarket 670 instead. In HardOCP's review the Asus DirectCU II 670 even outperformed the stock 680 in every metric imaginable, from noise to power consumption to build quality to performance/overclocking.
 
Last edited:

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I like the GHz Edition SKUs based on the potential for AIB differentiation and the efficiency gains in the core. It's not a huge step forward, but still a step forward that brings AMD the performance crown and more awareness.
 

MrMuppet

Senior member
Jun 26, 2012
474
0
0
It's pretty easy for me based on the info you provided. I'd pick SLI over CF based on everything I've read and on user feedback. Also, Blizzard games and BF3 run really well on Kepler cards. Sounds like GTX 670 SLI is the way to go. Maybe you can just sell the 670, return the DX 7970, and get a Gigabyte Windforce / Asus DCUII 670 to start, then add a second one later. Both are very quiet, especially the Asus. That sounds like it hits all the points you want - quiet, power efficient, and lower micro-stutter later on should you go SLI.
Yes, the only snags are that I want to be able to try other games (including finding out if my rig plays Crysis, ofc) and that by the time 670 SLI is truly needed, 2GB may no longer be enough (but by then a single 7970 3GB probably wouldn't be either).

You think twin Twin Turbo II won't fit in my Z77 Extreme4? Hmm. You may be right and even if it works it would be a (danger) close fit.

That Windforce is an interesting suggestion, if it's quiet even at load and works. After rebates a GTX 670 Windforce would cost about as much as my 670 ref did plus what a Twin Turbo II does, so I'd break even. This talk about Gigabyte GTX 6x0 cards not working in SLI @ PCI-E 3.0 worries me though: http://forums.anandtech.com/showpost.php?p=33716090&postcount=6 I've heard it elsewhere as well.
 
Last edited:

guskline

Diamond Member
Apr 17, 2006
5,338
476
126
MrMuppet: I've been playing COD MW3 at 5760x1080 all night without a hiccup on my new EVGA GTX 670 FTWs in SLI. I ran BF3 and it is darned fast with this setup.
 

KompuKare

Golden Member
Jul 28, 2009
1,235
1,610
136
Agree. I think GPU Boost is here to stay though. Both AMD and NV are using it now. Hopefully NV can figure out a way to keep GPU Boost but allow manual voltage control and the ability to keep GPU clocks fixed (say, in a manual/advanced mode). Part of me thinks they did this because their reference 670/680 cards are built to meet the bare minimum spec (4 VRM power phases / dinky GTX 670 reference PCB). I think they built these cards a lot cheaper than the GTX 480/580 and were probably afraid of the blown VRMs/power circuitry that was common with OCed 590s.

+1. That was exactly my first thought when reading this thread and seeing how much power an OC/OV 680 pulls. Nvidia's electrical engineering isn't that good at the best of times, but the GK104 reference designs are pretty under-engineered, and the voltage lock seems essential to stop those cards from destroying themselves. There's plenty of margin on those cards though, so they must be happy.
 

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
234
106
Why assume Tahiti can be run more efficiently when it's not implemented by AMD? Maybe the AMD engineers know something?
Because AMD engineers are smart people, they would rather ship a video card that is fed enough power, for stability... and overclocking reasons. It's then up to the end user to tweak it for his/her specific needs (if he cares about power, for example; most people do not). At least you have the option of manually tweaking the voltage, unlike with Kepler ;-)

Most AMD CPU/GPU products have come "overvolted" lately, in my experience. That has certainly changed from the past, when even a 10% OC would have forced you to bump the voltage.
 
Last edited:

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
MrMuppet, I feel ya haha with regards to getting simple points across to people sometimes.. (referring to the earlier posts in this thread).

And those saying nVIDIA has bad "electrical engineering" or whatever are absolutely wrong. AMD, on the other hand, seems to overkill their PCBs, which to me is bad engineering, since anyone can do that. Many forget that "overclocking" is itself an act of running cards outside the specifications they were built around. There are safety margins, of course (even if the design "looks" bare minimum to the untrained eye), because all PCB designers take these limits into account and further derate components to boot. But this is for "normal" operation, and it NEVER takes overclocking into account, or else the cards would be running at those specs in the first place. And to my knowledge, video cards and PC components in general don't have much protection circuitry built in, because the cost/benefit isn't there. It's only recently that these power limiters have been put in place, as people find ways to control the VRM controllers and adjust the voltage at will (a very dangerous thing to do).

What nVIDIA has done is get away with a PCB that's relatively simple and not so expensive to source, yet able to accommodate/handle the GK104 chip without any issues. They have delivered everything within spec while cutting costs significantly. That is good engineering practice in my book. We've yet to see cards failing. And in case people bring up the GTX 590, from what I remember there were no failures at "stock" settings.

Expecting products (this applies to all electronic products in general) to run out of spec perfectly fine is just pure nonsense. It's such a hard concept for many on this board to grasp because they take it for granted that cards can overclock for free.. not only that, but somehow the manufacturer is now liable when a card blows due to overclocking - something that is clearly stated as NOT covered under the warranty agreement.
 
Feb 19, 2009
10,457
10
76
<= 7950 @ 1.125GHz @ 1.087V

I have to agree with Russian on this one. Reviewers typically don't bother tweaking OCs to get the best performance possible, or even to focus on efficiency, i.e. perf/W, perf/noise, perf/temp, etc. But users who OC their cards certainly tweak them to fit their needs much better.

Likewise, with the low prices on the 7950 (and its obvious huge OC potential), none of the cards above is worth it, as the price gap is too big for the negligible performance gains.
 

The_Golden_Man

Senior member
Apr 7, 2012
816
1
0
Looking at AnandTech's review of this card, which needs a separate unit to increase voltage because of Nvidia's rules, I doubt very much that the MSI GTX 670 Power Edition will get software voltage tuning in MSI Afterburner. If it does, they are breaking Nvidia's guidelines.

This again would mean that MSI has done false advertising.

I can understand Nvidia's reasoning behind this. My GTX 570 died, most probably because of a small voltage increase over time. It was already running more or less at its TDP at stock.
 
Last edited:

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
234
106
I can understand Nvidia's reasoning behind this. My GTX 570 died, most probably because of a small voltage increase over time.
That's one of the reasons why I despise high-TDP cards; they are just not as reliable as their cooler, slower, and much simpler counterparts.

Simple is perfect.
 

thilanliyan

Lifer
Jun 21, 2005
12,084
2,281
126
And those saying nVIDIA has bad "electrical engineering" or whatever are absolutely wrong. AMD, on the other hand, seems to overkill their PCBs, which to me is bad engineering, since anyone can do that.

I have not read the previous statements, but I absolutely disagree with yours!

I am an engineer, and I would not consider over-engineering to be "bad" engineering. How is a bigger safety margin a bad thing? I know what you mean, by the way... you're talking about optimization... but maybe the AMD engineers are asked not to make their cards marginal, and to leave a bigger safety margin instead.

It may be "bad" for the company accountants, but not bad for consumers. At that level, designing PCBs... none of them are "bad" engineers.
 

The_Golden_Man

Senior member
Apr 7, 2012
816
1
0
That's one of the reasons why I despise high-TDP cards; they are just not as reliable as their cooler, slower, and much simpler counterparts.

Simple is perfect.

I really like the fact that Kepler uses a power target. It ensures the card is not drawing too much power and stressing itself to its limits.

I have stopped increasing the power target on my cards. I just let them do whatever they want at 100% power target. They rarely downclock, and when they do it's by a tiny bit.

Edit: The Witcher 2 is among the more demanding games for GPUs. I use it to stress my cards, with a custom fan profile from MSI Afterburner. Other than that, all is stock.

(Specs in sig.)

The upper card got to 70°C, downclocking 14MHz, and downclocked/volted down some more because of the 100% power-target limit. However, it did not once fall below 1084MHz, which is the same Kepler boost clock my lower card has. The upper card has a 1124MHz Kepler boost. So by using the power limit one can stay at a reasonable power-draw/temperature level almost without losing any performance. I don't see the point of increasing the power level to get 1 FPS more while stressing the cards much more.
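
If it helps to picture that clamping, here's a toy model (this is not NVIDIA's actual algorithm, just an illustration; the wattage numbers are invented):

Code:
# Toy model of the power-target clamping described above.
# NOT NVIDIA's real algorithm; the watts here are invented for illustration.

BIN_MHZ = 13        # GPU Boost moves clocks in ~13MHz bins
BASE_MHZ = 915      # GTX 670 base clock
BOOST_MHZ = 1124    # the upper card's boost clock

def clamped_clock(watts_at_boost: float, power_target: float) -> int:
    """Highest clock whose estimated power fits under the target,
    assuming power scales ~linearly with clock at fixed voltage."""
    clock = BOOST_MHZ
    while clock > BASE_MHZ and watts_at_boost * clock / BOOST_MHZ > power_target:
        clock -= BIN_MHZ
    return clock

# A load slightly over the target sheds a single ~13MHz bin:
print(clamped_clock(watts_at_boost=171.0, power_target=170.0))  # -> 1111

A small overage costs one or two bins, which matches the behaviour above: the card hovers a hair under its boost clock instead of falling toward base.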
 
Last edited:

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Looking at AnandTech's review of this card, which needs a separate unit to increase voltage because of Nvidia's rules, I doubt very much that the MSI GTX 670 Power Edition will get software voltage tuning in MSI Afterburner. If it does, they are breaking Nvidia's guidelines.

This again would mean that MSI has done false advertising.

I can understand Nvidia's reasoning behind this. My GTX 570 died, most probably because of a small voltage increase over time. It was already running more or less at its TDP at stock.

Unwinder has already unlocked the voltage in software though. It only took two hours.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
MSI is too big a partner in Asia and Europe for Nvidia to do anything to stop them, I think. I tried searching for this info about software overvoltage being prohibited, and the only ones saying it are EVGA. They also accuse MSI of "cheating" to get voltage adjustment.
 

The_Golden_Man

Senior member
Apr 7, 2012
816
1
0
MSI is too big a partner in Asia and Europe for Nvidia to do anything to stop them, I think. I tried searching for this info about software overvoltage being prohibited, and the only ones saying it are EVGA. They also accuse MSI of "cheating" to get voltage adjustment.

From the Classified review:

  1. Partners wishing to have a card with a base power target over 195W must use a custom PCB with suitable power circuitry. NVIDIA won't allow partners to ship higher-power cards using the reference PCB.
  2. Software overvoltage control is forbidden.

Another thing... if MSI Afterburner does support voltage adjustments for the MSI GTX 670 Power Edition, will it support them for all other GTX 670s too? I've heard the MSI GTX 670 PE's power circuitry and VRM controller are no different from other GTX 670s... so what do you think?

BTW: The Gigabyte GTX 670 Windforce 3X uses a GTX 680 reference-design PCB with one additional power phase (5 vs. 4 on the reference) and 8+6-pin PCIe power vs. 6+6-pin. This clearly goes against what Nvidia said:

Partners wishing to have a card with a base power target over 195W must use a custom PCB with suitable power circuitry. NVIDIA won't allow partners to ship higher-power cards using the reference PCB.
 
Last edited:

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
234
106
Imagine a new "K" class of video cards from nVIDIA, for people who wish to participate in overclocking competitions :p

The first move has already been made, by locking the voltage.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
I read the article, but there is zero info about this anywhere else I looked. I searched for two hours last night.

EVGA's forums have employees claiming MSI is cheating. This article has certain info that nobody else is claiming. I think EVGA is feeding everyone a load of BS.

Supposedly there's a chip on the GTX 670/680 that doesn't allow access to voltage adjustments. MSI changed that chip, I hear. We will just have to wait and see, but honestly, when your card throttles at 70°C anyway, there is absolutely no point in raising voltage and overclocking beyond what you can achieve at stock levels.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
<= 7950 @ 1.125GHz @ 1.087V
Likewise, with the low prices on the 7950 (and its obvious huge OC potential), none of the cards above is worth it, as the price gap is too big for the negligible performance gains.

Now imagine that you can get two MSI TwinFrozr 7950 cards that will run cooler and quieter than the GTX 680 Classified, for less than it costs to get a Classified without the EVBot. At 1.1GHz, I am pretty sure an HD 7950 is as fast as a GTX 680. I feel like this card is a money grab from EVGA. To get the features they promise, it's not just $160 over, say, a Gigabyte Windforce 680; it's really $240, because you need the EVBot.

Maybe I am in the minority here, but I'd rather get two OCed HD 7950s for $660 than a single $660 GTX 680 - actual real-world headroom for next-generation games, instead of the extra 70-80MHz of overclock-marketing gimmickry that the EVBot gives over any basic GTX 680.

With NV you can now get ~ GTX690 performance for less than $650. Now, that I am impressed with. :thumbsup:
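
Putting rough numbers on the two-7950s-vs-Classified comparison (every figure below is an assumption, not a benchmark - a stock GTX 680 as 1.00, an HD 7950 @ 1.1GHz at ~parity per the above, and ~80% CrossFire scaling):

Code:
# Perf-per-dollar sketch under the stated assumptions.
options = {
    "GTX 680 Classified + EVBot": (660 + 80, 1.05),   # ~5% from the extra OC
    "2x HD 7950 @ 1.1GHz (CF)":   (660, 1.00 + 0.80), # 2nd card at ~80% scaling
}
for name, (price, perf) in options.items():
    print(f"{name}: ~{perf:.2f}x a stock 680 for ${price} "
          f"({perf / price * 1000:.2f}x per $1000)")

Even with conservative CF scaling, the dual-7950 setup comes out nearly twice as fast per dollar.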


You think twin Twin Turbo II won't fit in my Z77 Extreme4? Hmm. You may be right and even if it works it would be a (danger) close fit.

It might barely fit, but the top card will have almost no room to pull in fresh air. Look how massive the TTII cooler is! If you are going for a quiet design, you might as well spend extra on the Asus DirectCU II 670 with premium components too. How much is the DCUII vs. the Palit vs. the Windforce?

What nVIDIA has done is get away with a PCB that's relatively simple and not so expensive to source, yet able to accommodate/handle the GK104 chip without any issues. They have delivered everything within spec while cutting costs significantly. That is good engineering practice in my book. We've yet to see cards failing. And in case people bring up the GTX 590, from what I remember there were no failures at "stock" settings.

From NV's perspective, they've done a better job than AMD, since they saved money by building a bare-minimum VRM/PCB that works at stock settings. For us as gamers, that's not optimal. Some of us overclock to get "free" performance. This matters even more for cards like the 7950/7970 that overclock 30-35%+. I guess you could say that NV pushed their cards much closer to their max settings from the factory, while AMD underclocked theirs too much. Also, if you do anything other than gaming that loads your GPU to 99% while overclocked (Folding@home, MilkyWay@home, any GPGPU program), this starts to matter a lot more. Over six months of stressful distributed computing/GPGPU, that could be the difference between a failed card and a working one (not saying it's guaranteed to fail, but I wouldn't want to take that risk with a $400-500 GPU). Having said that, NV's partners did a great job releasing good-quality parts and addressing the issues. The reference 680 is not bad, but the reference 670 is a sad case for a $400 card.
 
Last edited:

The_Golden_Man

Senior member
Apr 7, 2012
816
1
0
I read the article, but there is zero info about this anywhere else I looked. I searched for two hours last night.

EVGA's forums have employees claiming MSI is cheating. This article has certain info that nobody else is claiming. I think EVGA is feeding everyone a load of BS.

Supposedly there's a chip on the GTX 670/680 that doesn't allow access to voltage adjustments. MSI changed that chip, I hear. We will just have to wait and see, but honestly, when your card throttles at 70°C anyway, there is absolutely no point in raising voltage and overclocking beyond what you can achieve at stock levels.

I agree. I have no interest in running higher voltages on my two ASUS GTX 670 DirectCU II cards. The upper card already reaches 70°C with a custom fan profile in the most demanding games.

Voltage adjustments will only be suitable for watercooled cards.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Unless someone comes up with a hacked BIOS or something to remove the thermal throttling. Maybe MSI cards don't throttle at 70°C? I don't know.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
My 7970 isn't too bad on power use at my clocks, judging by the benches I've seen. My overvolted, overclocked Thuban, though... :(
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Unless someone comes up with a hacked BIOS or something to remove the thermal throttling. Maybe MSI cards don't throttle at 70°C? I don't know.

Two points. First, the cooler on the Lightning is very, very good, with fans that can run at insane speeds if you wish, so temps will rarely if ever reach thermal-throttle territory. Most users of the Lightning report running 1.25V without passing 60°C.

Secondly, there's a specific extreme BIOS for the Lightning which removes the thermal throttle in addition to OCP. So to answer the question, the Lightning does not throttle at 70°C. I'm anxious to see how everything pans out; however, I for one applaud MSI for giving customers more for less. EVGA has a big issue here - the Classified is not very good without the EVBot, which leaves users with a dilemma: pay an absolutely outrageous amount of money for the Classified + EVBot, or take a ride on the silicon lottery and pray for a good overclock without voltage control. I think MSI is taking the better approach.

I couldn't care less what EVGA reps have to say; I don't take it at face value. Didn't they force the EVBot on prior cards as well?
 
Last edited: