What do you think of nVidia locking down voltage?


Does it bother you that nVidia has locked down the voltage on "Kepler" GPUs?

  • I don't care

  • It doesn't bother me at all

  • It bothers me a little

  • It bothers me a lot

  • I will no longer purchase nVidia products because of this

  • I don't overclock



blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
The only thing I wish for is an adequate explanation. As I've mentioned before, we're all well aware that this is completely within nvidia's rights, yet users don't have to like it.

I have MSI Lightning 680s which I love, and I'm still able to overvolt them with some tricks. It does look odd when enthusiasts have overvolted GPUs for several years (it was very, very common with Fermi) and then suddenly the course is reversed. Like I said, I love my cards and will still enjoy them; heck, I can get 1300MHz with no extra voltage. But I would really love for an explanation to be provided, something better than "because nvidia said so". That's the only thing on my wish list.

Is it because of warranty issues? Brand name? What? I would be much more at ease if someone came forward and said, "Hey guys, this is why we're doing this," and then it would be cool. I mean, I still wouldn't like it, but I would understand it.

Here's the way I see it. A reference GTX 680 cannot handle overvolting well; we all know the cooler and VRMs are not up to the task. But why apply the same voltage policy to a card like the MSI Lightning or EVGA Classified? Those cards are specially equipped with hardware that is completely fine handling the additional voltage! That's the main reason I don't understand the change. A reference card with locked voltage? Yeah, that's fine with me. But why apply the same lock to cards with substantially better cooling and hardware?
 
Last edited:

BD231

Lifer
Feb 26, 2001
10,568
138
106
I definitely don't like it moving forward, but if the results are anything like what you get from current 670s/680s, it's in nvidia's best interest to keep those locks on. Call it what you want, crap engineering, a financial move, etc.; current 670s/680s get disgustingly inefficient when overvolted, so I think it's for the best this round.

I voted it bothers me.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
The poll is how enthusiasts will feel (kinda, since again the OP is equating OCing and Overvolting, which is intellectually dishonest).

The real world doesn't operate like that. This is like an argument in the console or smart-phone forums over some function/feature that nobody really cares about, and all sides still sell a mST anyway.

nV has cash. AMD is losing badly in the GPU market as a whole, is always behind in the discrete GPU market, and is facing very large debt payments while its stock is toxic.

Honestly, this sub-forum paints such a false picture of AMD's health. Please go read the CPU forum, where people don't just discuss OCing, etc.; they discuss the companies as a whole.

Summing up your post: state the subject, attempt to make it irrelevant, change the subject, drag the discussion off topic. Now, anything to say about nVidia locking down voltages?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
In the poll, what's the difference between "I don't care" and "It doesn't bother me at all"? Hehe

Maybe those 2 options can be combined into 1 answer?
 

Elfear

Diamond Member
May 30, 2004
7,163
819
126
I overclock; I do not overvolt.

The gains from overvolting are not noticeable in any game. If you have to overclock so high that you need more voltage just to play a game, you should probably turn down your settings.

Depends, I guess. My 7970 has a stock voltage of 1.05V, which is good for 1050-1100MHz game stable. With 1.225-1.25V I can hit 1325MHz game stable. That 225-275MHz difference is very apparent in Skyrim with mods and in other GPU-intensive games.
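
To put rough numbers on that, here's a quick back-of-the-envelope sketch. The clocks and voltages are the ones quoted above, and the power figure is only the usual dynamic-power rule of thumb (P roughly proportional to f*V^2), not a measurement:

    # Rough arithmetic for the overvolting headroom described above (assumed figures)
    stock_v, oc_v = 1.05, 1.25            # volts
    stock_mhz, oc_mhz = 1100, 1325        # game-stable clocks, per the post above

    clock_gain = oc_mhz / stock_mhz - 1
    power_gain = (oc_mhz / stock_mhz) * (oc_v / stock_v) ** 2 - 1
    print(f"Clock gain:        {clock_gain:.0%}")   # ~20%
    print(f"Power gain (est.): {power_gain:.0%}")   # ~71%, roughly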

Overvolting isn't for everyone, but I wouldn't say its benefits aren't noticeable. If Nvidia continues the voltage lock-down, I'll be less likely to consider them in the future.
 

Nelly

Member
Oct 17, 2009
27
0
66
People with cards that advertise overvolting on the box could most likely return them via RMA if it bothers them that much, as that is clearly a breach of false-advertising law. I'm not sure if that applies anywhere outside Europe, though.


It's quite a drastic measure, lol. I'm not sure I would take it that far; mind you, the MSI 7970 Lightning is the equivalent of $124 cheaper in the UK compared to the MSI GTX 680 Lightning...
 

bononos

Diamond Member
Aug 21, 2011
3,928
186
106
The only thing I wish for is an adequate explanation. As I've mentioned before, we're all well aware that this is completely within nvidia's rights, yet users don't have to like it.

I have MSI Lightning 680s which I love, and I'm still able to overvolt them with some tricks. It does look odd when enthusiasts have overvolted GPUs for several years (it was very, very common with Fermi) and then suddenly the course is reversed. Like I said, I love my cards and will still enjoy them; heck, I can get 1300MHz with no extra voltage. But I would really love for an explanation to be provided, something better than "because nvidia said so". That's the only thing on my wish list.

Is it because of warranty issues? Brand name? What? I would be much more at ease if someone came forward and said, "Hey guys, this is why we're doing this," and then it would be cool. I mean, I still wouldn't like it, but I would understand it.

Here's the way I see it. A reference GTX 680 cannot handle overvolting well; we all know the cooler and VRMs are not up to the task. But why apply the same voltage policy to a card like the MSI Lightning or EVGA Classified? Those cards are specially equipped with hardware that is completely fine handling the additional voltage! That's the main reason I don't understand the change. A reference card with locked voltage? Yeah, that's fine with me. But why apply the same lock to cards with substantially better cooling and hardware?

Yeah it sounds a lot better than just allowing everyone to bump up the voltage. What drove the OP to create this thread was his dissatisfaction stemming from a purchase based on a of a special sample 7850.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Yeah it sounds a lot better than just allowing everyone to bump up the voltage. What drove the OP to create this thread was his dissatisfaction stemming from a purchase based on a of a special sample 7850.

Can you reword this or correct the grammar so I can understand your points, please? I'm not sure what you are getting at.
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
I put "doesn't bother me". The frame rate difference between OC+OV and just OC isn't much to a gamer; maybe if you get into benchmarking?
Just my 2 cents
 

Rifter

Lifer
Oct 9, 1999
11,522
751
126
If this holds true and Nvidia doesn't change their position on this, I will never purchase another Nvidia product.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I put "doesn't bother me". The frame rate difference between OC+OV and just OC isn't much to a gamer; maybe if you get into benchmarking?
Just my 2 cents

Voltage control allows the HD 7950 to hit 1200MHz on occasion, or a 50% overclock. That's huge! That's like buying a $310-320 GPU and getting faster performance than $450-550 GTX 680 / 7970 GE cards.
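
For anyone checking the math, a quick sketch; the 800MHz reference clock and the prices are rough figures, and the performance claim is just the assumption in the paragraph above:

    # Sanity check of the figures above; reference clock and prices are approximate
    reference_mhz, overclocked_mhz = 800, 1200
    hd7950_price, gtx680_price = 315, 500       # USD, rough street prices

    print(f"Overclock: {overclocked_mhz / reference_mhz - 1:.0%}")   # 50%
    print(f"Price gap: {gtx680_price / hd7950_price:.2f}x")          # ~1.59x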

If anything, this is NV throwing a lifeline to AMD by giving up a key competitive advantage. AMD's management could retain the dual-BIOS switches and full voltage control, and instantly you'd have a differentiation between AMD and NV cards that AMD didn't have to pay much for, since NV removed overclocking via voltage control. I suppose ATI/AMD has generally been more enthusiast-friendly:

- The 9500 Pro could be flashed into a 9700
- The X800 GTO2 could be flashed into an X800 XT
- The HD 6950 could be unlocked into a 6970
- AMD sent 7950/7970 owners a free update with faster BIOSes should they want that option

I still think GTX600 was a profit-margin part for NV, like IVB was for Intel. GTX600 continues to portray the image of a cost-cutting part, used because it was good enough to go against the 7970 and because NV couldn't launch GK110 on a mass scale. GK104 was made to make as much money for the company as possible (294mm² for a $499 MSRP; since when did NV low-ball like that?): cheap PCBs, 4 VRMs, flaky/crackling reference fans, memory bandwidth kept at GTX580 level for flagship cards, a large chunk of compute functionality stripped out, etc.

Let's hope the GTX600 series is an outlier and the GTX700 series is much better built, with proper voltage control and good-quality fans/VRMs/PCBs that can take a beating.
 
Last edited:

Keromyaou

Member
Sep 14, 2012
49
0
66
What RussianSensation has kept pointing out about market behavior (no matter how much better AMD's cost efficiency has been compared to Nvidia's, consumers keep buying Nvidia cards) may be at work here. Nvidia might think that consumers will buy their cards at premium prices no matter how much they ignore their fan base. It's the same with game developers such as EA or Ubisoft: no matter how much DRM they put in their games and how badly they screw their customers, gamers keep buying, so they keep up these kinds of practices. Personally, I hope Nvidia brings back voltage unlocking for the GTX 780. If not, I might go to the HD 8970 instead. I hope AMD doesn't adopt the same practice for the HD 8970.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
It's terrible on several fronts: the enthusiast shelling out $500+ for a top-end card loses the ability to tweak it, and the mid-range enthusiast looking for a more affordable card can no longer tweak it up to higher performance.

There is a clear advantage to having voltage control. Look at an AMD Radeon 7950 with voltage control: tweaking the voltage makes that $300 card as fast as, or faster than, a $500 GTX 680. Granted, the 680 is basically a redlined and bandwidth-limited card regardless. There is not enough memory bandwidth on the stock card already, and overclocking it makes that more evident.

I think we'll see voltage control on the GK110 consumer card. Pretty sure the reason it's locked on GK104 is that they are already redlining the chip to try to stay competitive, and much more voltage would kill/degrade the cards.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
It's terrible on several fronts: the enthusiast shelling out $500+ for a top-end card loses the ability to tweak it, and the mid-range enthusiast looking for a more affordable card can no longer tweak it up to higher performance.

There is a clear advantage to having voltage control. Look at an AMD Radeon 7950 with voltage control: tweaking the voltage makes that $300 card as fast as, or faster than, a $500 GTX 680. Granted, the 680 is basically a redlined and bandwidth-limited card regardless. There is not enough memory bandwidth on the stock card already, and overclocking it makes that more evident.

I think we'll see voltage control on the GK110 consumer card. Pretty sure the reason it's locked on GK104 is that they are already redlining the chip to try to stay competitive, and much more voltage would kill/degrade the cards.

Yes. I never put much stock in the degrading-performance-over-time rumor before, but you might be on to something.
 

blanketyblank

Golden Member
Jan 23, 2007
1,149
0
0
There is nothing odd about this. First NV took away the ability to use PhysX unless you are all-NV, then custom refresh rates, and now the ability to change voltage; eventually they will take away the ability to change clocks entirely, except through hacks.
Once consumers can't change anything, they will be able to sell the same chip at different prices by simply changing the core clock, and profit.

NV has done this before with the 9600, which had the exact same chip as the 9800, which they had an overabundance of. When demand is high enough and yields are low enough, binning makes sense, but eventually there is more demand for mid-range parts and a surplus of high-grade chips, so they have to degrade the parts or find other means to sell mid-range cards without cannibalizing their high-end cards.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
The other thing is, if you know for certain that you are making a reference board to handle a voltage of 1.175V and not much above it, you don't need to over-engineer the card's components and heatsink to account for additional load on the VRMs and other PCB components. Then there is the issue of the cooler's ability to handle the added heat. Kepler loses its GPU Boost bins above 70°C at 5-10°C increments, I believe? Voltage control could have resulted in cards going to 85-90°C and, as a result, having little performance gain over stock cards or cards overclocked without extra voltage. I don't own a Kepler card, so I can't confirm this.
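
As a rough illustration of that throttling behaviour, here is a toy model; the bin size, base boost clock, threshold and step are all assumptions for the sake of the sketch, not NVIDIA's published spec:

    # Toy model of GPU Boost shedding bins as the core heats up, per the
    # behaviour described above. All numbers are assumptions, not official.
    BASE_BOOST_MHZ = 1110      # assumed typical GTX 680 boost clock
    BIN_MHZ = 13               # assumed size of one Kepler boost bin
    THRESHOLD_C = 70
    DEGREES_PER_BIN = 5

    def estimated_boost(temp_c):
        """Boost clock after dropping one bin per DEGREES_PER_BIN above the threshold."""
        if temp_c <= THRESHOLD_C:
            return BASE_BOOST_MHZ
        bins_lost = (temp_c - THRESHOLD_C) // DEGREES_PER_BIN
        return BASE_BOOST_MHZ - bins_lost * BIN_MHZ

    for t in (65, 75, 85, 95):
        print(f"{t}C -> {estimated_boost(t)} MHz")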

Yes, the 480 ran hot and loud, but man, that card was built like a tank and took it. You could literally fold with that thing 24/7 for two years at 830MHz and not worry that it would die on you.

GTX 480 PCB [image] vs. GTX 680 PCB [image]

Now the heatsinks: GTX 480 [images] vs. GTX 680 [images]


The GTX 480 also had ribs on the metal plate covering the rest of the PCB and memory [image], while the GTX 680 plate looks all show and does little of anything to help dissipate heat [image].


The little ribs probably don't do much, but these little details show that NV designed the 680 as cheaply as possible.

GTX 470 heatsink [image] vs. GTX 670 heatsink [image]

^ Talk about shoddy quality control: a $399 card with crooked heatsink fins. EVGA went out of their way to put their own heatsink design on their stock GTX 670s, since the quality of the reference 670 heatsink was so awful.

"Unsatisfied with the reference heatsink, EVGA has expanded its reach and thermal mass by about 25% through the addition of a fin overhang over the PWM area." ~ HWC

NV pulled an IVB with the GTX600 series by cutting corners everywhere, and I think removing voltage control reduced the chances of failed parts, like the blown-up 570s/590s they had last round. Still, that shouldn't have mattered since, as Blackened23 stated, EVGA and MSI included bullet-proof components on the EVGA Classy and MSI Lightning parts to ensure they can handle the added voltage with aplomb. NV could have been afraid that even users with regular 670/680 reference cards would start overclocking with voltage control in MSI AB, and then they could have faced a PR disaster of high GPU failure rates.
 
Last edited:

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
Or they wish to ensure the chip is run within the specifications it was designed for, board and all?
I don't think this was originally planned as a high-end card, though that's what has happened.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Vote with your wallet if you don't like it.

Also, it's not entirely fair to compare a high-wattage part like the GTX 480 to the GTX 680. The GTX 480 was built on 40nm and was rushed out, compared to the GTX 580 (also on 40nm but with tamer wattage). The GTX 680 is a 28nm part with much lower wattage, so it needs much less cooling. This is a fairer comparison of cards with similar TDPs, and it shows that the GTX 670 has a downright pathetic heatsink: http://www.overclock.net/t/1256156/serious-cost-cutting-on-the-gtx-670s-heatsink

And as you noted, the stock GTX 680 cooling is pretty pathetic even considering the wattage difference between it and the GTX 480/580, and I think they went a little overboard with all their cost-cutting, from shrinking the PCB, to weaker and cheaper cooling, to skimping on memory bandwidth, etc. As was widely reported, and EVERYBODY should know this by now, the GK104 chip was originally intended to be midrange, not high-end.

And it shows.

Hopefully NV will allow voltage control in future GPUs that were intended to be high-end from the get-go.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
NV pulled an IVB with the GTX600 series by cutting corners everywhere, and I think removing voltage control reduced the chances of failed parts, like the blown-up 570s/590s they had last round. Still, that shouldn't have mattered since, as Blackened23 stated, EVGA and MSI included bullet-proof components on the EVGA Classy and MSI Lightning parts to ensure they can handle the added voltage with aplomb. NV could have been afraid that even users with regular 670/680 reference cards would start overclocking with voltage control in MSI AB, and then they could have faced a PR disaster of high GPU failure rates.

Well, if one had a chance to compare the BOM for the EVGA Classy/MSI Lightning with the reference card, you'd know why they don't overbuild the reference cards with what I'd consider "useless" components/materials, since only a small minority would ever take advantage of that headroom. Plus, the stock cards would have gone through a lot of testing to confirm 100% that they will operate fine (as long as they remain within spec) in most use cases.

The GTX 480 isn't such a good comparison either, since the GPU itself required lots of power and they actually had to come up with better power circuitry and cooling for the card (I mean, we are talking about Fermi!). It was an exception. If they had done that for the GTX 680, it would have been a failure on the engineering team's part.

I don't mind nVIDIA removing voltage control, because not many people understand what kind of consequences moving those sliders can have. I do agree that they probably wanted to keep the premature GPU failure rate low given the history of the GTX 570/590 (the margins must have been so thin for a card like the GTX 590 that there was barely any overhead in its VRM design; they cut too many corners on an already difficult goal). But what I do have a problem with is nVIDIA forcing the AIBs not to provide voltage control for the enthusiast community. Unless, of course, nVIDIA has test data on core voltage limitations vs. various parameters that it supplied to its AIBs, which resulted in them enforcing the vcore limit of 1.175V.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Vote with your wallet if you don't like it.

Also, it's not entirely fair to compare a high-wattage part like the GTX 480 to the GTX 680. The GTX 480 was built on 40nm and was rushed out, compared to the GTX 580 (also on 40nm but with tamer wattage). The GTX 680 is a 28nm part with much lower wattage, so it needs much less cooling. This is a fairer comparison of cards with similar TDPs, and it shows that the GTX 670 has a downright pathetic heatsink: http://www.overclock.net/t/1256156/serious-cost-cutting-on-the-gtx-670s-heatsink

And as you noted, the stock GTX 680 cooling is pretty pathetic even considering the wattage difference between it and the GTX 480/580, and I think they went a little overboard with all their cost-cutting, from shrinking the PCB, to weaker and cheaper cooling, to skimping on memory bandwidth, etc. As was widely reported, and EVERYBODY should know this by now, the GK104 chip was originally intended to be midrange, not high-end.

And it shows.

Hopefully NV will allow voltage control in future GPUs that were intended to be high-end from the get-go.

Agreed on the GTX 480 comparison. I think the stock cooling is perfectly fine for the GTX 680, but that's mainly due to how power efficient the GK104 chip is. It's one of the quietest stock cards I've ever owned that used a blower design. But then you do have a point on the GTX 670s. It's not as bad as this, though :p

Shrinking the PCB doesn't necessarily make it "weaker". I'm not sure what you're referring to here, but the only disadvantage I can think of is maybe a lack of real estate for bigger components? If anything, you'd want shorter PCB traces, especially for the power stages. Cheaper stock cooling is another debatable topic. I'm surprised no one blames Intel for their stock coolers, each of which is god-awful for an enthusiast compared to what we get on video cards!

I guess they could have packed in a bigger vapor-chamber cooler with 5-6 phase VRMs and that nice software voltage control (it would've been a fantastic reference card), but if you were in their shoes looking at the data + cost numbers, you'd think otherwise.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Vote with one's wallet if over-volting is that important to a gamer. If AMD continues to support a dual BIOS and allows over-volting, strong scaling and impressive overclocks, there may be a competitive advantage and differentiation for AMD.

It seems that nVidia likes its GPU Boost, balance and performance efficiency, and may believe those strengths outweigh over-volting going forward.

Personally, I would like to see nVidia AIBs offer differentiation and OC/OV flexibility for their potential customers, but judging by its actions, nVidia feels really strongly about this.
 
Last edited:

The_Golden_Man

Senior member
Apr 7, 2012
816
1
0
My previous Gigabyte GTX 570 (reference design) burnt out because of a slight voltage increase. Reference GTX 570s had barely sufficient VRM circuitry for stock operation as it was, so I think my slight voltage adjustment from about 0.975V stock to 1.050V, with an 850MHz GPU overclock, killed it over time.

When I removed the cooler after the card was dead, I saw that one of the voltage regulators had burnt.

Now I have two ASUS GTX 670 DC II non-Top cards in SLI. Even at stock voltage my upper card can go well over 70°C, even with a custom fan profile. Even if I could raise the voltage, I would not do it. It would probably just give a few extra MHz for a lot of added heat. Not worth it for me.

So no, I don't miss voltage adjustments on my Nvidia cards. They are fast enough as they are. Also, I want them to last a few years.
 

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
8,124
3,063
146
I must say this rules out nvidia from any purchase for me. Big thumbs down.
 

bononos

Diamond Member
Aug 21, 2011
3,928
186
106
Can you reword this or correct the grammar so I can understand your points, please? I'm not sure what you are getting at.

Originally posted by bononos:
Yeah it sounds a lot better than just allowing everyone to bump up the voltage. What drove the OP to create this thread was his dissatisfaction stemming from a purchase based on a *review* of a special sample 7850.


There, I had left out the bolded word. I meant to say that the review sample was either a cherry-picked card or a pre-production card with a beefed-up design.