Nvidia is back at it again with Afterburner voltage locking.

utahraptor

Golden Member
Apr 26, 2004
1,052
199
106
The new 2.2.4 removes voltage control from all cards at Nvidia's demand. Unwinder did reveal a config-file edit to fix this, but I have also heard that the new Lightning cards are shipping with crippled BIOSes and the old versions do not work with them. I can't confirm this.

http://forums.guru3d.com/showthread.php?t=368609
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Just tried voltage adjustment with my MSI Power Edition GTX 670, and 2.2.4 allows it.
 

tweakboy

Diamond Member
Jan 3, 2010
9,517
2
81
www.hammiestudios.com
What card have you got? It works fine for SirPauly and, I'd bet, for thousands of other people.

Go back to 2.2.3

gl

Use CrystalDiskInfo and tell us your SSD percentage health?
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Voltage control works fine. Unwinder has included a profile edit that restores voltage control for MSI Lightning and Power Edition cards. The BIOS bit is also not true, and even if it were, you could simply flash back to the old one. The latest BIOS for the Lightning is 3A and it allows full voltage control.
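
For reference, flashing back is normally done with NVIDIA's nvflash utility from a command prompt. A rough outline only (lightning_old.rom is a placeholder name for the old BIOS image, and depending on the card you may need additional override switches):

nvflash --save backup.rom (back up the BIOS currently on the card first)
nvflash lightning_old.rom (flash the old image, then reboot)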

On 2.2.4 you may edit Lightning hardware profile files (.\Profiles\VEN_10DE&DEV_1180....cfg) and add the following lines there:

[Settings]
VDDC_Generic_Detection = 0
VDDC_CHL8318_Detection = 46h
VDDC_CHL8318_Type = 1

I am using this profile myself on Lightning 680s and can set any voltage up to nearly 1.4V.
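
If you'd rather script the edit than do it by hand, here's a minimal sketch in Python. The Profiles folder below is an assumed default install path, and the exact .cfg filename depends on your card, so point it at the profile Afterburner actually generated (and back it up first):

# Minimal sketch: append Unwinder's override lines to an Afterburner
# hardware profile. Assumed default install path; the glob picks the
# first VEN_10DE&DEV_1180... profile Afterburner created for the card.
from pathlib import Path

profiles = Path(r"C:\Program Files (x86)\MSI Afterburner\Profiles")
profile = next(profiles.glob("VEN_10DE&DEV_1180*.cfg"))

override = "\n".join([
    "",
    "[Settings]",
    "VDDC_Generic_Detection = 0",
    "VDDC_CHL8318_Detection = 46h",
    "VDDC_CHL8318_Type = 1",
    "",
])

text = profile.read_text()
if "VDDC_CHL8318_Detection" not in text:  # don't append the lines twice
    # Note: if the file already has a [Settings] section, put the three
    # keys under it rather than appending a duplicate section header.
    profile.write_text(text + override)

Then restart Afterburner so it re-reads the profile.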

I think Nvidia is pretty stupid to apply this to any and all cards, but at least Unwinder is giving us a workaround. The design parameters (voltage) of a reference card should not apply to a card like the Lightning. What a Lightning can do, a reference card cannot, due to hardware differences. I fail to see why Nvidia can't realize this.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
Nvidia does not control AB.

In fact, other models have to jump through more hoops: moving .dlls around and altering config files. Standard sort of stuff to keep OC noobs from destroying their cards.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Nvidia does not control AB.

In fact, other models have to jump through more hoops: moving .dlls around and altering config files. Standard sort of stuff to keep OC noobs from destroying their cards.

Not true. Unwinder and MSI were forced to modify Afterburner because of Nvidia. Nvidia threatened MSI, and they had no choice but to change Afterburner. It's all on Guru3D.

So in this case it was Nvidia that forced their hand. However, Unwinder gave us a workaround, so the net effect is: who cares? We still get voltage control. It's all gravy.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
I read it as MSI having to use a specific BIOS on their cards, like the other manufacturers.

From the Guru3D web site:

While I was writing this article I learned that NVIDIA just issued new BIOS files to the AIC partners and is frowning upon voltage tweaking outside their limitations. As such, all new batches of Lightning cards will have BIOSes where their limit of 1.175V is enforced, even in the LN2 BIOS. MSI has to follow that directive or probably face the fact that they will not be able to purchase the GPUs anymore.

That means that only the first batch of 5000 cards will have an OLD Bios that is freed up from the limitation and thus allows voltage tweaking to a certain extent. We can only assume that the old BIOS will spread like a virus to current Lightning owners to give them a little more flexibility on voltage tweaking matters.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I read that as well. As it turns out, the latest BIOS for the Lightning is 3A and it still allows voltage control through software; the BIOS restriction was speculation and did not actually happen.

Anyway... crisis averted, we still have voltage ;)
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
Why would Nvidia control voltage? I'm not denying they do, but why would they? To protect the cards from failure? To keep the cards in the performance bracket they want them in? To prevent the GK110 from looking like a waste of money when released?
 

futurefields

Diamond Member
Jun 2, 2012
6,471
32
91
Why would Nvidia control voltage? I'm not denying they do, but why would they? To protect the cards from failure? To keep the cards in the performance bracket they want them in? To prevent the GK110 from looking like a waste of money when released?

Your second guess makes the most sense from a fiscal standpoint. To keep the cards within a performance bracket.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Why would Nvidia control voltage? I'm not denying they do, but why would they? To protect the cards from failure?
Bingo. Look at how much grief voltage control has caused them. On the one hand you have idiots breaking cards by applying too much voltage and then complaining about it publicly as if it were NVIDIA's fault (e.g. GTX 590), and on the other hand you have partners like Zotac who, in their chase for factory overclocks, would ship their cards at uncomfortably high voltages for a retail product.

NVIDIA isn't vertically integrated, but they're not horizontal either. So protecting the GeForce brand has come to mean a great deal to them.
 

paperfist

Diamond Member
Nov 30, 2000
6,517
280
126
www.the-teh.com
Why would Nvidia control voltage? I'm not denying they do, but why would they? To protect the cards from failure? To keep the cards in the performance bracket they want them in? To prevent the GK110 from looking like a waste of money when released?

Apple is successful for a reason. It may not apply in all cases and it doesn't always make sense, but when you limit things to prevent people who don't know what they're doing from causing damage, it's a good thing (for nVidia). This is the internet era; just look at Newegg's reviews and wonder how many of the 1-2 egg ratings are due to users FUBARing a product they bought.
 

DominionSeraph

Diamond Member
Jul 22, 2009
8,391
31
91
Your second guess makes the most sense from a fiscal standpoint. To keep the cards within a performance bracket.

No, the first does.
Warranty claims and the threat of "LOL housefire Kepler" are a far bigger risk than a few people unlocking 2% more OC headroom.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Bingo. Look at how much grief voltage control has caused them. On the one hand you have idiots breaking cards by applying too much voltage and then complaining about it publicly as if it were NVIDIA's fault (e.g. GTX 590), and on the other hand you have partners like Zotac who, in their chase for factory overclocks, would ship their cards at uncomfortably high voltages for a retail product.

NVIDIA isn't vertically integrated, but they're not horizontal either. So protecting the GeForce brand has come to mean a great deal to them.

Yep, it's too bad for enthusiasts who like to tinker. I suspect that if it actually succeeds in reducing RMA and warranty work, AMD will eventually follow suit. Why cater to 3% of your customers when you can save 5-10% in costs with something this easy to implement?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Even if AMD blocks voltage control, as long as they retain dual-BIOS switches it doesn't matter. The modding community will just release voltage-modded BIOSes and you'll be able to flash your card safely with them. Neither NV nor AMD ever clamped voltage control like this before. It's doubtful that RMA and warranty costs are the main reason; why weren't they an issue for the 10 years when NV and ATI/AMD allowed voltage control? Also, why would RMA not be an issue for CPUs? This sounds more complicated than that. Look at the GTX660Ti/670/680: if you had voltage control on the 660Ti and could raise it to 1.3V, it might hit 1400mhz on air. All of a sudden the entire GTX670/680 lineup is in question. I am sure NV likes collecting those $100 premiums between these 3 tightly squeezed cards. It was much easier to differentiate the GTX570 since it has 1.28GB of VRAM; the 670 is much closer in performance to the 680 than the 570 was to the 580. NV could have clamped voltage control to avoid a situation where AIBs release GTX670 cards way faster than 680 cards.

There could be another reason: Kepler's sophisticated GPU Boost, plus the dynamic voltage adjustment that supports it, makes overclocking a lot more complicated and dangerous. Yet another reason is that Kepler feels much like IVB: a part made to make $, with cost savings around the edges. It doesn't seem at all like NV built high-quality reference cards this round, definitely nowhere near the quality of the 470/480s. I mean, the GTX660Ti/670 reference cards look like $150 GPUs at best: flaky PCBs/VRMs and terrible fans. Sounds like budget cost cutting. Perhaps NV's PCB/power circuitry design is so bare-bones this round that they feared large RMA/warranty costs if they allowed voltage control on the entire line.

AMD's current GPU boost implementation is a lot simpler: the GPU just cranks the voltage to the max and the clocks to the max, so AMD doesn't have to worry about the complexity of dynamic GPU voltage. If AMD makes a more sophisticated version of GPU boost that involves dynamic voltages, they could also follow suit. However, considering AMD was the one who brought in dual-BIOS switches, and has Black Edition CPUs, I expect them to retain manual voltage control support simply because it would be a key competitive advantage. The fact that they brought in dual-BIOS switches shows they are giving users free rein with BIOS flashing, overclocking, etc.

I still think the main reasons are the dynamic GPU Boost / voltage regulation on Kepler and the tight integration of the GK104 product stack, with all cards having 2GB of VRAM. NV probably saw how AMD flopped by launching the HD6950 that unlocked, which cost them a ton of HD6970 sales. Clamping down on overclocking ensures people still pay the premium for the 680.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I think a big reason for it is that Nvidia pushed GK104 hard (in terms of clockspeed/GPU Boost) out of the gate in order to fare well against the already released 7970. We can never be sure, but I believe this is the case. I see this as the reason why Nvidia wants voltage control locked down: because the card is pretty close to its potential with GPU Boost alone.

AMD has never had any type of fiasco like 590s blowing up or drivers causing cards to overheat (and I fully realize that this is the fault of users, at least the 590 part), so while it's always possible for AMD to follow suit, I don't see it as likely. They've actually encouraged overclocking quite a bit with their cards, and in contrast to Nvidia they haven't had any incidents with damaged cards. Who knows, though.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
I think a big reason for it is that Nvidia pushed GK104 hard (in terms of clockspeed/GPU Boost) out of the gate in order to fare well against the already released 7970.

People keep saying this but I don't buy it at all. Firstly, GK104s run at 1.175 volts in turbo. That isn't crazy high. Further, it's more efficient than Tahiti in perf/watt; if it were being pushed harder than any version of Tahiti, it wouldn't be more efficient. The perf/watt gains Nvidia made from 40nm to 28nm also don't square with the idea that GK104 is being pushed hard. And, here's the real proof: my card reaches 1350mhz with voltage adjustment. IMO that is PLENTY of core clock headroom, just as much as Tahiti has when voltage adjustment is allowed.

If anything, I think they clocked the card as high as they could within the constraints of a 256-bit memory bus before getting severely diminished returns on performance versus ramped-up power use.

Nvidia has a business to run. They can't just say "CLOCK IT AS HIGH AS IT WILL GO AND SHIP IT!" They have to find the sweet spot of yields and functionality or they simply won't have enough product to sell.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Even if AMD blocks voltage control, as long as they retain dual-BIOS switches it doesn't matter.

Well, having a dual BIOS would kind of defeat the point of locking out voltage control, wouldn't it?

I'll say it again: if Nvidia locked out voltage control for RMA reasons, and they do end up saving money with fewer RMAs, then AMD would be financially foolish not to follow suit. If Nvidia did it because Kepler is bandwidth-strapped up and down its entire lineup and overclocking beyond a certain point no longer increases performance, then there's hope that voltage control may return again someday.