Tech Report: Nvidia, Asus put the clamps on GTX 590 voltage


SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Regardless of the number, whether it's 1 or 100 @ x amps, it would appear it wasn't enough on the GTX590 for adding voltage and overclocking. ;)
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
Yes. I was mistaken. My apologies.

No no. I apologize for sounding too combative. :oops:

But I feel so much disinformation and even some straightforward lies have been said, while even basic things get lost in the heat.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
No no. I apologize for sounding too combative. :oops:

But I feel so much disinformation and even some straightforward lies have been said, while even basic things get lost in the heat.

I agree there are some people who do that here, but this site did some nice extreme overclocks on both the 590 and the 6990 by strapping Thermalright Ultra Extremes onto each GPU.

It's in Romanian, so it might be hard to understand even with Google Translate.

GTX590
http://lab501.ro/placi-video/nvidia-geforce-gtx-590-studiu-de-overclocking

HD 6990
http://lab501.ro/placi-video/asus-radeon-hd-6990-studiu-de-overclocking
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
I was on that site many times these last couple of days. Read all the 6990/590 articles several times, and would have soooo much to ask those guys.

BTW Romanian reviewers = 3 dead cards. Rest of Europe = 6-7.

Rest of the world = ZERO

Correction ... 1 Middle Eastern/Asian site is mentioned also.
 

Firestorm007

Senior member
Dec 9, 2010
396
1
0
I thought this was funny.

 

thilanliyan

Lifer
Jun 21, 2005
12,060
2,273
126
I don't exactly know when programmable VRMs were first introduced on video cards, but it wasn't until Fermi/Cypress that they became well known.

Not true. I first tried software voltage control on an X1800XL; enthusiasts knew about it at that time. Just to be clear, I don't think non-enthusiasts know about voltage control at all, even with Fermi/Cypress. I believe the X1XX cards from ATI were the first to make it "mainstream" for enthusiasts (i.e. pure software, no hard mods).

The fact that it can pull as much power as 580 SLI and still not fail is quite impressive.

It is faster there, but I DOUBT the 6990, even at 972 core, would be pulling as much power as 580 SLI. It would be literally impossible to cool it on the stock cooler.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
"I don't see any throttling here"
You would have to see a graph of results for clock speeds in between ,
'TO SEE IT'
Looking at those results VS the 6990@830mhz it took a 140mhz overclock to gain 5fps.
Whats that 17% o/c for 10% gain ?
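
As a quick sanity check on that scaling claim, here is a minimal sketch; the ~50 FPS baseline is inferred from 5 FPS being called a ~10% gain, not taken from the review:

```python
# Back-of-the-envelope check of OC scaling for the 6990 numbers quoted above.
# Assumed baseline: ~50 FPS at 830 MHz (inferred, not from the review).
base_mhz, oc_mhz = 830, 830 + 140
base_fps, oc_fps = 50.0, 50.0 + 5.0

clock_gain = (oc_mhz - base_mhz) / base_mhz   # ~0.169 -> the "17% o/c"
fps_gain = (oc_fps - base_fps) / base_fps     # 0.10   -> the "10% gain"

# Scaling well below 1.0 suggests the extra clock isn't being fully
# converted into frames (throttling, or some other bottleneck).
print(f"{clock_gain:.1%} OC -> {fps_gain:.1%} gain, "
      f"scaling {fps_gain / clock_gain:.2f}")
```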

From the same review, the final conclusion:
Not sure how the 590 can be a fail when it's a draw, even if you can get another 10% out of the 6990.
[Image: GTX-590-82.jpg — the review's final performance chart]
 
Feb 19, 2009
10,457
10
76
I don't even know why you guys are even debating the merits of the 590 reference PCB. It's woefully inadequate. It has 10x 35A VRMs at 80% efficiency for both GPUs. At its stock vcore, both GPUs have ~300W max before the VRMs get into the stress zone. It's clear why NV has released drivers to prevent vcore manipulation: running these components well beyond spec (they are already beyond spec at stock) when OCed further will shorten their lifespan.

The 6990 reference PCB is over-engineered. It's overpowered. Enthusiasts who decide to go water cooling can overvolt and get a ridiculous OC to 1.1GHz. Even on air it can reach 1GHz (noisy, but stable, and it won't explode). Each GPU on the 6990 has ~the same VRM capability as both GPUs on the 590: 4x 80A Volterra VRMs per GPU.
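
For a rough sense of the gap being described in the last two paragraphs, here is a minimal sketch of the per-GPU VRM capability implied by those figures; the phase counts/ratings and the 6990 stock vcore are assumptions taken from this thread, not datasheet-verified numbers:

```python
# Per-GPU VRM capability implied by the figures quoted in this thread.
# Phase counts/ratings and vcore values are assumptions from the posts
# above, not datasheet-verified numbers.

def vrm_capability(phases: int, amps_per_phase: float, vcore: float):
    """Return (max continuous current in A, max output power in W) for one GPU."""
    current = phases * amps_per_phase
    return current, current * vcore

# GTX 590: 5 of its 10 phases serve each GPU, 35 A per phase, ~0.938 V stock
gtx590_a, gtx590_w = vrm_capability(phases=5, amps_per_phase=35, vcore=0.938)

# HD 6990: 4 phases per GPU; 80 A is the inductor's peak rating (assumed here
# to be the per-phase limit); ~1.12 V stock vcore assumed for Cayman
hd6990_a, hd6990_w = vrm_capability(phases=4, amps_per_phase=80, vcore=1.12)

print(f"GTX 590, per GPU: {gtx590_a} A -> ~{gtx590_w:.0f} W")  # 175 A -> ~164 W
print(f"HD 6990, per GPU: {hd6990_a} A -> ~{hd6990_w:.0f} W")  # 320 A -> ~358 W
# One 6990 GPU's 320 A is close to the 350 A the 590's VRMs can supply to
# BOTH GPUs combined, which is the comparison being made above.
```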

As for the 590 being similar to the 6990 in performance: no. Minus 1080p res, minus older games like COD, ME or HAWX where both cards get 200+ fps, the 6990 is >10% faster at stock. You can get reviews to skew the results how you like, but ultimately the performance in newer, demanding DX11 games is all that matters when these cards are $700. Who really cares about a $700 GPU doing 250 fps vs 300 fps in old games?? Even with SSAA in ME2, once it's enabled, the huge fps lead the 590 has becomes insignificant.

I am looking forward to the next Steam survey update to see how well either card has penetrated the market.
 
Feb 19, 2009
10,457
10
76
Not true. I first tried software voltage control on an X1800XL; enthusiasts knew about it at that time. Just to be clear, I don't think non-enthusiasts know about voltage control at all, even with Fermi/Cypress. I believe the X1XX cards from ATI were the first to make it "mainstream" for enthusiasts (i.e. pure software, no hard mods).

It is faster there, but I DOUBT the 6990, even at 972 core, would be pulling as much power as 580 SLI. It would be literally impossible to cool it on the stock cooler.

Vcore software control has been around for a long, long time. I used some 3rd-party software to up the voltage on the Rendition-based Diamond Stealth 3D. That GPU was so advanced for its time; it came with hardware anisotropic filtering and AA. It wasn't popular because the Voodoo 1 got 30 fps in Quake while it only got ~20 fps and was considered unplayable. I overclocked it and got 30 fps while still enjoying crisp textures and fewer jaggies. :D There were also many good times to be had in Jedi Knight; amazing graphics with the Rendition GPU.

http://www.anandtech.com/show/29/3

I manually added some copper heatsinks to the core and RAM to get some nice clock boosts.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106


This is throttling due to GPU temps. It's not at all the same thing nVidia is doing to the 590, which is what I said. I can't remember there ever being a card that has had the same issues as the 590. [sarc] Of course there is nothing wrong with the 590; it's all user error and a giant conspiracy. [/sarc]

Unfortunately, if you want the type of performance the 590 had the potential to offer, you are going to have to fork over a grand and use 4 slots on your mobo. You are going to have to buy 2x 580s. End of story, because this card is EPIC FAIL.
 

PingviN

Golden Member
Nov 3, 2009
1,848
13
81
Even if you had a 590, do you still think you should/would want to be able to? :cool:

Yes and no. Yes, as in I would like to overclock it; but no, as in I don't want to overclock a poorly engineered card that might burn up if I do.

Shutting down overvolting was a good call from Nvidia, but that doesn't change the fact that going cheap on VRMs was truly lame. And I mean LAME.

Other than that, there is no reason whatsoever not to give graphics cards and processors a good run for their money. If I can overvolt and OC them, I will, because:

A: They go faster
B: I like overclocking
C: They go faster

And no, "stock-fast" is never fast enough. And I don't get how anyone can defend Nvidia going cheap on components. If the HD6990 (the competition) can do it, so should GTX590. I don't see any reason whatsoever to pick the crippled 590 over Antilles.

I don't disagree, and I'm one of many here who OC/OV their hardware, and I have said many times now that the REFERENCE GTX590 is not for the big boys.

But we are not the only customers of cards like the HD6990 and GTX590; there are people who don't OC/OV and still buy the high-end products.

Just because a product doesn't OC or OV like we wanted doesn't make it of lower quality or not properly engineered. If it doesn't pass your standards then don't buy it; move on and get whatever you think best suits your needs.

No, the REFERENCE GTX590 is not, but the REFERENCE HD6990 is. Less OC-happy consumers would not have suffered if Nvidia had done it properly; overclockers, however, do suffer when Nvidia goes cheap on components. This card is of lower quality (VRM-wise) than the competition and even other cards from Nvidia.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
I don't even know why you guys are even debating the merits of the 590 reference PCB. It's woefully inadequate. It has 10x 35A VRMs at 80% efficiency for both GPUs. At its stock vcore, both GPUs have ~300W max before the VRMs get into the stress zone. It's clear why NV has released drivers to prevent vcore manipulation: running these components well beyond spec (they are already beyond spec at stock) when OCed further will shorten their lifespan.

http://html.alldatasheet.com/html-pdf/312873/INFINEON/TDA21211/5468/14/TDA21211.html

At the default voltage of 0.938V, each MOSFET will give 35A * 0.938V = 32.83W; times 5 MOSFETs (per GPU) = 164.15W (I'm not taking the efficiency into consideration, in order to keep it simple). That is the maximum power each GPU can draw.

Total wattage will be 2 GPUs x 164W = ~328W.

Now, since the MOSFET can output more than 1V (up to 1.6V VOUT), at 1.0V the MOSFETs will give 10x 35A x 1.0V = 350W.

At 1.05V they will give 10x 35A x 1.05V = 367.5W.
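
The same arithmetic as a quick script (a sketch only; efficiency is deliberately ignored, as above):

```python
# The arithmetic above: total output capability of the GTX 590's
# 10x 35 A MOSFETs at several output voltages (efficiency ignored).
PHASES, AMPS_PER_PHASE = 10, 35   # 10 MOSFETs total, 5 per GPU

for vout in (0.938, 1.00, 1.05):
    per_mosfet = AMPS_PER_PHASE * vout        # W per MOSFET
    per_gpu = per_mosfet * (PHASES // 2)      # 5 MOSFETs per GPU
    total = per_mosfet * PHASES
    print(f"{vout:.3f} V: {per_mosfet:6.2f} W/MOSFET, "
          f"{per_gpu:6.2f} W/GPU, {total:5.1f} W total")
# 0.938 V:  32.83 W/MOSFET, 164.15 W/GPU, 328.3 W total
# 1.000 V:  35.00 W/MOSFET, 175.00 W/GPU, 350.0 W total
# 1.050 V:  36.75 W/MOSFET, 183.75 W/GPU, 367.5 W total
```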

From Figure 9 on page 14 of the DrMOS (MOSFET) datasheet we can see that efficiency goes up the higher the voltage.

So where do you see that the VRM implementation is beyond spec at stock (default) voltage??

The 6990 reference PCB is over-engineered. It's overpowered. Enthusiasts who decide to go water cooling can overvolt and get a ridiculous OC to 1.1GHz. Even on air it can reach 1GHz (noisy, but stable, and it won't explode). Each GPU on the 6990 has ~the same VRM capability as both GPUs on the 590: 4x 80A Volterra VRMs per GPU.

Are you sure about that?? Because I can't find what MOSFET the HD6990 uses.

http://www.cooperbussmann.com/pdf/6fa9cdf3-08ff-48ee-b782-8367f81da5c9.pdf

If you're referring to the CLA1108-4-50RT-R, that is an inductor, not a MOSFET; and by the way, it is a COOPER BUSSMANN inductor licensed for use with a Volterra voltage regulator.

We do know that the HD6990 VRM implementation has two Volterra VT1556 voltage controllers on the back of the card and two COOPER BUSSMANN CLA1108-4-50RT-R inductors (chokes), one per GPU, with 4 MOSFETs per inductor, but we don't have any specs for the MOSFETs.

So we do know that the inductor can take a maximum peak current of 80A, but we really don't have a clue what the MOSFETs can deliver, and therefore we don't know the power each GPU chip can draw.