SweClockers: Geforce GTX 590 burns @ 772MHz & 1.025V

Feb 19, 2009
10,457
10
76
You are thinking that voltage is what is killing the VRMs, but it's not. It's the current that's killing them. By reducing the GPU's clock speed, you reduce the amount of current it draws.

That's what I was trying to get at. Lowering the core clock reduces power requirements, which in turn reduces the current the VRMs have to push out to the cores. They are rated at a 35A "safe" max per phase; that's current, not volts. They can handle varying voltage. The problem is that if you raise the vcore and overclock, you enable the GPUs to pull more amps from the VRMs than they can cope with, and that's why they blow up.

It all comes down to the electrons flowing. :)
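
To put rough numbers on it (a back-of-the-envelope sketch in Python; the wattage and voltage figures are my assumptions based on numbers quoted elsewhere in this thread, not measurements):

```python
# Rough sanity check: how close does one GF110 get to its VRM rating?
# Assumed figures: five 35 A phases per GPU (quoted later in the thread),
# ~160 W per GPU at a ~0.94 V stock vcore. Power scales roughly with V^2.

phases = 5
amps_per_phase = 35                    # A, "safe" max per inductor
vrm_limit = phases * amps_per_phase    # 175 A total per GPU

def core_current(power_w, vcore_v):
    """Current the VRM must deliver: I = P / V."""
    return power_w / vcore_v

print(core_current(160, 0.94))                    # ~170 A, already near 175 A
print(core_current(160 * (1.2 / 0.94)**2, 1.2))   # ~217 A, past the rating
```

If those assumptions are anywhere close, even a modest overvolt pushes the per-GPU current past the rating, which is exactly the failure mode being described.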

At least we can agree the reference design should not be overclocked by users, or it's at real risk of dying. The GTX 580 is a great GPU; we all know that. These GPUs deserve better PCBs to make them shine.

Edit: @previous post: the 6990 has been pushed to 1.4 V vcore and survived, because its components are high quality and there's far more capacity than the card needs. On an open benchtop, an overclocked 6990 measured 54 dB with the meter 30 cm away; an overclocked GTX 590 under the same conditions measured 49 dB. At 1 m away, inside a case, the 6990 was 46 dB. Its noise is overrated.
 
Last edited:

pcm81

Senior member
Mar 11, 2011
598
16
81
Actually, with the same load, raising the voltage would mean lower current needed (P=VI)

Except it actually works the other way around.

P = VI

V = IR

Increasing V increases I, because R is constant; as a result, P increases.

P = VI = V * (V/R) = V^2/R, so what you actually have here is a square relationship: increasing voltage by a factor of 2 increases power by a factor of 4. In reality, increasing V from 0.9 V to 1.0 V increases power by a factor of (1/0.9)^2 ≈ 1.23, i.e. about 23%.
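
A quick numeric check of that square law (just plugging the numbers above into Python; the resistance is a placeholder, since only the ratios matter):

```python
# Power scales with the square of voltage when R is fixed: P = V^2 / R.
R = 1.0  # placeholder resistance; it cancels out of the ratios below

def power(v, r=R):
    return v**2 / r

print(power(2.0) / power(1.0))   # 4.0 -- doubling V quadruples P
print(power(1.0) / power(0.9))   # ~1.235 -- 0.9 V -> 1.0 V is ~23% more power
```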


EDIT:
Theoretically, you could add a high-power resistor in series with the load from the power circuit, increasing the resistance of the card and driving down the current draw at higher voltages... but that may result in insufficient current supply to feed the GPUs, and if you're going to do that much modding, you might as well just change the power supply...
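
For the record, even a tiny series resistance is brutal at these currents (a rough illustration; the 170 A figure is an assumption carried over from the per-GPU estimate earlier in the thread):

```python
# Why a series resistor is a non-starter at GPU core currents.
i = 170.0          # A, assumed current to one GF110 core
r_series = 0.001   # ohms -- just one milliohm in series

v_drop = i * r_series        # V = I * R
p_wasted = i**2 * r_series   # P = I^2 * R, dissipated in the resistor

print(v_drop)    # 0.17 V dropped -- huge next to a ~0.9-1.0 V core voltage
print(p_wasted)  # ~29 W burned in the resistor alone
```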
 
Last edited:

nanaki333

Diamond Member
Sep 14, 2002
3,772
13
81
Well, I wish I'd kept up on this thread more... I bought a GTX 590 to replace 2x 6970 in CrossFire, since the performance difference was only ~15%, and in a small handful of cases the 590 was a little faster. This made me happy because I could retain roughly the same performance but gain the option of 3D with my Asus 120 Hz monitor. I just yanked my cards out this morning so I could put the new 590 in when I got home... guess I should box it up and send it back. The last thing I want is to leave a game paused, run out to check the mail really quick, and come back to a condo that's caught fire.

Usually you get performance enhancements with driver updates... maybe instead of keeping the fan so quiet, they should let it crank up and actually COOL the card!
 
Last edited by a moderator:

TerabyteX

Banned
Mar 14, 2011
92
1
0
Well, I wish I'd kept up on this thread more... I bought a GTX 590 to replace 2x 6970 in CrossFire, since the performance difference was only ~15%, and in a small handful of cases the 590 was a little faster. This made me happy because I could retain roughly the same performance but gain the option of 3D with my Asus 120 Hz monitor. I just yanked my cards out this morning so I could put the new 590 in when I got home... guess I should box it up and send it back. The last thing I want is to leave a game paused, run out to check the mail really quick, and come back to a condo that's caught fire.

Usually you get performance enhancements with driver updates... maybe instead of keeping the fan so quiet, they should let it crank up and actually COOL the card!

Replacing a pair of HD 6970s with a GTX 590?? :confused: :oops:
 

WaitingForNehalem

Platinum Member
Aug 24, 2008
2,497
0
71
It is funny how AMD's competitors, Intel and Nvidia, have both released defective products. AMD can now market "Ready, Willing, and Stable" with their graphics cards as well.
 

nanaki333

Diamond Member
Sep 14, 2002
3,772
13
81
Replacing a pair of HD 6970s with a GTX 590?? :confused: :oops:

As I said in my post, the performance was close enough and it would let me do 3D. The plan was to get a second one and do 3D Surround, but my distributor would only give me one for free :)
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
As I said in my post, the performance was close enough and it would let me do 3D. The plan was to get a second one and do 3D Surround, but my distributor would only give me one for free :)

You only need one GTX 590 to do 3D Surround. I play Crysis 2 every night in this fashion.
 
Feb 19, 2009
10,457
10
76
NV builds the 590 for Enthusiasts
crafted.jpg


AMD builds the 6990 for ??
engineering.jpg
 

nanaki333

Diamond Member
Sep 14, 2002
3,772
13
81
You only need one GTX 590 to do 3D Surround. I play Crysis 2 every night in this fashion.

Sweet. Then I won't have to bother anyone for a second one. I'm still going to try it out. Worst that can happen is it fries, I RMA it, and I go back to my CrossFire 6970s and use those until aftermarket coolers come out for the 590!
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I am an enthusiast; I am very enthusiastic about my tech... but I am not in the least bit interested in dual-GPU single cards.
Two separate video cards with one GPU each are better in every possible way.

I am also very enthusiastic about perf/$, noise levels, and heat levels.
Multi-monitor has those terrible nasty bezels ruining it, and 3D is a terrible gimmick requiring painful-to-use glasses... Oh, and PhysX is a joke.

The best thing I ever used was a projector with properly wired surround sound connected to a desktop computer. That was EPIC gaming right there... you haven't played Warcraft 3 / StarCraft until you've played it on a 150-inch screen.

Anyway, I went off on a bit of a tangential rant there... but the bottom line is, what kind of rich enthusiast is going to say, "Hmm, I want to spend $700 on a video card... I know, I will buy a dual-GPU card instead of two single cards, so that I get something inferior in every possible way"? Even if that person doesn't have a mobo with two PCIe x16 slots, they can afford to get one for the superior performance, given the amount of money they are investing.

The only point I see for these cards is bragging rights (for the company, not the card's owner), and for people who want to run 4 GPUs... and support for that is in the gutter.
 
Last edited:

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
I am an enthusiast; I am very enthusiastic about my tech... but I am not in the least bit interested in dual-GPU single cards.
Two separate video cards with one GPU each are better in every possible way.

I am also very enthusiastic about perf/$, noise levels, and heat levels.
Multi-monitor has those terrible nasty bezels ruining it, and 3D is a terrible gimmick requiring painful-to-use glasses... Oh, and PhysX is a joke.

The best thing I ever used was a projector with properly wired surround sound connected to a desktop computer. That was EPIC gaming right there... you haven't played Warcraft 3 / StarCraft until you've played it on a 150-inch screen.

Anyway, I went off on a bit of a tangential rant there... but the bottom line is, what kind of rich enthusiast is going to say, "Hmm, I want to spend $700 on a video card... I know, I will buy a dual-GPU card instead of two single cards, so that I get something inferior in every possible way"? Even if that person doesn't have a mobo with two PCIe x16 slots, they can afford to get one for the superior performance, given the amount of money they are investing.

The only point I see for these cards is bragging rights (for the company, not the card's owner), and for people who want to run 4 GPUs... and support for that is in the gutter.

And the best thing about using projectors in surround? No bezels.
 

Morg.

Senior member
Mar 18, 2011
242
0
0
Yeah, he says the voltage circuitry on the 6990 is more suitable for welding than for rendering graphics.

AMD provides each Cayman GPU with 4x 80 A inductors, while Nvidia gives each GF110 GPU 5x 35 A inductors.

Well... that's quite a statement.
If those Volterra inductors peak at 320 W (at 1 V), I believe even they *can* be limiting in a hard OC.
However, even in that case, it still means you have roughly a 60% margin over the card's power consumption, quite a difference from the 175 W (again at 1 V; just guessing here that those parts peak more or less at the stock voltage) on the GF110, which actually represents roughly a -50% margin.
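
Spelling that arithmetic out (a rough sketch; the per-GPU draw figures are my guesses, picked only to reproduce the margins quoted above):

```python
# Back-of-the-envelope per-GPU VRM headroom, using this thread's figures.
# Ratings assume ~1 V output; margin = (vrm_limit - gpu_draw) / gpu_draw.

def margin(vrm_limit_w, gpu_draw_w):
    return (vrm_limit_w - gpu_draw_w) / gpu_draw_w

cayman_limit = 4 * 80 * 1.0   # 320 W per Cayman (4x 80 A phases at 1 V)
gf110_limit  = 5 * 35 * 1.0   # 175 W per GF110 (5x 35 A phases at 1 V)

# Assumed draws, chosen to match the quoted margins:
print(margin(cayman_limit, 200))   # +0.60 -> the ~60% margin on the 6990
print(margin(gf110_limit, 350))    # -0.50 -> the ~-50% margin on the GF110
```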
 

Timorous

Golden Member
Oct 27, 2008
1,989
3,918
136
This is VERY relevant to the current discussion.

Personally, I don't see self-induced throttling as a bad thing; it's happening for a good reason: to protect the hardware.

But the fact that it happens does not mean it is desirable from an end-user standpoint.

If I bought any piece of hardware and it systematically throttled under my usage scenarios, I'd be looking to the manufacturer to build a more robust product.

When people's AM2 mobos started dying from those 140 W TDP Phenom CPUs (the mobos weren't designed to work with CPUs above 125 W TDP), the issue was definitely real and problematic for the end-user.

It would appear that both the HD6990 and GTX590 could stand to use an even more robust PCB design (the GTX590 more so) such that the headroom is elevated above its current levels.

I'd be really surprised if AMD and Nvidia aren't learning from this product cycle, and future products are going to be all the more robust (which also means more expensive; there is a trade-off in all this).

There is a very important difference here, though: the 6990 is downclocking to stay within a set power envelope rather than to protect itself from harm. The fact that you see stable overclocks at or above 6970 speeds shows that the PCB, the PWM system, and the GPUs themselves can handle the increased power demands without bricking themselves.
 
Feb 19, 2009
10,457
10
76
The 6990 runs at 1.1 V stock, I think; with all its Volterra VRMs it can deliver ~700 W total (the 320 W @ 1 V figure is per core), well beyond anything the card uses, even overclocked to the max. It's an overbuilt piece of engineering... suitable for welding because of its extreme robustness. :D

Edit: The 6000 series adapts power use and throttles to fit your PowerTune setting. If you set it at -20%, it will downclock. If you set it at +20%, it can handle any beefy OC without downclocking. AMD gives you the choice and builds the hardware to make that choice safe.
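
To make that behavior concrete, here's a toy model of a PowerTune-style cap (purely illustrative; this is not AMD's actual algorithm, and the 375 W budget is my assumption):

```python
# Toy PowerTune-style power cap: clocks drop only when estimated board
# power exceeds the user-set budget. Illustrative numbers only.

BASE_BUDGET = 375.0   # W, assumed board power budget at the 0% setting

def effective_clock(base_clock_mhz, est_power_w, powertune_pct):
    budget = BASE_BUDGET * (1 + powertune_pct / 100.0)
    if est_power_w <= budget:
        return base_clock_mhz                       # within budget: full speed
    return base_clock_mhz * (budget / est_power_w)  # over budget: scale down

print(effective_clock(830, 420, -20))  # heavy load at -20%: downclocks
print(effective_clock(830, 420, +20))  # same load at +20%: full 830 MHz
```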
 

JimmiG

Platinum Member
Feb 24, 2005
2,024
112
106
It was a combo of way too much voltage and a beta driver not intended for use with this card.

“The few press reports on GTX 590 boards dying were caused by overvoltaging to unsafe levels (as high as 1.2V vs. default voltage of 0.91 to 0.96V), and using older drivers that have lower levels of overcurrent protection. Rest assured that GTX 590 operates reliably at default voltages, and our 267.84 launch drivers provide even more additional levels of protection for overclockers. For more information on overclocking and overcurrent protection on GTX 590 please see our knowledge base article here: http://nvidia.custhelp.com/cgi-bin/n...p_faqid=2947.”

Overcurrent protection shouldn't be handled by the drivers. Drivers can crash, contain bugs, otherwise fail to communicate with the card, or simply not react quickly enough. What if you have an unstable overclock on your CPU, and it fails just when the video card desperately needs current reduction? You might also want to use your card with a tweaked or third-party driver, and/or a different OS like Linux where someone screwed up this particular feature.

It should be handled by the hardware itself at the lowest level possible.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
1.025 V @ 772 MHz is, though. That's the voltage setting used in the video in the OP.

Using drivers 267.52
whoops.jpg


The press-release drivers were 267.71, as per Ryan Smith here at AT.
Like I said, the problem lies with using the pre-press drivers.

And what bench were they running? Methinks FurMark; that's what most use to check stability, or MSI Afterburner, which has FurMark sort of built in. Someone should ask them. It would have been cool if they had shown the screen while running the video.
 
Last edited:

Dudler

Junior Member
Mar 20, 2009
18
0
66
267.52 was the driver included in the box with the card.

What do you mean by "pre-press"?