SweClockers: Geforce GTX 590 burns @ 772MHz & 1.025V

Page 16 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
Not what I asked.

What is unfortunate here is that you offer nothing to support your view beyond a link to one editorial from one website and its findings from testing a single card, plus accusations that reviewers and end-users who have had cards blow up are conspiring or lying.

You offer nothing and ask a lot of questions; you have little to no ground to stand on. Accusations of lying are not much of a foundation when users have gone so far as to post pictures of their failures and YouTube videos of the failure occurring, and credible review sites have described the 590 failures they experienced.

And in the face of that you have.. accusations of conspiracies and lying.



Review sites with dead exploding GTX 590s:

http://www.xtremesystems.org/forums/...9&postcount=37

Two cards dead for these reviewers
http://translate.google.ca/translate?hl=en&sl=ro&tl=en&u=http%3A%2F%2Flab501.ro%2Fplaci-video%2Fnvidia-geforce-gtx-590-studiu-de-overclocking%2F12

Two cards dead for these reviewers as well
http://www.sweclockers.com/artikel/1...boven-i-dramat
http://www.youtube.com/watch?v=sRo-1VFMcbc

http://www.techpowerup.com/reviews/A...TX_590/27.html

http://tbreak.com/tech/2011/03/zotac-gtx-590-review/3/

http://www.hardware.fr/articles/825-...e-gtx-590.html

8 dead exploding GTX 590s in the hands of reviewers.


End-users with dead exploding GTX 590s:

1.
http://www.xtremesystems.org/forums/showpost.php?p=4798645&postcount=295

5573788233_7113fcf29f_b.jpg



2.
http://translate.google.com/transla...ulletin/showpost.php?p=3960849&postcount=352'

quemado.jpg



3.
http://forums.overclockers.co.uk/showpost.php?p=18781867&postcount=140

29032011018.jpg



4.
http://hardforum.com/showpost.php?p=1037046423&postcount=170
http://hardforum.com/showpost.php?p=1037047539&postcount=180


All this and the card has not even been out for a week.


And against all of this, you have a link to one website's opinion piece stating that their review card was OK, plus your accusations and paranoia that all of the above instances of GTX 590s exploding in the same manner are a conspiracy.

?

So you have nothing tangible at all. Gotcha.
 

thilanliyan

Lifer
Jun 21, 2005
12,085
2,281
126
I agree. I would think that would be mandatory by now, but I guess not. I'm no electronics expert, but I don't think it would be hard to implement.

Don't most of their newer cards use a hardware solution? Why would they go to a software solution for this card of all cards? Or is it actually that the software controlling the hardware was faulty?
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
is it actually that the software controlling the hardware was faulty?

I'd say they designed the card with weak VRMs, a cheaper off-brand compared to Volterra.
Volterra is what's on the GTX280 and HD5870, among other great cards.

So they used this weak generic hardware on the PCB. (I'm not knocking the GF110, as it's by far the best GPU, and the GTX580 is an excellent card.) They also used barely enough of them to scrape by. In instances where the GPU is pulling higher wattage, the voltage circuitry on the GTX590 is reaching 80-90% of its rated throughput.

The lab501 Romanian article explains it nicely. AMD chose PCB components for their reference design (slave/master VRMs, caps, MOSFETs & inductors) that are of high quality and high efficiency, while Nvidia chose bargain-basement crap. It's like comparing a Corsair HX1000 PSU to a no-name $35 800-watt PSU. AMD's Cayman GPU pulls less wattage than Nvidia's GF110 GPU. Ironically, Nvidia gives each GF110 five cheap-o 35-amp inductors on the GTX590 PCB, while AMD provides each Cayman with four high-quality 80-amp inductors on the HD6990 PCB. That totals 5x35 = 175 "Kia Sephia" amps on the GTX 590 compared to 4x80 = 320 "Dodge Ram Heavy Duty 3500 dually" amps on the HD6990.

When either the 590 or the 6990 starts to pull 350, 400, 450, or 500 watts (via overclocking, overvolting, maximum or peak power surging, or even the drivers not properly controlling throttling or OCP), the 6990 is a lot more likely to handle the heat & power than the 590.

The 6990 may throttle too, but it's a lot less likely to go BOOM!
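Tempered81's inductor math above can be turned into a rough headroom estimate. The phase counts and per-phase ampere ratings come from the post; the per-GPU core power figures and the 6990's core voltage are assumed values for illustration only, not measurements:

```python
# Rough VRM headroom sketch (illustrative numbers only).
# Phase counts and amp ratings are from the post above; the per-GPU
# core power figures and the 6990 voltage are assumptions.

def vrm_utilization(core_power_w, core_voltage_v, phases, amps_per_phase):
    """Fraction of the VRM's combined current rating in use."""
    current = core_power_w / core_voltage_v          # I = P / V
    return current / (phases * amps_per_phase)

# GTX 590: 5 phases x 35 A per GF110, 0.938 V at the stock 607 MHz.
# Assume roughly 140 W of core power per GPU under a heavy 3D load.
gtx590 = vrm_utilization(140, 0.938, 5, 35)   # lands in the 80-90% band

# HD 6990: 4 phases x 80 A per Cayman, ~1.12 V core (assumed).
# Assume roughly 150 W of core power per GPU.
hd6990 = vrm_utilization(150, 1.12, 4, 80)    # well under half its rating

print(f"GTX 590 VRM load: {gtx590:.0%}")
print(f"HD 6990 VRM load: {hd6990:.0%}")
```

With those assumed loads the 590's phases sit around 85% of their combined rating, matching the "80-90% of its rated throughput" figure above, while the 6990's sit around 40%.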
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
Someone in another thread (maybe this one? It was one about the 590s; CBA to look back through the pages) mentioned that he noticed that most of the cards people "baked" to get working again were Nvidia cards.

The question is... does nvidia design cards to last as long as AMD/ATI does/did?
Or do they skimp on quality to make them cheaper/more profitable?
 

thilanliyan

Lifer
Jun 21, 2005
12,085
2,281
126
Someone in another thread (maybe this one? It was one about the 590s; CBA to look back through the pages) mentioned that he noticed that most of the cards people "baked" to get working again were Nvidia cards.

That was me. :)

About the quality thing, only the engineers working on the cards could probably say, but I think for the 590 vs 6990 comparison, it looks like AMD chose better components, as Tempered81 mentions.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
I guess Nvidia had to make up for the price of the huge GPU die in the rest of their BOM. So they cheaped out on the rest of the components. :/
 

TerabyteX

Banned
Mar 14, 2011
92
1
0
Well said, GrooveRiding. He's downplaying the importance of this issue instead of searching for solutions or data that could back up his POV and prove the issue right or wrong. Wasting $700 on a very powerful card that won't last more than three days isn't something for the faint of heart.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
What is unfortunate here is that you offer nothing to support your view beyond a link to one editorial from one website and its findings from testing a single card, plus accusations that reviewers and end-users who have had cards blow up are conspiring or lying.

...

So you have nothing tangible at all. Gotcha.
Well said, I'm waiting for keysplayer's response as well.
Someone in another thread (maybe this one? It was one about the 590s; CBA to look back through the pages) mentioned that he noticed that most of the cards people "baked" to get working again were Nvidia cards.

The question is... does nvidia design cards to last as long as AMD/ATI does/did?
Or do they skimp on quality to make them cheaper/more profitable?
That was me. :)

About the quality thing, only the engineers working on the cards could probably say, but I think for the 590 vs 6990 comparison, it looks like AMD chose better components, as Tempered81 mentions.
I know I've mentioned it several times, maybe not on this forum, but the 8800 debacle back in the day caused me a lot of headaches taking care of machines I had built for others. I replaced my GTX 295 several times as well, although I don't think those cards dying of heat was widespread (it was reported, though). However, the GTX 570 problems and now the GTX 590's really point to some QC issues here.
 

LiuKangBakinPie

Diamond Member
Jan 31, 2011
3,903
0
0
The VRM is a DC-to-DC converter: it converts the 12V from the PSU to the 0.938V the GTX590 needs to operate at 607MHz. If you only downclock the GPUs to 550MHz without reducing the voltage, at full load the voltage controller of the VRM will still try to supply 0.938V through the MOSFETs to the GPUs, so the MOSFETs will continue to work the same as before. Just because you downclock the frequency doesn't mean that the voltage controller will lower the voltage.

The card's BIOS has voltage values for each state the card is in. For example, if the card is in 2D mode, the frequency of the GPU goes down to 51MHz and the voltage is lowered too; but when it is in 3D mode, the clocks go up to 607MHz at 0.938V. You can lower the GPU frequencies (say to 550MHz), but if you don't reduce the voltage, the MOSFETs in 3D mode will operate under the same conditions as before.

We do know the GTX590 VRM implementation is not designed for O/C, and it would be sensible not to overvolt on the NV reference PCB. That's why I keep saying to wait for custom PCBs from the AIBs.

It would be ideal to have 12 VRM phases for each GF110 chip (24 in total) on the GTX590 (like the MSI GTX580 Lightning image below), but the PCB would have to be bigger, and the cooling solution would have to be redesigned to be bigger, heavier and more expensive in order to keep up with the higher TDP the new card and the VRMs would generate. Then the card would cost $1000 and only extreme users and overclockers would buy it (like the ASUS ARES).

NV reference GTX590 is not for extreme users and overclockers, plain and simple.

4614d8d2d3c3450d.jpg

You are talking about what the card can draw, not what happens in actual circumstances. The card will draw power according to the tables in its BIOS; every performance entry has a matching target entry which it will select.
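The downclock-without-undervolt point above can be sketched with the usual first-order model of dynamic power, P ∝ f·V²: cutting clocks at a fixed voltage only buys a linear reduction, while the regulator keeps supplying the full 3D voltage. The 607MHz / 0.938V stock point is from the post; the 0.88V undervolt figure is an assumption for illustration:

```python
# First-order dynamic power model, normalized to the stock 3D state:
# P ~ f * V^2, with 607 MHz @ 0.938 V as the reference point.
def relative_power(freq_mhz, volts, ref_freq=607.0, ref_volts=0.938):
    return (freq_mhz / ref_freq) * (volts / ref_volts) ** 2

# Downclock only: 550 MHz at the stock 0.938 V.
downclock_only = relative_power(550, 0.938)     # roughly 0.91 of stock

# Downclock plus an assumed undervolt to 0.88 V.
down_and_undervolt = relative_power(550, 0.88)  # roughly 0.80 of stock

print(f"550 MHz @ 0.938 V: {downclock_only:.0%} of stock power")
print(f"550 MHz @ 0.880 V: {down_and_undervolt:.0%} of stock power")
```

Under this model the 550MHz downclock alone shaves only about 9% off the load, which is why the MOSFETs "operate under the same conditions as before" unless the voltage table drops too.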
 


Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,250
136
Many greets from the Netherlands. I come to ask: is the GTX 590 a safe buy? Right now I have a 295 and want a new card. A Dutch hardware site says the new GTX 590 is not safe to buy (http://tweakers.net/nieuws/73585/nvidias-gtx-590-lijkt-met-ernstig-probleem-te-kampen.html). Not good, I think; then the GTX 580 is OK instead. Sorry for my English. Thanks.

Some people say it's safe to buy as long as you run it at stock speeds and use the latest available drivers from nvidia.

If it was me, I'd wait it out a while longer and see what develops.
 

pcm81

Senior member
Mar 11, 2011
598
16
81
The VRM is a DC-to-DC converter: it converts the 12V from the PSU to the 0.938V the GTX590 needs to operate at 607MHz. If you only downclock the GPUs to 550MHz without reducing the voltage, at full load the voltage controller of the VRM will still try to supply 0.938V through the MOSFETs to the GPUs, so the MOSFETs will continue to work the same as before. Just because you downclock the frequency doesn't mean that the voltage controller will lower the voltage.

The card's BIOS has voltage values for each state the card is in. For example, if the card is in 2D mode, the frequency of the GPU goes down to 51MHz and the voltage is lowered too; but when it is in 3D mode, the clocks go up to 607MHz at 0.938V. You can lower the GPU frequencies (say to 550MHz), but if you don't reduce the voltage, the MOSFETs in 3D mode will operate under the same conditions as before.

We do know the GTX590 VRM implementation is not designed for O/C, and it would be sensible not to overvolt on the NV reference PCB. That's why I keep saying to wait for custom PCBs from the AIBs.

It would be ideal to have 12 VRM phases for each GF110 chip (24 in total) on the GTX590 (like the MSI GTX580 Lightning image below), but the PCB would have to be bigger, and the cooling solution would have to be redesigned to be bigger, heavier and more expensive in order to keep up with the higher TDP the new card and the VRMs would generate. Then the card would cost $1000 and only extreme users and overclockers would buy it (like the ASUS ARES).

NV reference GTX590 is not for extreme users and overclockers, plain and simple.

4614d8d2d3c3450d.jpg

This is a good start, but the problem is that it will not work very well, although it will stop 590s from going kaboom. They need a third 8-pin connector on top to get the card the juice it needs; with 2x 8-pin plus the PCIe slot you are still only getting 375W of electrical power.
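Those connector figures come from the PCI Express specification limits: 75W from the x16 slot plus 150W per 8-pin PEG connector. A quick tally (the three-connector layout is the hypothetical fix suggested above, not a real SKU):

```python
# PCIe power budget per the specification limits.
SLOT_W = 75        # PCIe x16 slot
EIGHT_PIN_W = 150  # each 8-pin PEG connector

stock_budget = SLOT_W + 2 * EIGHT_PIN_W   # GTX 590 as shipped
print(stock_budget)                       # 375 W within spec

# Hypothetical third 8-pin connector, as suggested above:
beefed_up = SLOT_W + 3 * EIGHT_PIN_W
print(beefed_up)                          # 525 W within spec
```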
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
This is a good start, but the problem is that it will not work very well, although it will stop 590s from going kaboom. They need a third 8-pin connector on top to get the card the juice it needs; with 2x 8-pin plus the PCIe slot you are still only getting 375W of electrical power.

why would you buy this over 2 separate GTX570 or 580 in SLI?
 

pcm81

Senior member
Mar 11, 2011
598
16
81
why would you buy this over 2 separate GTX570 or 580 in SLI?

The design as it is shown in the picture, I would not buy. If they add a third 8-pin connector, the only reasons to buy it over GTX580 SLI would be the PCIe slot count, and possibly price if it costs less than 2x 580s...
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
The design as it is shown in the picture, I would not buy. If they add a third 8-pin connector, the only reasons to buy it over GTX580 SLI would be the PCIe slot count, and possibly price if it costs less than 2x 580s...

According to the reviews, the GTX590 achieves performance equal to 2x GTX570 (which are quieter, cheaper, etc.): http://www.anandtech.com/show/4239/nvidias-geforce-gtx-590-duking-it-out-for-the-single-card-king/5
A pair of GTX580s outperforms it by a lot.

As for slot count... if you are throwing this kind of money around, you can afford a mobo with 2 slots. This would have only made sense in a more budget-oriented market, i.e. a $250 card with 2x GTX550 onboard: a budget card for those who will deal with the issues of SLI in exchange for much more performance/$, and where the cost of mobo replacement actually matters.
 
Last edited:

pcm81

Senior member
Mar 11, 2011
598
16
81
According to the reviews, the GTX590 achieves performance equal to 2x GTX570 (which are quieter, cheaper, etc.): http://www.anandtech.com/show/4239/nvidias-geforce-gtx-590-duking-it-out-for-the-single-card-king/5
A pair of GTX580s outperforms it by a lot.

As for slot count... if you are throwing this kind of money around, you can afford a mobo with 2 slots. This would have only made sense in a more budget-oriented market, i.e. a $200 card with 2x GPUs onboard: a budget card for those who will deal with the issues of SLI in exchange for much more performance/$, and where the cost of mobo replacement actually matters.

A GTX590 with proper VRMs and enough juice should perform the same as 580 SLI. This is why I said I would not buy that design: with 2 connectors, it can't deliver enough juice to make the 590 match the performance of 2x 580s. When I spoke of slot count I was referring to using 2x 590s in quad-SLI and having 1 or more slots on the mobo available, vs. going tri-SLI with 3x 580s...

A 590 at decent clocks will need water cooling, so we are not talking about using a 590 for anything other than an ultra-high-end setup.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
GTX590 with proper VRMs and enough juice should perform the same as 580SLI.

I disagree; 2x GTX580 will cool a LOT better than a GTX590 ever could. If you give it ridiculous VRMs and have that single card pull 600 watts, it will be limited by heat.
In fact, the card is already hitting max temps (90C) on a "mere" 375 watts.
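To put that heat limit in rough numbers: assuming a simple linear thermal model and a 30C ambient (both assumptions, not measurements; only the 375W / 90C point is from the post):

```python
# Rough thermal sketch: GPU temp ~ ambient + theta * board power.
# The 375 W -> 90 C point is from the post; the ambient temperature
# and the linear model are assumptions for illustration.
AMBIENT_C = 30.0
theta = (90.0 - AMBIENT_C) / 375.0    # effective C per watt, ~0.16

def est_temp(power_w):
    return AMBIENT_C + theta * power_w

print(round(est_temp(600)))   # roughly 126 C, far past safe limits
```

Under those assumptions a 600-watt version of the cooler would land well past any safe operating temperature, which is the heat-limit argument in a nutshell.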

When I spoke of slot count I was referring to using 2x590s in quad-sli and having 1 or more slots on MOBO available
Fair enough. I agree that this would be a valid use for it. But what I said was: why use a GTX590 vs. 2x GTX580 or 2x GTX570, not why use 2x GTX590 vs. 3x GTX580 or 3x GTX570.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
why would you buy this over 2 separate GTX570 or 580 in SLI?


GTX570 SLI has the same price as GTX590
GTX570 SLI has almost the same performance as GTX590 (perhaps 5% more in some games)
GTX570 has the same O/C problems with the VRMs as GTX590
GTX570 SLI uses more power than GTX590
GTX570 SLI produces more noise than GTX590
GTX570 SLI needs an SLI mobo, GTX590 doesn't

The GTX590 seems to be the better product of the two.

GTX580 SLI costs $250-300 more than GTX590
GTX580 SLI is faster
GTX580 SLI can O/C more
GTX580 SLI uses more power
GTX580 SLI produces more noise than GTX590
GTX580 SLI needs an SLI mobo, GTX590 doesn't

You spend more you get better performance ;)

Edit: noise and not nose ehehehe
 
Last edited:

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
GTX590 with proper VRMs and enough juice should perform the same as 580SLI. This is why I said I would not buy that design, with 2 connectors, it cant deliver enough juice to make 590 be the same performance as 2x580s. When I spoke of slot count I was referring to using 2x590s in quad-sli and having 1 or more slots on MOBO available, vs going tri-sli with 3x580s...

590 at decent clocks will need water cooling, so we are not talking about using a 590 for anything other than ultra-high-end setup.
That's fine, but you did buy something very similar with its own set of limitations. The 6990 already pulls more than any specification limit with its two 8-pin connectors.
In the AMD slides they mention 450 watts; that's new territory in itself. If you wanted to o/c it further, you would be in uncharted 'good luck' territory as well.
That's why high-end overclocking motherboards have beefier power circuits, and high-end PSUs use thicker-gauge wire.
This m/b allows for added power to the PCIe slots through extra power connections to alleviate the strain on the 24-pin power connector, which can and has melted in extreme situations.
http://www.guruht.com/2011/03/gigabyte-ga-x58a-oc-designed-for.html
OC-PEG
OC-PEG provides two onboard SATA power connectors for more stable PCIe power when using 3-way and 4-way graphics configurations. Each connector can get power from a different phase of the power supply, helping to provide a better, more stable graphics overclock. The independent power inputs for the PCIe slots helps to improve even single graphics card overclocking. For 4-way CrossFireX™, users must install OC-PEG to avoid over current in the 24pin ATX connector. The entire board also features POScaps, helping to simplify the insulation process so overclockers can quickly reach subzero readiness.
 
Last edited:

taltamir

Lifer
Mar 21, 2004
13,576
6
76
@AtenRa
I looked up the prices. The GTX570 SLI is cheaper, not more expensive.

As for noise levels:
http://www.anandtech.com/show/4239/nvidias-geforce-gtx-590-duking-it-out-for-the-single-card-king/16

The GTX570 SLI is a lot quieter at idle, while the GTX580 SLI is slightly louder. In games the GTX580 SLI is slightly louder; the GTX570 SLI was untested, but presumably it's still a lot quieter.
However, this all changes in Futuremark, where the Nvidia drivers detect it as a benchmark suite and cheat by throttling the card heavily. In Futuremark the GTX590 is a lot quieter... but that is because it cheats on the benchmark.
Note that anything released by Nvidia since the GTX580 also cheats on Futuremark, just not as much.
http://www.anandtech.com/show/4008/nvidias-geforce-gtx-580/3
NVIDIA’s reasoning for this change doesn’t pull any punches: it’s to combat OCCT and FurMark. At an end-user level FurMark and OCCT really can be dangerous – even if they can’t break the card any longer, they can still cause other side-effects by drawing too much power from the PSU. As a result having this protection in place more or less makes it impossible to toast a video card or any other parts of a computer with these programs. Meanwhile at a PR level, we believe that NVIDIA is tired of seeing hardware review sites publish numbers showcasing GeForce products drawing exorbitant amounts of power even though these numbers represent non-real-world scenarios. By throttling FurMark and OCCT like this, we shouldn’t be able to get their cards to pull so much power. We still believe that tools like FurMark and OCCT are excellent load-testing tools for finding a worst-case scenario and helping our readers plan system builds with those scenarios in mind, but at the end of the day we can’t argue that this isn’t a logical position for NVIDIA.

Your conclusion that with GTX580 SLI you are paying more to get more is sensible and true. I was genuinely surprised to see that the GTX580 SLI was slightly louder than the GTX590; I had mistakenly assumed it would be quieter. Also, the power consumption savings cannot be denied... although you could always downclock and undervolt.
 
Last edited: