[vr-zone] GTX 590 revision in June


Jionix

Senior member
Jan 12, 2011
No, just an estimate based on the power usage of the 580's chips and the rated power usage of the 590.
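For illustration, here's the kind of back-of-the-envelope calculation I mean -- a rough sketch only, with the core voltages assumed rather than measured, and dynamic power approximated as proportional to f x V^2:

```python
# Rough sketch of the estimate described above: take one GTX 580's rated
# board power, scale it by the GTX 590's lower clock and voltage, then
# double it for two GPUs. Voltages are assumed nominal figures, and shared
# board components (fan, memory, bridge chip) are ignored.

GTX580_TDP_W = 244.0        # rated board power of one GTX 580
GTX580_CLOCK_MHZ = 772.0    # stock core clock
GTX580_VCORE = 1.00         # assumed nominal core voltage
GTX590_CLOCK_MHZ = 607.0    # stock core clock of the 590
GTX590_VCORE = 0.93         # assumed nominal core voltage

# Dynamic power roughly scales with frequency and the square of voltage.
scale = (GTX590_CLOCK_MHZ / GTX580_CLOCK_MHZ) * (GTX590_VCORE / GTX580_VCORE) ** 2
per_gpu_w = GTX580_TDP_W * scale
estimate_w = 2 * per_gpu_w

print(f"Scaled per-GPU power: {per_gpu_w:.0f} W")
print(f"Estimated dual-GPU board power: {estimate_w:.0f} W (590 rated: 365 W)")
```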
 

AtenRa

Lifer
Feb 2, 2009
You cannot compare the two cards: different frequencies, different VRM hardware, different VRM design, etc.
 

Kenmitch

Diamond Member
Oct 10, 1999
So I will guess that you don't have a clue about the technical/electrical characteristics and the power design of the GTX 590.

Why don't you just try to educate him instead of challenging his knowledge?
 

Kenmitch

Diamond Member
Oct 10, 1999
So I will guess that you don't have a clue about the technical/electrical characteristics and the power design of the GTX 590.

Reading this post, it seems like you give the illusion of superior knowledge

because we don't have the data I'm asking for? Next time try not to be a smart a**.

Grumpy today? It doesn't look like a smart a** comment to me... But on the other hand, yours does!
 

AtenRa

Lifer
Feb 2, 2009
You people don't speculate; you are sure the VRM was underpowered without knowing the specs. You and others keep saying the card is underpowered and the VRM was broken, etc., because W1zzard blew up the MOSFETs at 1.2V.

I don't want to start the same again with the GTX590 VRM thing.
 

Jionix

Senior member
Jan 12, 2011
Well, without digging up that corpse, the VRMs were detailed and they were shown to be inadequate.
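For anyone who wants the arithmetic that argument turns on, here's a minimal sketch. The power draw, core voltage, and phase count below are illustrative assumptions, not confirmed GTX 590 specs:

```python
# Minimal sketch of the per-phase VRM current argument: the core rail
# current is I = P / V, split across the VRM's phases. All numbers here
# (power draw, core voltage, phase count) are illustrative assumptions,
# not confirmed GTX 590 specifications.

def per_phase_current(core_power_w: float, vcore: float, phases: int) -> float:
    """Output-side current per VRM phase for a given core power draw."""
    return (core_power_w / vcore) / phases

# Stock-ish load vs. an over-volted overclock at 1.2 V. Over-volting
# raises the power draw far more than it raises the voltage divisor,
# so per-phase current climbs quickly toward the MOSFETs' limits.
for label, power_w, vcore in [("stock", 150.0, 0.93), ("over-volted", 280.0, 1.20)]:
    amps = per_phase_current(power_w, vcore, phases=5)
    print(f"{label}: ~{amps:.0f} A per phase per GPU")
```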
 

notty22

Diamond Member
Jan 1, 2010
Well, without digging up that corpse, the VRMs were detailed and they were shown to be inadequate.
Only by certain fans on this message board, by examining board components. This is something Nvidia's engineers wouldn't have done at hundreds of different stages of design and testing. /sarcasm

edit: And before the question "why were there any problems?" is posed: the same reason any bugs slip through. It's a combination of hardware and software, and then hundreds of other variables once out in the field.
But the actual hardware for the intended use is not faulty or wrongly chosen.
 

Kenmitch

Diamond Member
Oct 10, 1999
The GTX 590 wasn't designed to be a super uber overclocker from the get-go. The PCB tells all.

The whole purpose of the card was to give a person the option of running a single card with the ability to use features such as three monitors and 3D gameplay with respectable framerates.

It's what happened after the initial design that created or exposed the inherent problems with the card. What happened, who knows for sure? Pushed clocks to be more competitive with the 6990? Bad run of parts? Defective drivers? E-Peen syndrome?

Time will tell if it's revised or not. If it's been revised it'll be easy to see on the PCB what has been changed, added, or removed.
 

SirPauly

Diamond Member
Apr 28, 2009
What are you arguing here? That Nvidia was limited by power envelope? By price? Or both?

Either way, you admit that Nvidia under-built the 590. They tried to fit $1,000 worth of performance into a $730 budget. They also tried to fit 600 watts of power into a 350-400 watt design. And look what happened --- a card that is fragile and temperamental, that uses more power than the competition's card, doesn't even beat said card, and is likely to blow up if you try to unleash its true power.

Why would they try to offer 1,000 dollars' worth of performance on a 700 dollar budget?

Is this the way you define power of deduction?

The way I looked at it, they tried to offer GTX 570 SLI performance and wattage at that 700 dollar price target while offering more shader cores and memory for added value -- specifically for multi-monitor gaming.
 

Jionix

Senior member
Jan 12, 2011
Only by certain fans on this message board, by examining board components. This is something Nvidia's engineers wouldn't have done at hundreds of different stages of design and testing. /sarcasm

That may be true, but couldn't Nvidia have ended up cutting component costs where they could, and it just so happened that they castrated the card with those cost reductions? At default the card runs fine (?), but it is not a good overclocker, and it's dangerous to try. Also, it has yet to be seen whether the 590 will hold up to the daily use its owners put it through. Maybe they have a shortened lifespan?

AMD made it well known and detailed how they engineered the 6990. Nvidia did not do the same for the 590. Maybe they don't want to talk about it...
 

SirPauly

Diamond Member
Apr 28, 2009
Well, without digging up that corpse, the VRMs were detailed and they were shown to be inadequate.

If they were -- where are all the blow-ups since the initial confusion over what the product can handle with over-volting?

You have the powers of deduction.

If the VRMs were inadequate -- then they would still be blowing up.
 

Jionix

Senior member
Jan 12, 2011
And I assume only the RMA departments of the tech stores and vendors would have a true idea of that.
 

Kenmitch

Diamond Member
Oct 10, 1999
That may be true, but couldn't Nvidia have ended up cutting component costs where they could, and it just so happened that they castrated the card with those cost reductions? At default the card runs fine (?), but it is not a good overclocker, and it's dangerous to try. Also, it has yet to be seen whether the 590 will hold up to the daily use its owners put it through. Maybe they have a shortened lifespan? AMD made it well known and detailed how they engineered the 6990. Nvidia did not do the same for the 590. Maybe they don't want to talk about it...

AMD pushed 40nm as far as they could with the 6990... 28nm is gonna be interesting :)

Nvidia produced the 590 not to be a super uber overclocking death blow to AMD, but more to give the ability to use Nvidia features such as 3D gaming, etc., on a single card.
 

SirPauly

Diamond Member
Apr 28, 2009
No, just an estimate based on the power usage of the 580's chips and the rated power usage of the 590.

They're using similar cores, but they're not clocked the same. The double-edged sword aspect of this SKU is the rampant belief that the product can not only offer GTX 580 SLI performance but must offer GTX 580 SLI OC performance.

I can understand why nVidia stopped the aggressive over-volts, but sadly they also took away modest over-volting and may have overreacted -- they probably feel there is enough headroom with default voltage and overclocking.
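To put some rough numbers on that headroom, here's a small sketch using the usual P ~ f x V^2 approximation (the baseline and the overclocked points are illustrative assumptions, not measured 590 figures):

```python
# Why modest over-volting eats the power budget quickly: power grows
# linearly with frequency but with the square of voltage. The baseline
# (365 W rated board power at 607 MHz / 0.93 V) and the overclocked
# points are illustrative assumptions, not measured GTX 590 figures.

BASE_POWER_W = 365.0
BASE_F_MHZ = 607.0
BASE_VCORE = 0.93

for vcore, f_mhz in [(0.96, 650.0), (1.00, 700.0), (1.20, 772.0)]:
    scaled = BASE_POWER_W * (f_mhz / BASE_F_MHZ) * (vcore / BASE_VCORE) ** 2
    print(f"{vcore:.2f} V @ {f_mhz:.0f} MHz -> ~{scaled:.0f} W board power")
```

Under those assumptions, even the middle step lands well past the card's rated power, and the 1.2 V point roughly doubles it -- which would explain why the aggressive over-volts had to go.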
 

Madcatatlas

Golden Member
Feb 22, 2010
Probably the best reason for relaunching the 590 is that Nvidia won't be launching any 28nm cards within the same quarter AMD does. If they are lucky, and by that I mean super-duper-lucky of the type where the stars align and we can see the moon in all its 360% beauty, they may manage a launch in the three months after, in time for a review...

So yeah, please do this turd of a card some justice and release it in a fixed state, and an apology wouldn't be out of order either.

Edit: did I screw up with the %?
 

OCGuy

Lifer
Jul 12, 2000
Probably the best reason for relaunching the 590 is that Nvidia won't be launching any 28nm cards within the same quarter AMD does. If they are lucky, and by that I mean super-duper-lucky of the type where the stars align and we can see the moon in all its 360% beauty, they may manage a launch in the three months after, in time for a review...

So yeah, please do this turd of a card some justice and release it in a fixed state, and an apology wouldn't be out of order either.

Edit: did I screw up with the %?

Do you have some sort of timeline on Kepler that the rest of us don't? Do share...
 

notty22

Diamond Member
Jan 1, 2010
Fermi launched in the workstation Quadro line and as the GTX 480/470 simultaneously. Final design and production began in 2009. Wafer production stalled the actual product launches. Fermi's design may have added to the bad yields, but not enough GPUs is what delayed the launch.
You know, wafer production issues: the same excuse for why Northern/Southern Islands got bumped back to 40nm, with all the other shortcomings that came with it.
 