[VR-Zone] NVidia GTX-590 *FINAL* Specs Revealed!

Page 9 - AnandTech Forums

Zanovar

Diamond Member
Jan 21, 2011
3,446
232
106
These are two common misconceptions. The 8-pin PCIe power connector is rated for 150W as a spec, but it can put out much more. As I already stated, the wires in it are certified for 75W each, for a total of 225W, and can probably push a little more before they get too hot. This is probably why AMD doesn't warranty the 6990's AUSUM mode, since it forces the card to run out of spec.

The PCIe 2.0 slot also does not provide 150W of power. This is a common error that got spread around and, I guess, is still floating around two-plus years later. PCIe 1.1 and PCIe 2.0 power delivery is identical because the slots are physically the same. You can negotiate higher speeds across the data pins, but there's no way to (safely) push more power across the same power pins.
You guess a lot
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76


An Asian site had a review up and pulled it. Core clock = 607MHz.

http://translate.googleusercontent....gle.ca&usg=ALkJrhhWX6pStAwx1DWo_Vyrr5-v2OHgqA

I saw a 38% OC was possible?... It could be all about the noise now!
 

Morg.

Senior member
Mar 18, 2011
242
0
0
Yeah, I stopped reading Tom's after that odd RAM article and the CrossFire vs. SLI article where they declared CrossFire the winner after 4 games.

Tom's failed to mention that the 6990 runs cooler than the 480, 470, 580, 570, 6970, and 6950.

I stopped reading THG when they were bought by Intel --

No kidding; you can clearly see the difference between the real, original THG and the period after they were bought ... it switched to all-Intel propaganda.
 

Morg.

Senior member
Mar 18, 2011
242
0
0
IIRC, each 12V wire is rated for 75W, and there are three per connector, so technically each connector is good for 225W. That means the board is geared for 75W (PCIe slot) + 225W + 225W = 525W total power consumption. However, it's important to note that this is an actual hardware limitation, and extreme overclocks, which will definitely pull more than that, will actually be dangerous. The fact that a "factory" overclocked 6990 (via the AUSUM BIOS) consumes ~440W does not leave a lot of headroom.
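A quick sketch of that arithmetic, using the per-wire rating and the ~440W AUSUM-mode draw quoted in this post (forum figures, not official PCI-SIG numbers):

```python
# Power budget for a dual 8-pin card, per the figures quoted above.
# Assumptions: 75W per 12V wire, three 12V wires per 8-pin PEG
# connector, 75W from the PCIe slot, and ~440W board draw for a
# 6990 running in AUSUM mode.
WATTS_PER_WIRE = 75
WIRES_PER_8PIN = 3
SLOT_WATTS = 75
AUSUM_DRAW = 440

connector_watts = WATTS_PER_WIRE * WIRES_PER_8PIN  # one 8-pin connector
board_limit = SLOT_WATTS + 2 * connector_watts     # slot + two 8-pin connectors
headroom = board_limit - AUSUM_DRAW                # left for overclocking

print(connector_watts, board_limit, headroom)  # 225 525 85
```

So the ~440W AUSUM draw leaves only about 85W before hitting the board's hard ceiling.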

Thanks for the info, it's extremely helpful :thumbsup:. There looks to be a significant deficit there. If you have time, would it be possible to bench one of your GTX 480s at those speeds so we can look at scaling? I wonder if the reduced clocks would also affect that.

"Mixed reviews" is one review to you? [H] seems to disagree:


The point is, the 6990 offers currently unparalleled performance in a single card, and is actually quite power efficient:
http://www.techpowerup.com/reviews/ASUS/Radeon_HD_6990/23.html
You'll notice that the 6990 is still more power efficient than any card offered by NVIDIA at 2560x1600 (and if you're buying a 6990 to play at a lower resolution, "you're doing it wrong"). Its main problem is noise, which by all reports is ungodly.
That is impressive if NVIDIA pulls it off :thumbsup:. The axial-style fan will almost certainly be quieter than AMD's radial design. It's interesting how the tables have turned this time: the GTX 590 might end up being a decent second-place card if it's quieter and performs only a bit slower. Also, the smaller footprint is more than enough to win some over. Very interesting :cool:.

I don't think so. The only thing that sucks on the 6990 is the reference cooling solution, which will be replaced quickly; plus, at $700 a card, you can throw in a WC loop and still not feel like you paid much for that +50% overclock (j/k).

Also, when you say power efficient, I read: "room for lots of overvolting" -> "damn nice potential for o/c" ... as if I were ever to buy a card at that price, I'd run it at 90°C on a dual water loop for sure xD.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
I stopped reading THG when they were bought by Intel --

No kidding; you can clearly see the difference between the real, original THG and the period after they were bought ... it switched to all-Intel propaganda.

I haven't read Tom's in at least five years, but do you have any proof of the above statement?
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
You guess a lot
Where do you think there's guessing? It's all in the PCI-SIG documents, go read them if you're not sure.
I don't think so. The only thing that sucks on the 6990 is the reference cooling solution, which will be replaced quickly; plus, at $700 a card, you can throw in a WC loop and still not feel like you paid much for that +50% overclock (j/k).

Also, when you say power efficient, I read: "room for lots of overvolting" -> "damn nice potential for o/c" ... as if I were ever to buy a card at that price, I'd run it at 90°C on a dual water loop for sure xD.
Like I said though, it's not going to happen. Power delivery is a very real problem with this card, especially when considering overclocking. You can't just slap on better cooling and think that will solve everything, or just will it into being faster. 525W is the hard limit, and pushing past it risks permanent damage to components, not just your card but your entire system if the PSU takes it out. Or the entire thing dies in a giant conflagration :p.
I stopped reading THG when they were bought by Intel --

No kidding; you can clearly see the difference between the real, original THG and the period after they were bought ... it switched to all-Intel propaganda.
Yeah, it was ridiculous. However, it's obviously a great business model as they're still a popular tech website. It's amazing what people will watch and listen to because they're too lazy to do the research themselves. I guess that validates the existence of most news channels, but I digress :p.

This was posted over at [H]. All I can say is "OUCH":
[attached benchmark slide]

If these are the top 7 best benchmarks they could put forward, yikes. They must be the only situations where this card tops the 6990 if it's only slightly faster than the GTX 580 like these tests show. At 1920x1200, no less :rolleyes:
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
<snip>
If these are the top 7 best benchmarks they could put forward, yikes. They must be the only situations where this card tops the 6990 if it's only slightly faster than the GTX 580 like these tests show. At 1920x1200, no less :rolleyes:
Yup, even worse than the typical company benchmarks which show best case. These don't even show good case. 2 benchmarks where there's appreciable 580 -> 590 scaling? What a joke.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Yup, even worse than the typical company benchmarks which show best case. These don't even show good case. 2 benchmarks where there's appreciable 580 -> 590 scaling? What a joke.
Exactly. In all fairness, I could throw in the possibility that these are fakes; we'll see soon enough :thumbsup:.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
If these are the top 7 best benchmarks they could put forward, yikes. They must be the only situations where this card tops the 6990 if it's only slightly faster than the GTX 580 like these tests show. At 1920x1200, no less :rolleyes:

Not trying to rain on your hate parade or anything, but Civ V, Lost Planet 2, Battlefield 2, DiRT 2, most UE3 games including Mass Effect 2, and BattleForge are all games that bench better (some CONSIDERABLY better) on NVIDIA hardware. They could have chosen (but didn't) to use those games in their slides, which would have shown some significantly different results. Interestingly, though, you probably would have just called it propaganda and said exactly the same thing that you are saying now.

I think it's going to be 5% slower on average, but will be way quieter and run cooler with the reference cooling solution. Dare I say, I think it will also consume less power. That is an all-around better card. Either way, it doesn't matter what you or I think right now; we'll find out soon enough.


tviceman, the rhetoric in this post is inflammatory and does not foster an environment conducive to productive discourse.

Please avoid posting in this inflammatory manner in the future. Everyone in the community thanks you in advance.

Idontcare
Super Moderator
 
Last edited by a moderator:

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Not trying to rain on your hate parade or anything, but Civ V, Lost Planet 2, Battlefield 2, DiRT 2, most UE3 games including Mass Effect 2, and BattleForge are all games that bench better (some CONSIDERABLY better) on NVIDIA hardware. They could have chosen (but didn't) to use those games in their slides, which would have shown some significantly different results. Interestingly, though, you probably would have just called it propaganda and said exactly the same thing that you are saying now.

I think it's going to be 5% slower on average, but will be way quieter and run cooler with the reference cooling solution. Dare I say, I think it will also consume less power. That is an all-around better card. Either way, it doesn't matter what you or I think right now; we'll find out soon enough.
So is testing games at 1920x1200, which is typically the highest resolution at which NVIDIA still holds a lead. I'd love to see you explain why NVIDIA didn't put its best foot forward and benchmark those aforementioned games instead. Let me guess: they knew I'd be there to mythbust them! Anyway, continue to find comfort in your personal attacks :thumbsup:.


MrK6, publicly characterizing a fellow forum member's post as a "personal attack" is inflammatory and does not foster an environment conducive to productive discourse.

If you feel a member's post is in violation of the posted guidelines, then please just report it and leave it at that. Responding publicly only serves to escalate the situation unproductively.

Everyone in the community thanks you in advance.

Idontcare
Super Moderator
 
Last edited by a moderator:

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
So is testing games at 1920x1200, which is typically the highest resolution at which NVIDIA still holds a lead. I'd love to see you explain why NVIDIA didn't put its best foot forward and benchmark those aforementioned games instead. Let me guess: they knew I'd be there to mythbust them! Anyway, continue to find comfort in your personal attacks :thumbsup:.

What was the personal attack? Please explain it to me and then feel free to report it.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
I think it's going to be 5% slower on average, but will be way quieter and run cooler with the reference cooling solution. Dare I say, I think it will also consume less power. That is an all-around better card. Either way, it doesn't matter what you or I think right now; we'll find out soon enough.

Nah, these cards are about being fast. This sort of card is not about noise or heat or moderation. If that is your concern, you pick up a 460, 560, 6870, etc.

Whichever is faster will be the better card in this showdown.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
So is testing games at 1920x1200, which is typically the highest resolution at which NVIDIA still holds a lead. I'd love to see you explain why NVIDIA didn't put its best foot forward and benchmark those aforementioned games instead. Let me guess: they knew I'd be there to mythbust them! Anyway, continue to find comfort in your personal attacks :thumbsup:.

What was the personal attack? Please explain it to me and then feel free to report it.

Gentlemen, please take the rhetoric down a notch, the rest of the readers in this forum are interested in learning from your discussion but we are not interested in wading through the hyperbole and personal side-stuff.

The tit-for-tat escalation going on here needs to stop, please get back to the meat of your disagreement by keeping your posts technical and debating the contents of the posts (avoid getting personal). Please, for the benefit of everyone here, we implore you.

Idontcare
Super Moderator
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
Overclocking included?

Overclocking is random, sporadic and widely variable.

I'm sure sites will show their various overclocking results like they do with any new card release.

Whatever the card is in its stock form will be the standard as always, right ? ():)
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
As I have said before, at ~600MHz the GTX 590 will be close to GTX 570 SLI in performance, so at 1920x1200 it will lead in some benchmarks like BattleForge, HAWX, HAWX 2, LP2, Civ V, BF2 BC, DiRT 2, and Mass Effect 2.

At 2560x1600 the situation will be different and the HD 6990 will lead the pack; same with triple-monitor setups.

Finally, the best part and what I'm waiting for is the O/C. If the GTX 590 can O/C to 800MHz-plus, it will regain the performance crown.
 

Morg.

Senior member
Mar 18, 2011
242
0
0
Like I said though, it's not going to happen. Power delivery is a very real problem with this card, especially when considering overclocking. You can't just slap on better cooling and think that will solve everything, or just will it into being faster. 525W is the hard limit, and pushing past it risks permanent damage to components, not just your card but your entire system if the PSU takes it out. Or the entire thing dies in a giant conflagration.

Meh.. that's the limit for 2x 8-pin + PCIe. I can slap 2 more 8-pins on a non-reference PCB if you want me to -- well, not me, but Asus could definitely do something like that.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
As I have said before, at ~600MHz the GTX 590 will be close to GTX 570 SLI in performance, so at 1920x1200 it will lead in some benchmarks like BattleForge, HAWX, HAWX 2, LP2, Civ V, BF2 BC, DiRT 2, and Mass Effect 2.

At 2560x1600 the situation will be different and the HD 6990 will lead the pack; same with triple-monitor setups.

Finally, the best part and what I'm waiting for is the O/C. If the GTX 590 can O/C to 800MHz-plus, it will regain the performance crown.

A 33% overclock is asking a lot, imho.
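As a sanity check on that figure: 607MHz is the rumored stock clock from the pulled review earlier in the thread, and 800MHz is the target quoted above.

```python
# How big an overclock is 607MHz -> 800MHz?
stock_mhz = 607
target_mhz = 800

oc_percent = (target_mhz / stock_mhz - 1) * 100
print(round(oc_percent, 1))  # 31.8
```

So 800MHz really would be roughly a one-third overclock, which would be remarkable for a stock-cooled dual-GPU card.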
 

wahdangun

Golden Member
Feb 3, 2011
1,007
148
106
As I have said before, at ~600MHz the GTX 590 will be close to GTX 570 SLI in performance, so at 1920x1200 it will lead in some benchmarks like BattleForge, HAWX, HAWX 2, LP2, Civ V, BF2 BC, DiRT 2, and Mass Effect 2.

At 2560x1600 the situation will be different and the HD 6990 will lead the pack; same with triple-monitor setups.

Finally, the best part and what I'm waiting for is the O/C. If the GTX 590 can O/C to 800MHz-plus, it will regain the performance crown.

So what's the point of this card if it can't compete at uber resolutions? I mean, why didn't NVIDIA use two GTX 560 Tis, give it 2GB of RAM, and OC it to 1GHz? Maybe then they could price it competitively.


But whatever, game developers these days are disappointing me: countless console ports.
 

TerabyteX

Banned
Mar 14, 2011
92
1
0
You are totally right, lots of console ports. I'm currently playing AC: Brotherhood and its controls are horrible. I'm using the Xbox 360 controller and the game's controller behavior is jerky and erratic; Ezio acts as if he had paranoid schizophrenia. But OK, back on topic.

If it's true that the GTX 590's stock clock will be 607MHz, it will have trouble outperforming a stock HD 6990, especially at uber resolutions, but I think it will overclock well enough to match and even slightly outperform the HD 6990 in some games. What will make the GTX 590 shine will be its power consumption at stock speeds; it will be better than the HD 6990 at load by a small margin, but once you crank up the core speed, the power consumption will increase greatly. The GTX 590 will be competitive enough if it's priced accordingly.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,574
252
126
This was posted over at [H]. All I can say is "OUCH":
[attached benchmark slide]

If these are the top 7 best benchmarks they could put forward, yikes. They must be the only situations where this card tops the 6990 if it's only slightly faster than the GTX 580 like these tests show. At 1920x1200, no less :rolleyes:

Exactly my thoughts. The only things the 590 appears to have going for it are the smaller PCB (based on the pics earlier in the thread) and a potentially quieter cooling solution, but with these clocks looking final, this is a fail the way I see it. It could still be a good card based on the other factors; I am just used to seeing NVIDIA always go for maximum performance at all costs, and if this doesn't beat the 6990 convincingly, then it doesn't even make sense to make it.

The only other argument for it would be triple monitor/single card in an Nvidia solution but that eVGA 460x2 could fill that niche if need be.
 
Last edited:

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Not trying to rain on your hate parade or anything, but Civ V, Lost Planet 2, Battlefield 2, DiRT 2, most UE3 games including Mass Effect 2, and BattleForge are all games that bench better (some CONSIDERABLY better) on NVIDIA hardware. They could have chosen (but didn't) to use those games in their slides, which would have shown some significantly different results. Interestingly, though, you probably would have just called it propaganda and said exactly the same thing that you are saying now.
Ad hominem

If you disagree with a point I've made or wish to offer a differing viewpoint, please do so. Belittling my character in an attempt to refute my viewpoint is not an appropriate form of discourse and does not lead to a productive discussion.

IDC, if this was an inappropriate response, please let me know. However, I'm posting it publicly as an example as to what I'd like to see evolve on this forum (and we're getting there!).

I think it's going to be 5% slower on average, but will be way quieter and run cooler with the reference cooling solution. Dare I say, I think it will also consume less power. That is an all-around better card. Either way, it doesn't matter what you or I think right now; we'll find out soon enough.
Exactly, time will tell. :thumbsup:
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Overclocking is random, sporadic and widely variable.

I'm sure sites will show their various overclocking results like they do with any new card release.

Whatever the card is in its stock form will be the standard as always, right ? ():)

"Standard as always"? Sure, I guess, but we both know the GTX 460 overclocked like mad (mostly because NVIDIA went conservative on clock speeds) and that a very large percentage of people who bought one took that into consideration. I think we're going to find that NVIDIA went conservative with clock speeds on the GTX 590 as well, to give the appearance that they're trying to be conscious of power usage and noise levels.

The point is, we know the argument is going to exist, especially if the GTX 590 in general DOES overclock well, and also since AMD doesn't honor the 6990's warranty if you flip the switch and go for a little extra speed.
 
Last edited:

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
From those Gigabyte slides, I think the 590 will be slower than the 6990 at 2560x resolutions or higher, which is probably what you're gaming at if you buy a card like the 6990 or the 590.

I think it'll use as much or more power (judging from 6970 vs. 580)... and I think it'll be more expensive too (going by 580 prices x2 vs. 6970 prices x2).

So the main questions left are: does it have lower temps? Does it make less noise?