
EVGA offers a sneak peek at Nvidia's next dual-GPU monster

I certainly doubt that they will use two GTX 570s. I think it would be two GTX 560s, since the GF104 chip consumes less power than any GF110 GPU, and the GTX 560 is a power-tweaked GF104 variant with all shaders enabled that can certainly rival two GTX 470s in SLI.
 
It's black because that's the solder mask color EVGA specified when they sent the order to the manufacturer?

:colbert:
 
What exactly does Nvidia have against DisplayPort that they won't put it on their reference cards?
 
Omg Hot blonde girl sex


This is not L&R...

Moderator Idontcare
 
That would be awesome. With NV milking the GTX 580 at $499, I doubt they will sell this card for less than $699.

If it's two 570s, a $699.99 price wouldn't make sense at all. If it's two 580s, I could definitely live with $699.99.

This could of course be a 590 with two GF114s on it and be priced at $399.99. One can only dream
 
Perhaps they feel the market for DisplayPort is too small to justify the added cost.

But that's not their problem. They design the reference card and 90% of manufacturers copy it because they're lazy. They could just put the DP in, and then if the manufacturers don't want the cost they can leave it out.
 
Haha, what?

What did I say that was funny? A GTX 570 has about the same TDP as a GTX 275, and a lower-clocked GTX 275 is what formed a GTX 295.

EDIT: My bad, I confused the GTX 580 and GTX 570 clock speeds. It would have to be around 675-700 MHz to fit within the PCIe spec.
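That clock-speed guess can be sanity-checked with back-of-envelope arithmetic. The sketch below is my own assumption, not from this thread: it takes the PCIe power budget for a board with two 8-pin connectors (75 W from the slot plus 2 × 150 W = 375 W) and assumes per-GPU power scales roughly linearly with core clock at fixed voltage; the TDP and clock figures are the launch specs of the single-GPU cards.

```python
# Back-of-envelope check: how far would two GF110 GPUs need to be
# downclocked to fit a dual-GPU board inside the PCIe power budget?
# Assumption (mine, not from the thread): power scales roughly
# linearly with clock at fixed voltage, which is conservative since
# downclocking usually permits a voltage drop as well.

PCIE_SLOT_W = 75        # power available from the PCIe slot
EIGHT_PIN_W = 150       # per 8-pin auxiliary connector
BOARD_LIMIT_W = PCIE_SLOT_W + 2 * EIGHT_PIN_W   # 375 W total

def max_clock_mhz(tdp_w, base_mhz, n_gpus=2, limit_w=BOARD_LIMIT_W):
    """Highest per-GPU clock that keeps n_gpus under the board limit."""
    per_gpu_budget = limit_w / n_gpus
    return base_mhz * per_gpu_budget / tdp_w

# Launch specs of the single-GPU cards: (TDP in W, core clock in MHz)
print(f"2x GTX 580: {max_clock_mhz(244, 772):.0f} MHz")   # ≈ 593 MHz
print(f"2x GTX 570: {max_clock_mhz(219, 732):.0f} MHz")   # ≈ 627 MHz
```

By this crude linear estimate two GF110s land around 590-630 MHz; the 675-700 MHz figure becomes plausible once you allow for the voltage drop that usually accompanies a downclock.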
 
And this is the backside of the card in case anyone cares.

[Image: backside of the card]
 
If it's two 570s, a $699.99 price wouldn't make sense at all. If it's two 580s, I could definitely live with $699.99.

This could of course be a 590 with two GF114s on it and be priced at $399.99. One can only dream

If it was two 580s and $699, I'd buy it. I would also buy it if it was two 570s (or similarly spec'd parts) and $599.

We'll see what happens and when it is released. I should be ready to upgrade by then. Of course I'd have to wait for a waterblock for it though.
 
The 6990 and this might be neck and neck, but given the recent improvements in Crossfire, this thing has to be a dual GTX 580 card, since we know the 6990 is a dual-Cayman card.
 
Yeah, Crossfire improved so much that the performance difference between HD 6970 CF and GTX 580 SLI is almost non-existent, while it does exist between their single-GPU counterparts.
 
But that's not their problem. They design the reference card and 90% of manufacturers copy it because they're lazy. They could just put the DP in, and then if the manufacturers don't want the cost they can leave it out.

Besides Apple and Eyefinity (neither of which means anything to me), what is the use for DP?
 
The 6990 and this might be neck and neck, but given the recent improvements in Crossfire, this thing has to be a dual GTX 580 card, since we know the 6990 is a dual-Cayman card.

Do you actually know? I thought the 6950 and 6970 were a relatively major jump in power consumption and heat production over the 5 series?

That said, it would not be accurate to say we "know" anything at this point. If you are going by "the official" specs of the 6990 (and I use that term loosely after seeing the FUD campaign AMD spewed this past year), then don't count on it.
I think dual downclocked 6950s is more in line for the 6990.

As for the "595", I don't think 2x 580s or 570s are possible, but you can never tell what kind of feats AMD or Nvidia can pull off when they want/need to.
 
Yeah, Crossfire improved so much that the performance difference between HD 6970 CF and GTX 580 SLI is almost non-existent, while it does exist between their single-GPU counterparts.

Yes, Crossfire max FPS improved greatly. But what about minimums? I'd like to see a review covering that aspect, to see whether AMD actually improved minimum framerates in Crossfire as well.
 
Do you actually know? I thought the 6950 and 6970 were a relatively major jump in power consumption and heat production over the 5 series?

That said, it would not be accurate to say we "know" anything at this point. If you are going by "the official" specs of the 6990 (and I use that term loosely after seeing the FUD campaign AMD spewed this past year), then don't count on it.
I think dual downclocked 6950s is more in line for the 6990.

As for the "595", I don't think 2x 580s or 570s are possible, but you can never tell what kind of feats AMD or Nvidia can pull off when they want/need to.
Keys, I don't want to argue semantics here, but I am 99.99999999999999999999% sure the 6990 will be dual Cayman, whether that means 6950s or 6970s. In fact, there is no "6950": almost all of these GPUs are artificially modified to perform lower. So far almost every 6950 card has been successfully unlocked into a 6970.
 
Keys, I don't want to argue semantics here, but I am 99.99999999999999999999% sure the 6990 will be dual Cayman, whether that means 6950s or 6970s. In fact, there is no "6950": almost all of these GPUs are artificially modified to perform lower. So far almost every 6950 card has been successfully unlocked into a 6970.

Artificially modified or not, it's still a lower-spec'd product. Unlocking has little to do with what I was talking about anyway.
I can see possibly two "artificially lower-spec'd" 6950s being used in the 6990.
 
Artificially modified or not, it's still a lower-spec'd product. Unlocking has little to do with what I was talking about anyway.
I can see possibly two "artificially lower-spec'd" 6950s being used in the 6990.

According to AT, the HD 6950 uses less power than the HD 5870.
The 5970 was two lower-clocked HD 5870s, so the HD 6990 being two straight-up HD 6950s is plausible, or two lower-clocked HD 6970s, but not straight-up 6970s.
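A crude power estimate can put numbers on those options. The sketch below is my own assumption, not from the thread: it assumes per-GPU power scales roughly linearly with core clock, and that the board must stay under the PCIe budget of 75 W from the slot plus 150 W per 8-pin connector; the TDP and clock figures are the launch specs of the single-GPU cards.

```python
# Rough board-power estimate for a dual-Cayman card, assuming
# (my assumption, not from the thread) that per-GPU power scales
# linearly with core clock from the single-GPU card's TDP.

BOARD_LIMIT_W = 75 + 2 * 150   # PCIe slot + two 8-pin connectors = 375 W

def board_power(tdp_w, base_mhz, clock_mhz, n_gpus=2):
    """Estimated total board power for n_gpus downclocked GPUs."""
    return n_gpus * tdp_w * clock_mhz / base_mhz

# Launch specs: HD 6950 (200 W TDP, 800 MHz), HD 6970 (250 W TDP, 880 MHz)
print(f"2x HD 6950 @ 800 MHz: {board_power(200, 800, 800):.0f} W")  # 400 W
print(f"2x HD 6970 @ 660 MHz: {board_power(250, 880, 660):.0f} W")  # 375 W
```

By this crude linear estimate even two stock 6950s land slightly over the 375 W ceiling, so some combination of a modest downclock, binning, or a voltage drop would be needed either way.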
 
According to AT, the HD 6950 uses less power than the HD 5870.
The 5970 was two lower-clocked HD 5870s, so the HD 6990 being two straight-up HD 6950s is plausible, or two lower-clocked HD 6970s, but not straight-up 6970s.

So we almost agree. Wow. :thumbsup:
 
Perhaps they feel the market for DisplayPort is too small to justify the added cost.

The same went for PhysX, but they added it anyway. And personally, I would rather have DisplayPort, which I do now.

The hardest thing is figuring out which monitor to go with.
 
The same went for PhysX, but they added it anyway. And personally, I would rather have DisplayPort, which I do now.

The hardest thing is figuring out which monitor to go with.


Comparing apples to oranges, eh? (fallacy detected)
But you had nothing to add... nice to know.

Besides, Light Peak is... light-years... ahead of DisplayPort, so why waste time on DisplayPort?
 