
EVGA offers a sneak peek at Nvidia's next dual-GPU monster

[Image: EVGA_Dual-GPU_Fermi_01.jpg]


The answer is easy: if you look at the card you'll see that it has 4 clusters of VRAM per GPU, so there's no need to play Sherlock anymore.

4x 256MB = 1024MB. So there you go... :colbert:
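A quick sanity check on that arithmetic (a sketch only; the 256MB-per-cluster figure is the poster's guess, not a confirmed spec):

```python
# Sanity check on the memory math above. The per-cluster size is the
# poster's assumption: 4 clusters of 256MB per GPU, two GPUs on the board.
clusters_per_gpu = 4
mb_per_cluster = 256
gpus = 2

per_gpu_mb = clusters_per_gpu * mb_per_cluster   # memory attached to one GPU
total_mb = per_gpu_mb * gpus                     # physical memory on the board
print(per_gpu_mb)  # 1024
print(total_mb)    # 2048 (dual-GPU mirrors data, so ~1024MB is effectively usable)
```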
 
The speculation on Nvidia's dual part has died down in this thread. We should create a friendly bet/poll on what people think will end up forming the x2 GPU part. I personally think it's going to be a sort of hybrid GF110: it will have all 512 shaders like the GTX 580 on each chip, but it will be downclocked to around GTX 570 speeds. Further, Nvidia may also opt to cut the memory bus down to 320-bit, which helps both with keeping it within the 300W thermal envelope AND with keeping the price down through slightly less RAM.
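For what it's worth, the bus-width/RAM link can be roughed out like this (a sketch assuming 64-bit memory controllers each backed by 256MB of GDDR5, which matches the GF110-class cards of the time):

```python
def vram_for_bus(bus_bits, mb_per_controller=256, controller_bits=64):
    """Total VRAM implied by a given memory bus width."""
    controllers = bus_bits // controller_bits
    return controllers * mb_per_controller

print(vram_for_bus(384))  # 1536 -- full GTX 580 configuration
print(vram_for_bus(320))  # 1280 -- GTX 570-style cut-down bus
```

Cutting the bus drops a whole memory controller and its chips, which is where the cost saving comes from.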

Did you hack my account or do we think exactly alike?🙂
 
Have to totally agree on that one. Remember that the GTX 295 was not a dual 280 or 285; rather, it was two of the yet-to-be-released 275s. I would be happy with dual 570s and a little less RAM. It wouldn't be much of a stretch to go from 1280MB to 1024MB to save on cost. Then we may see the card split later as the GTX 565.
 
A card like that will need 5GB of total memory, 2.25GB usable.
 
If I had any interest in multi-GPU, my only concern would be performance.
Wattage on a dual-GPU is even more beside the point than on a single GPU... it's like whining about the MPG on a Veyron (dual V8)...


No, it's not really like that at all.

It's like complaining about a sports car that gets worse gas mileage while being slower in most metrics than a competing car.

You say you are all about performance, so then why do you discard multi-GPU setups? There are times when multi-GPU won't work as it's supposed to, so you'll only get single-GPU performance. But far more often than not, you'll be faster. A 5970 is/was faster than a GTX 480 much more often than it wasn't (while managing to actually use a little less power!)

It's fine that you like single GPUs, but I don't see how you can claim to only be about performance when that appears not to be the case... your comparison to cars is flawed.
 
I didn't see it mentioned (could very well be blind) but has anyone said anything about a release date for this eVGA card? What about the announced 6990 or the un-announced official nVidia dual?

I'm pretty happy with my 6970 right now but, for the first time ever, I'm considering going xfire or getting a dual GPU card. Why? No idea. Just seems like fun 🙂
 
That is why.
 
Russian said it best.

Are we really gonna cry about GPU wattage while we overclock our CPUs to over 200 watts?
Seems kinda dumb, huh?

I completely agree that power use gets blown out of proportion, but that doesn't mean it doesn't have some importance. Again, I don't think the issue is ultimate power use, but when your part is slower and uses more power you'll see it brought up. The 2900XT was slower than the competing GeForce parts, and was more power hungry. That combined with it showing up quite late made it look pretty unimpressive.

By the way, I finally got a Kill-a-Watt... my entire system running Furmark uses little more power than a GTX480 alone running Furmark. 😉


But you claim to only care about performance. So you'd rather have a card that is slower 80% of the time than a dual-GPU/card setup? I guess I don't see the advantage when all you want is the most performance... you're going to be slower more often than not.
 
Hahahaha I could have kept going with my theories but I stopped just short. I don't think Nvidia can release two downclocked GTX 570s and expect to have the hands-down fastest dual-GPU solution. And they won't release a dual-card solution unless it's faster than AMD's dual card. So they're either going to break the 300W TDP specification, or they're going to release a 512-core part @ GTX 570 speeds and memory bus.
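The 300W question can be roughed out with the usual dynamic-power scaling rule, P roughly proportional to V^2 * f. The baseline is the GTX 580's official 244W board power; the voltage and clock ratios below are illustrative guesses, not leaked figures:

```python
def scaled_power(base_w, v_ratio, f_ratio):
    """Rough dynamic-power estimate: P scales with voltage squared times clock."""
    return base_w * (v_ratio ** 2) * f_ratio

gtx580_tdp = 244  # W, official board power of a single GTX 580
# suppose each chip runs at ~90% voltage and ~85% clocks (guessed ratios)
per_chip = scaled_power(gtx580_tdp, 0.90, 0.85)
print(round(per_chip))      # ~168 W per GPU
print(round(2 * per_chip))  # ~336 W total, still over 300W, so the cuts go deeper
```

Which is roughly why the thread keeps landing on either breaking the 300W spec or cutting harder than plain downclocks.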
 
Pretty much this.
 
And affected by the other quirks of multi-GPU, no thanks.
No cookie for you.


(I still run CRTs because I can't stand the image quality of LCDs; don't "advise" me to degrade my visual experience.)
 
So how's that 24-inch, lightweight, slimline, and easily adjustable CRT working out for you, then?
I'll bet games look great on it....
 
Considering the fact that Nvidia has been teasing a dual card for the greater part of 2010, I would be somewhat shocked if a Dual-Fermi ever sees the light of day.

Now, let's say it does happen (and assuming it's in quantities that could be considered an actual product release and not say, 10 cards), I doubt it would be based on the 570/580.

While Nvidia has gotten its thermals and TDP a little more under control, they are still not where they need to be for their top product.
 
We've already discussed that a pair of 580s on a single PCB could happen, with a 600MHz core clock and undervolted to hell.
 
But the question is, will Nvidia actually do something like that? Sometimes, it's not about if it's possible, but if it is business feasible.

You are going to take chips that normally sell in a certain market at a certain performance range and castrate them, and spend the resources to undervolt them... for what, exactly?
 
Everyone knows that the first thing people would do when they bought a low-clocked 595 would be to turn up the clocks and voltage. Setting them at a 600MHz core and low volts would just be a way to keep the heat and power usage down.
 