EVGA offers a sneak peek at Nvidia's next dual-GPU monster


MrHydes

Junior Member
Jan 16, 2011
7
0
0
[Attached image: EVGA_Dual-GPU_Fermi_01.jpg]


The answer is easy: if you look at the card, you'll see that it has 4 clusters of VRAM per GPU. No need to play Sherlock anymore.

4 x 256MB = 1024MB. So there you go... :colbert:
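As a sanity check on that math (treating the four clusters and 256MB per cluster as guesses read off the card photo, not confirmed specs), here's a minimal sketch:

```python
# Hypothetical memory layout read off the card photo -- not confirmed specs.
clusters_per_gpu = 4     # VRAM chip clusters visible per GPU
mb_per_cluster = 256     # assumed capacity per cluster, in MB

vram_per_gpu_mb = clusters_per_gpu * mb_per_cluster
print(f"{vram_per_gpu_mb} MB per GPU")  # -> 1024 MB per GPU
```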
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
The speculation on Nvidia's dual part has died down in this thread. We should create a friendly bet/poll on what people think will end up forming the x2 GPU part. I personally think it's going to be a hybrid sort of GF110 - each chip will have all 512 shaders like the GTX 580, but downclocked to around GTX 570 speeds. Further, Nvidia may also opt to cut the memory bus down to 320-bit, which helps both with keeping it within the 300W thermal envelope AND with keeping the price down through slightly less RAM.

Did you hack my account or do we think exactly alike?:)
 

Zerocarestate

Junior Member
Jan 16, 2011
4
0
0
The speculation on Nvidia's dual part has died down in this thread. We should create a friendly bet/poll on what people think will end up forming the x2 GPU part. I personally think it's going to be a hybrid sort of GF110 - each chip will have all 512 shaders like the GTX 580, but downclocked to around GTX 570 speeds. Further, Nvidia may also opt to cut the memory bus down to 320-bit, which helps both with keeping it within the 300W thermal envelope AND with keeping the price down through slightly less RAM.


Have to totally agree on that one. Remember that the GTX 295 was not a dual 280 or 285; rather, it was two of the yet-to-be-released 275s. I would be happy with dual 570s and a little less RAM. It wouldn't be a far stretch to go from 1280MB to 1024MB to save on cost. Then we may see the card split later as the GTX 565.
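For context on why those particular capacities keep coming up: in GDDR5 each memory chip hangs off its own 32-bit channel, so capacity tracks bus width. A rough sketch, assuming the 128MB-per-chip density common on cards of this generation:

```python
# Rough GDDR5 capacity math: one memory chip per 32-bit channel.
# The 128 MB-per-chip density is an assumption typical of this generation.
def vram_mb(bus_width_bits, mb_per_chip=128):
    chips = bus_width_bits // 32
    return chips * mb_per_chip

print(vram_mb(384))  # 1536 MB -- GTX 580's 384-bit bus
print(vram_mb(320))  # 1280 MB -- GTX 570's 320-bit bus
print(vram_mb(256))  # 1024 MB -- what a further cut to 256-bit would give
```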
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Have to totally agree on that one. Remember that the GTX 295 was not a dual 280 or 285; rather, it was two of the yet-to-be-released 275s. I would be happy with dual 570s and a little less RAM. It wouldn't be a far stretch to go from 1280MB to 1024MB to save on cost. Then we may see the card split later as the GTX 565.

A card like that will need 5GB of total memory, 2.25GB usable.
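Whatever the exact numbers end up being, the underlying accounting quirk is real: AFR-style dual-GPU cards mirror the working set on both GPUs, so the advertised total is double what games can actually use. A minimal sketch, assuming GTX 570-class 1280MB per GPU:

```python
# Dual-GPU memory accounting under alternate-frame rendering (AFR):
# each GPU keeps a full copy of the data, so usable = per-GPU, not the sum.
per_gpu_mb = 1280                 # assumed GTX 570-class framebuffer
advertised_mb = 2 * per_gpu_mb    # what the box would claim: 2560 MB
usable_mb = per_gpu_mb            # what games can actually address: 1280 MB
print(advertised_mb, usable_mb)
```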
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,001
126
If I had any interest in multi-GPU, my only concern would be performance.
Wattage on a dual-GPU is even more beside the point than on a single GPU... it's like whining about the MPG on a Veyron (dual V8)...


No, it's not really like that at all.

It's like complaining about a sports car that gets worse gas mileage while being slower in most metrics than a competing car.

You say you are all about performance, then why do you discard multi-GPU setups? There are times when multi-GPU won't work as it's supposed to, so you'll only get single-GPU performance. But far more often than not, you'll be faster. A 5970 is/was faster than a GTX480 much more often than it wasn't (and while managing to actually use a little less power!)

It's fine that you like single GPUs, but I don't see how you can claim to only be about performance when that appears not to be the case... your comparison to cars is flawed.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Russian said it best.

Are we really gonna cry about GPU wattage and then overclock our CPUs over 200 watts?
Seems kinda dumb, huh?
 

Spike

Diamond Member
Aug 27, 2001
6,770
1
81
I didn't see it mentioned (could very well be blind), but has anyone said anything about a release date for this EVGA card? What about the announced 6990, or the unannounced official Nvidia dual?

I'm pretty happy with my 6970 right now but, for the first time ever, I'm considering going xfire or getting a dual GPU card. Why? No idea. Just seems like fun :)
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
No, it's not really like that at all.

It's like complaining about a sports car that gets worse gas mileage while being slower in most metrics than a competing car.

You say you are all about performance, then why do you discard multi-GPU setups? There are times when multi-GPU won't work as it's supposed to, so you'll only get single-GPU performance. But far more often than not, you'll be faster. A 5970 is/was faster than a GTX480 much more often than it wasn't (and while managing to actually use a little less power!)

It's fine that you like single GPUs, but I don't see how you can claim to only be about performance when that appears not to be the case... your comparison to cars is flawed.


That is why.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,001
126
Russian said it best.

Are we really gonna cry about GPU wattage and then overclock our CPUs over 200 watts?
Seems kinda dumb, huh?

I completely agree that power use gets blown out of proportion, but that doesn't mean it doesn't have some importance. Again, I don't think the issue is ultimate power use, but when your part is slower and uses more power you'll see it brought up. The 2900XT was slower than the competing GeForce parts, and was more power hungry. That combined with it showing up quite late made it look pretty unimpressive.

By the way, I finally got a Kill-a-Watt... my entire system running Furmark uses little more power than a GTX480 alone running Furmark. ;)
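One caveat worth noting with Kill-a-Watt numbers: it measures AC draw at the wall, so PSU efficiency has to be factored out before comparing against a GPU's rated DC draw. A quick sketch, with the 85% efficiency as an assumed typical figure rather than a measured one:

```python
# Convert a wall-socket (AC) reading to the DC power the components draw.
# The 0.85 PSU efficiency is an assumed typical value, not a measurement.
def dc_watts(wall_watts, psu_efficiency=0.85):
    return wall_watts * psu_efficiency

print(round(dc_watts(320)))  # a 320 W wall reading is roughly 272 W of DC load
```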


That is why.

But you claim to only care about performance. So you'd rather have a card that is slower 80% of the time than a dual-GPU/card setup? I guess I don't see the advantage when all you want is the most performance... you're going to be slower more often than not.
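That 80% figure is really just an expected-value argument: weight the multi-GPU speedup by how often scaling works, and the single-GPU fallback by how often it doesn't. A sketch with illustrative numbers (the 80% hit rate and 1.7x scaling are assumptions, not benchmark data):

```python
# Expected performance of a dual-GPU setup, with a single GPU as 1.0x.
# Both inputs are illustrative assumptions, not measured results.
works_fraction = 0.80  # share of games where multi-GPU scaling kicks in
scaling = 1.7          # assumed speedup when it does work

expected = works_fraction * scaling + (1 - works_fraction) * 1.0
print(f"{expected:.2f}x a single GPU, on average")  # -> 1.56x
```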
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Did you hack my account or do we think exactly alike?:)

Hahahaha, I could have kept going with my theories but I stopped just short. I don't think Nvidia can release two downclocked GTX 570s and expect to have the hands-down fastest dual-GPU solution. And they won't release a dual-card solution unless it's faster than AMD's dual card. So they're either going to break the 300W TDP specification, or they're going to release a 512-core part at GTX 570 speeds and memory bus.
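Framing the 300W ceiling as a budget makes the problem concrete. The 219W GTX 570 TDP is the published figure; treating half the board limit as each GPU's share is a simplification, since a single card's TDP also covers memory and VRM losses:

```python
# Back-of-envelope power budgeting against the 300 W PCIe board limit.
# GTX 570 TDP (219 W) is published; the even per-GPU split is a simplification.
gtx570_tdp_w = 219
board_limit_w = 300

naive_dual_w = 2 * gtx570_tdp_w        # 438 W -- far over the ceiling
per_gpu_budget_w = board_limit_w / 2   # 150 W per GPU, ignoring shared parts
cut_needed = 1 - per_gpu_budget_w / gtx570_tdp_w
print(f"each GPU must shed roughly {cut_needed:.0%} of its power")  # ~32%
```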
 

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
Hahahaha, I could have kept going with my theories but I stopped just short. I don't think Nvidia can release two downclocked GTX 570s and expect to have the hands-down fastest dual-GPU solution. And they won't release a dual-card solution unless it's faster than AMD's dual card. So they're either going to break the 300W TDP specification, or they're going to release a 512-core part at GTX 570 speeds and memory bus.
Pretty much this.
 

Zerocarestate

Junior Member
Jan 16, 2011
4
0
0
Welcome to the forums Zerocarestate! :thumbsup:

Thank you. I usually don't participate in forums but enjoy good conversations on upcoming tech.

As for a release date, I have only the hint from EVGA, as they stated at CES that it would be a few months.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
But you claim to only care about performance. So you'd rather have a card that is slower 80% of the time than a dual-GPU/card setup? I guess I don't see the advantage when all you want is the most performance... you're going to be slower more often than not.

And be affected by the other quirks of multi-GPU? No thanks.
No cookie for you.


(I still run CRTs because I can't stand the I.Q. of LCDs, so don't "advise" me to degrade my visual experience.)
 

Will Robinson

Golden Member
Dec 19, 2009
1,408
0
0
And be affected by the other quirks of multi-GPU? No thanks.
No cookie for you.


(I still run CRTs because I can't stand the I.Q. of LCDs, so don't "advise" me to degrade my visual experience.)

So how's that 24-inch, lightweight, slimline, and easily adjustable CRT working out for you then?
I'll bet games look great on it....
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
People still run CRTs? Uhh

If I had a larger one I would probably still be using it. Dark games look like poop on any LCD I have ever used. In fact, I think I will be pulling the CRT out of the closet to play Amnesia.
 

Castiel

Golden Member
Dec 31, 2010
1,772
1
0
if I had a larger one I would probably still be using it. dark games look like poop and any lcd I have ever used. in fact I think I will be pulling the crt out of the closet to play Amnesia.

Years ago i had a 24" trinitron sony crt and it was badass. Only bad part is that it weighed like a hundred pounds. Best part was i only paid 50 bucks for it.

http://www.amazon.com/Sony-GDM-FW900.../dp/B00004YNSR
 

Jionix

Senior member
Jan 12, 2011
238
0
0
Considering the fact that Nvidia has been teasing a dual card for the greater part of 2010, I would be somewhat shocked if a Dual-Fermi ever sees the light of day.

Now, let's say it does happen (and assuming it's in quantities that could be considered an actual product release and not, say, 10 cards), I doubt it would be based on the 570/580.

While Nvidia has gotten its thermals and TDP a little more under control, they are still not where they need to be for its top product.
 

Castiel

Golden Member
Dec 31, 2010
1,772
1
0
Considering the fact that Nvidia has been teasing a dual card for the greater part of 2010, I would be somewhat shocked if a Dual-Fermi ever sees the light of day.

Now, let's say it does happen (and assuming it's in quantities that could be considered an actual product release and not, say, 10 cards), I doubt it would be based on the 570/580.

While Nvidia has gotten its thermals and TDP a little more under control, they are still not where they need to be for its top product.

We've already discussed that a pair of 580s on a single PCB could happen with a 600MHz core and undervolted to hell.
 

Jionix

Senior member
Jan 12, 2011
238
0
0
But the question is, will Nvidia actually do something like that? Sometimes it's not about whether it's possible, but whether it's feasible as a business.

You are going to take chips that normally sell in a certain market at a certain performance range and castrate them, and spend the resources to undervolt them... for what, exactly?
 

Castiel

Golden Member
Dec 31, 2010
1,772
1
0
But the question is, will Nvidia actually do something like that? Sometimes, it's not about if it's possible, but if it is business feasible.

You are going to take chips that are normally selling in a certain market at a certain performance range and castrate them, and spend the resources to under-volt them.. for what exactly...?

Everyone knows that the first thing people would do when they bought a low-clocked 595 would be to turn up the clocks and voltage. Setting them at a 600MHz core and low volts would just be a way to keep the heat and power usage down.
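The physics behind that trade is the usual rule of thumb that dynamic power scales with frequency times voltage squared (P ∝ f·V²), so a modest voltage drop buys more than a proportional clock cut. A sketch using the 600MHz figure from above; the stock 772MHz is the GTX 580's published core clock, and the voltages are illustrative assumptions:

```python
# Rule-of-thumb dynamic power scaling: P is proportional to f * V^2.
# The voltages below are illustrative assumptions, not measured values.
def relative_power(f_new, f_old, v_new, v_old):
    return (f_new / f_old) * (v_new / v_old) ** 2

# A GTX 580 cut from its stock 772 MHz to the suggested 600 MHz,
# undervolted from an assumed 1.00 V to 0.90 V:
print(f"{relative_power(600, 772, 0.90, 1.00):.2f}x stock power")  # ~0.63x
```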