Who will make the fastest dual GPU card? (poll included)

Which company will have the fastest dual GPU video card

  • Nvidia

  • ATI

  • Not sure


Results are only viewable after voting.

cbn

Lifer
Mar 27, 2009
12,968
221
106
With ATI's Cypress die coming in at roughly 340 mm², clock speeds had to be cut to 725 MHz to get the HD 5970 under the 300 watt TDP limit of PCI-E 2.0.

Will Nvidia be able to achieve greater performance by using two larger dies and under-volting? Or will they go with harvested dies and only use the best CUDA cores for power efficiency?

Which company will make the most efficient use of 300 watts?
 
Last edited:

zerocool84

Lifer
Nov 11, 2004
36,041
472
126
Well, since it's coming out 6-7 months later, Nvidia needs to come out with something faster, because if they don't then they've failed big time.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Nvidia... that's an easy one.
Now who will have the cheaper card? AMD.

Um, how is it an easy one?
You do know that AMD are at the limits of PCIe power, right?

Fermi will have the same 300w limit as the HD5970 does. It isn't about who makes a faster single GPU, it's about who can make their GPU work with the lowest power consumption, and early reports indicate that Fermi will be hot.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
http://www.anandtech.com/video/showdoc.aspx?i=3354

According to this article, the HD 4870 1GB was only a 150 watt card and the 4870 X2 was a 250 watt card.

But now, with these big ATI cores, will Nvidia still win? The GTX 295 was a 289 watt TDP card, so comparing it to the 4870 X2 doesn't really prove Nvidia was more efficient with dual-GPU technology.
 

Voo

Golden Member
Feb 27, 2009
1,684
0
76
The question is when and what. If we get a dual-Fermi card sometime in summer, before Northern Islands or at least something like a 5990 appears (well, going back to 58XX isn't an option, right?), chances are good that Nvidia will "win", but I'd also bet that Northern Islands will be out way earlier than Nvidia's die shrink.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Um, how is it an easy one?
You do know that AMD are at the limits of PCIe power, right?

Fermi will have the same 300w limit as the HD5970 does. It isn't about who makes a faster single GPU, it's about who can make their GPU work with the lowest power consumption, and early reports indicate that Fermi will be hot.

He said faster, not more power efficient.
I believe Nvidia will find a way to make the dual Fermi faster.

If it's not faster, why release it? I also believe a GTX 480 will be 75% as fast as a 5970.

Edit: kinda like the 8800 GTX vs the 3870 X2. The 3870 X2 won but then got smoked by the 9800 GX2.
 
Last edited:

cbn

Lifer
Mar 27, 2009
12,968
221
106
He said faster not more power efficient.

Well, if both companies are very close to the 300 watt TDP limit, then efficiency becomes the determining factor for speed/performance.

Last generation the GTX 295 had almost 40 watts more TDP than the 4870 X2. This time around Nvidia will not have the same advantage (unless they break the PCI-E spec and go with dual 8-pin PCI-E connectors in order to bump the TDP limit to 375 watts).
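For reference, the power ceilings being thrown around in this thread fall straight out of the PCI-E connector budget (a rough sketch; the spec allows 75 W from the slot, 75 W per 6-pin plug, and 150 W per 8-pin plug):

```python
# PCI-E board power = slot power + sum of auxiliary connector allowances.
# Spec values: slot = 75 W, 6-pin plug = 75 W, 8-pin plug = 150 W.
CONNECTOR_WATTS = {"slot": 75, "6-pin": 75, "8-pin": 150}

def board_power_limit(*plugs):
    """Maximum spec-compliant TDP for a card with the given aux plugs."""
    return CONNECTOR_WATTS["slot"] + sum(CONNECTOR_WATTS[p] for p in plugs)

print(board_power_limit("6-pin", "8-pin"))  # HD 5970-style config: 300 W
print(board_power_limit("8-pin", "8-pin"))  # dual 8-pin: 375 W
```

That's why the HD 5970's 6-pin + 8-pin layout tops out at exactly 300 W, and why going dual 8-pin (as on the Asus Mars) is the only way to reach 375 W.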
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
He said faster, not more power efficient.
I believe Nvidia will find a way to make the dual Fermi faster.

If it's not faster, why release it? I also believe a GTX 480 will be 75% as fast as a 5970.

Edit: kinda like the 8800 GTX vs the 3870 X2. The 3870 X2 won but then got smoked by the 9800 GX2.

When you have two companies both at a power wall, it becomes a matter of efficiency.
There is a limit to how much power a PCIe spec card can consume. ATI is at that limit, and they had to downclock their card to stay within it. The GTX 295 was "only" two GTX 275s, not two GTX 285s, partly because of similar concerns (289 W TDP for the GTX 295).
NV will almost certainly exceed the limit with just two unchanged Fermi GPUs, so they will have to downclock/cut them down.

It's not 2xFermi vs 2xCypress, it's 2xGPUs in a 300w power envelope, so it's all about efficiency and nothing else.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
That product has dual eight pin power connectors (for a max TDP of 375 watts).

And? So you're saying it was more than 300 watts?
I thought you couldn't do that?

I thought I read the dual Fermi will have two 8-pin and one 6-pin connector too?

Edit: seems the GTX 480 uses two 6-pin connectors @ 225 watts max.

It says here the GTX 480 uses about the same amount of power as a GTX 285.
http://www.brightsideofnews.com/news/2009/6/8/nvidia-gt300-targets-225w-tdp.aspx

So if they can make a dual GTX 285, why can't they make a dual Fermi?

Edit 2: So say you're right and they have to cut the dual Fermi down a little (kinda like the GTX 295). So you'd have dual GTX 475s on one card?

I still think that would beat a 5970.
 
Last edited:

Jd007

Senior member
Jan 1, 2010
207
0
0
First of all PCI-E 2.0 specs have the TDP maxed out at 300W. Anything above that won't get PCI-E certification.

Secondly, since a single Fermi card is already using pretty close to 300W (the CES demo Fermi card reportedly had a TDP of 280W), I don't see how they can fit two of them in a single card. Even with significantly lower clocks it's hard to squeeze two of them into a 300W envelope.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
First of all PCI-E 2.0 specs have the TDP maxed out at 300W. Anything above that won't get PCI-E certification.

Secondly, since a single Fermi card is already using pretty close to 300W (the CES demo Fermi card reportedly had a TDP of 280W), I don't see how they can fit two of them in a single card. Even with significantly lower clocks it's hard to squeeze two of them into a 300W envelope.

Please read the whole thread .
Don't miss the part on dual gtx 285 and Fermi using as much power as a gtx 285.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Please read the whole thread .
Don't miss the part on dual gtx 285 and Fermi using as much power as a gtx 285.

1. That article was from June 2009

2. It doesn't say Fermi will have the same TDP as 285 GTX, it says:

Article linked by Happy Medium in post #12 said:
The GT300 part targets a thermal range of 225W and should feature two 6-pin PEG [PCI Express Graphics] power connectors, same as on the current GTX285 graphics card.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Why do they even need PCI-E certification? Can't they just make the card use 375 W or whatever and sell it? Or maybe they'll release a card that is right at 300 W but tell all the board partners to just crank up the juice anyway?

Niiiiicce, so happy medium, you are linking an article from June 2009 to show us what the power usage will be for Fermi? Of course, that was probably the most recent article on that subject, since Nvidia hasn't exactly been forthcoming with info lately.
 

scooterlibby

Senior member
Feb 28, 2009
752
0
0
Why do they even need PCI-E certification? Can't they just make the card use 375 W or whatever and sell it? Or maybe they'll release a card that is right at 300 W but tell all the board partners to just crank up the juice anyway?

My impression was that big players like Dell will balk at non-PCI-E certified parts, and that's a no-go. Personally I'd buy one, but I don't think they give a shit about me :)

@OP - I have no earthly idea! I can't even hazard a guess.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
1. That article was from June 2009

2. It doesn't say Fermi will have the same TDP as 285 GTX, it says:

Hmmmm I read this...

There you go. You have a 40nm chip targeting clocks of 700 MHz for the core, 1600 MHz for those 512 MIMD shader cores and nice 1.1 GHz QDR, e.g. 4.4 GT/s for the GDDR5 memory... sucking same amount of power as the actual GTX285. Expect Jen-Hsun and Ujesh to be all over the power features inside the chip, since the chip architects sweated blood over this one

Did I read this wrong?

Edit: oh I see it's an old article, my mistake...
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
I don't think that Dell sells very many 5970s right now anyway. Alienware might, but is somebody buying a card from Alienware going to give a shit if the card is 300 W or 350 W??? Personally I think it's a stupid spec and needs to be updated. We have lots of people these days buying 1000-1200 W PSUs; going up even to 400 W wouldn't be a big deal to that type of power user.
 

Jd007

Senior member
Jan 1, 2010
207
0
0
Please read the whole thread .
Don't miss the part on dual gtx 285 and Fermi using as much power as a gtx 285.

I did read the whole thread. Like Computer Bottleneck said, that article is quite old and a lot of things about Fermi have changed (according to that article we would've been playing with Fermi cards in Q3 last year, but...).

As for the dual GTX 285 cards, each one has a TDP of 183W, so with some tweaking, I see how they managed to fit it inside the 300W limit. However, single card Fermi's TDP is reportedly 100W higher than that, so I don't see how anybody can make two of them fit on one card.

And as for just casting the PCI-E certification aside and releasing the card anyway, it may not work well. The specification's max TDP is there for a reason: any card that uses more than 300 W is going to have a very difficult time staying cool.
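The arithmetic here is worth spelling out. Using the numbers from this thread (183 W per GTX 285, ~280 W reported for a single Fermi card, 300 W ceiling), a quick back-of-the-envelope check shows how much each GPU has to be trimmed to fit two on one board (this ignores shared board overhead, so it's actually optimistic for the GPUs):

```python
# Rough headroom check: how much must each GPU be trimmed to fit two
# of them under a 300 W board limit? (Ignores shared board overhead.)
BOARD_LIMIT_W = 300

def required_cut(single_gpu_tdp_w, n_gpus=2):
    """Fractional power cut per GPU needed to fit n GPUs in the limit."""
    budget_per_gpu = BOARD_LIMIT_W / n_gpus
    return max(0.0, 1 - budget_per_gpu / single_gpu_tdp_w)

# Two GTX 285-class GPUs (183 W each): ~18% trim, plausible (cf. GTX 295).
print(f"{required_cut(183):.0%}")
# Two Fermi-class GPUs (~280 W reported): ~46% trim, hence the skepticism.
print(f"{required_cut(280):.0%}")
```

An 18% power cut can be covered by binning and modest downclocking; a 46% cut basically means a different product.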
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I don't think you can claim "World's Fastest _____ (fill in the blank)" if you don't stay within the rules. Also the Asus "Mars" GTX295 was limited production and $1000-$1200.
 

Dark4ng3l

Diamond Member
Sep 17, 2000
5,061
1
0
I don't think that Dell sells very many 5970s right now anyway. Alienware might, but is somebody buying a card from Alienware going to give a shit if the card is 300 W or 350 W??? Personally I think it's a stupid spec and needs to be updated. We have lots of people these days buying 1000-1200 W PSUs; going up even to 400 W wouldn't be a big deal to that type of power user.

Alienware is DELL so of course they care.
 

Jd007

Senior member
Jan 1, 2010
207
0
0
I personally think that a dual-GPU Fermi card in 2010 is a lost cause. So the question then becomes: can a single GTX 480 beat the ATI 5970? I think it could come close, maybe be on par or even beat it by a little in some cases, but it's not likely to definitively claim the "fastest graphics card" crown.