Who will make the fastest dual GPU card? (poll included)


Which company will have the fastest dual GPU video card

  • Nvidia

  • ATI

  • Not sure



bryanW1995

Lifer
May 22, 2007
11,144
32
91
sorry, I know that dell owns them, I just meant that DELL would buy them if they were the fastest but would put them in an alienware box.

alienware would have to hire half of the nvidia marketing team to convince people to pay their prices for "pretty fast" or "one of the fastest" cards. if nvidia makes a 375w card that kicks the crap out of 5970 then they'll sell as many as they can make.

@jd007: perhaps you are right and that is the reason for the delay, though I doubt it. if they were significantly faster than 5870 on a single gpu with any kind of volume then they would push that out right away, then tweak it/hand pick cards for higher model #'s like 8800gts512+++++.
 

Dark4ng3l

Diamond Member
Sep 17, 2000
5,061
1
0
sorry, I know that dell owns them, I just meant that DELL would buy them if they were the fastest but would put them in an alienware box.

DELL will not touch any out-of-spec cards, if only for legal reasons. In the end, DELL doesn't care about selling a couple hundred ultra-expensive video cards compared to the possible legal or brand-damaging repercussions of out-of-spec hardware causing something bad.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
I did read the whole thread. Like Computer Bottleneck said, that article is quite old and a lot of things about Fermi have changed (according to that article we would've been playing with Fermi cards in Q3 last year, but...).

As for the dual GTX 285 cards, each one has a TDP of 183W, so with some tweaking, I see how they managed to fit it inside the 300W limit. However, single card Fermi's TDP is reportedly 100W higher than that, so I don't see how anybody can make two of them fit on one card.

And as for just casting the PCI-E certification aside and releasing the card anyway, it may not work well. The specification's max TDP is there for a reason - any card that uses more than 300W is going to have a very difficult time staying cool.

So they can somehow shave 66 watts off a dual gtx 285 with 4gb of memory, but they can't shave it off the 1.5gb Fermi without making it slower than the 5970.

A gtx 480 is going to be around 225 watts and the gtx 470 should be less.
So if a gtx 470 beats a 5870, which it should, a dual Fermi gtx 470 should beat a downclocked 5970.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
I've seen a lot of people throw out this "well, everybody knows that you can't do it" line, but does anybody here actually work for dell/hp/etc, esp in their legal dept? I don't see why you couldn't just put a disclaimer on the card and/or box and just move on. Also, if you really think that they would only sell a "couple hundred" then nvidia wouldn't need to worry about them too much, would they?

I don't have access to alienware sales figures, but if dell doesn't let them call their own shots then I suspect that they'll eventually be eclipsed by more nimble competitors that can make common-sense decisions instead of lawyer-approved ones. Look at it this way: for every alienware out there, there are dozens of well known and thousands of relatively unknown companies out there that WOULD use an out of spec card just because it was fast as shit.

edit: @happy medium: why do you think a gtx 470 "should" beat a 5870? Yes nvidia has a history of going for the top, but the evidence has been piling up over the past 7 months that something is amiss at the circle k. nvidia spent too much effort on a very small hpc segment and not enough on the process this time around. they will be lucky to get a gtx 480 that beats 5870, at least in any sort of reasonable volume. even if they do they will still be 2nd fastest unless they somehow manage to throw out a dual gpu part before 6xxx shows up.
 
Last edited:

cbn

Lifer
Mar 27, 2009
12,968
221
106
I don't have access to alienware sales figures, but if dell doesn't let them call their own shots then I suspect that they'll eventually be eclipsed by more nimble competitors that can make common-sense decisions instead of lawyer-approved ones. Look at it this way: for every alienware out there, there are dozens of well known and thousands of relatively unknown companies out there that WOULD use an out of spec card just because it was fast as shit.

At some point a person needs to ask themselves, "why not use a special mainboard and run four single GPU Fermis? Maybe that is a cheaper way to go?"
 

Dark4ng3l

Diamond Member
Sep 17, 2000
5,061
1
0
Well, as an accountant I can tell you that DELL really won't care that you want an out-of-spec card. They are not going to start considering these things because the possible negative impacts are orders of magnitude greater than the potential gain.

Then you need to consider that if it was as easy as "just sell the > 300W TDP card", ATI would be doing it right now. There are reasons why they bothered downclocking their card.

Anyway, think about it rationally. You need to fit the hardware from two 225W cards on a single card and still be under 300W. Even if you can stretch it to 350W, it's going to be pushing it. Of course you can clock it slower to save wattage and make it work. But since dual GPU does not scale anywhere near 100%, you would probably be better off with the single-chip card than with a dual 70%-clocked card (best case scenario).
 

Jd007

Senior member
Jan 1, 2010
207
0
0
So they can somehow shave 66 watts off a dual gtx 285 with 4gb of memory, but they can't shave it off the 1.5gb Fermi without making it slower than the 5970.

A gtx 480 is going to be around 225 watts and the gtx 470 should be less.
So if a gtx 470 beats a 5870, which it should, a dual Fermi gtx 470 should beat a downclocked 5970.

Where did you get the info that the GTX 480 is going to be around 225W? The last I heard, the Fermi sample shown at CES last month had a TDP of around 280W. Also the amount of memory on a card doesn't really affect power consumption much. So we are talking about shaving off ~200W for the dual-GPU Fermi instead of 66W.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Where did you get the info that the GTX 480 is going to be around 225W? The last I heard, the Fermi sample shown at CES last month had a TDP of around 280W. Also the amount of memory on a card doesn't really affect power consumption much. So we are talking about shaving off ~200W for the dual-GPU Fermi instead of 66W.

A dual card is not double the power of 2 single cards.
Maybe the old sandwich models.

It's 1 PCB with 2 cores.
A gtx 295 used 2 gtx 275's, but only used 100 watts more than a single 275.
Do you see what I'm getting at? That's ~80 watts less.

I can't find the link to the 225watt Fermi.
I know nothing about the Fermi SAMPLE model being 280 watts.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
That product has dual eight pin power connectors (for a max TDP of 375 watts).

Where does it say the card uses 375 watts? It's possible for the card to draw 375 watts through the connectors, but I thought the two 8-pin connectors were for the voltage tweak option - an extra 75 watts for overclocking with the voltage tweak. :D
 

Jd007

Senior member
Jan 1, 2010
207
0
0
A dual card is not double the power of 2 single cards.
Maybe the old sandwich models.

It's 1 PCB with 2 cores.
A gtx 295 used 2 gtx 275's, but only used 100 watts more than a single 275.
Do you see what I'm getting at? That's ~80 watts less.

I can't find the link to the 225watt Fermi.
I know nothing about the Fermi SAMPLE model being 280 watts.

The Fermi card sample at CES had a 6 + 8 pin power config, which gives a theoretical TDP of 300W. And you are right, dual-GPU power consumption isn't strictly 2x a single card. But still, I think that if they down-clocked Fermi enough to put two chips on one board drawing only 300W, then the performance would probably drop significantly as well.
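The connector arithmetic running through this thread (75 W from the PCI-E slot, 75 W per 6-pin, 150 W per 8-pin) can be sketched in a few lines. The helper name is my own; the wattage constants are the PCI-E spec figures the posters are quoting:

```python
# Per the PCI-E spec figures cited in this thread:
SLOT_W = 75   # power available from the x16 slot itself
PIN6_W = 75   # per 6-pin PEG connector
PIN8_W = 150  # per 8-pin PEG connector

def board_power_limit(pin6=0, pin8=0):
    """Maximum in-spec board power for a given auxiliary connector config."""
    return SLOT_W + pin6 * PIN6_W + pin8 * PIN8_W

print(board_power_limit(pin6=1, pin8=1))  # 6+8 pin (CES Fermi sample): 300
print(board_power_limit(pin8=2))          # dual 8-pin: 375
```

This is why a 6+8 pin board caps at 300 W in spec, while dual 8-pin implies up to 375 W.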
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Where does it say the card uses 375 watts? It's possible for the card to draw 375 watts through the connectors, but I thought the two 8-pin connectors were for the voltage tweak option - an extra 75 watts for overclocking with the voltage tweak. :D

I don't know how many watts it uses. I just pointed out the dual 8 pins to make light of the fact it is an out-of-spec Video card model.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
The Fermi card sample at CES had a 6 + 8 pin power config, which gives a theoretical TDP of 300W. And you are right, dual-GPU power consumption isn't strictly 2x a single card. But still, I think that if they down-clocked Fermi enough to put two chips on one board drawing only 300W, then the performance would probably drop significantly as well.

We shall see, my friend. Hopefully on Monday, with that so-called SPECIAL announcement from Nvidia.

If a gtx 470 is around 200 watts, they should be ok for a dual Fermi gtx 470.
That's gonna be enough, in my opinion, to overtake the 5970.

Edit: I just want the freaking 5830 to be released so I can upgrade!
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
The Fermi card sample at CES had a 6 + 8 pin power config, which gives a theoretical TDP of 300W. And you are right, dual-GPU power consumption isn't strictly 2x a single card. But still, I think that if they down-clocked Fermi enough to put two chips on one board drawing only 300W, then the performance would probably drop significantly as well.

http://www.nvidia.com/docs/IO/43395/BD-04983-001_v01.pdf

I found this older document with specs for a 6 and 8 pin. This would put TDP somewhere between 225 watts and 300 watts.

Cypress is 188 watts. So maybe Fermi (being larger) is somewhere around 225 watts?
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
http://www.nvidia.com/docs/IO/43395/BD-04983-001_v01.pdf

I found this older document with specs for a 6 and 8 pin. This would put TDP somewhere between 225 watts and 300 watts.

Cypress is 188 watts. So maybe Fermi (being larger) is somewhere around 225 watts?

Let's just say it's 230 watts. That's a gtx 480.

Think they can shave off ~30/40 watts to make the gtx 470?
That brings it under 200 watts for one gtx 470.
Now the question is: can they add another gtx 470 core on a larger PCB and only add 100 watts more?
I think they can. They will find a way.
That Nvidia president has too much ego to let AMD take the title. :D
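The back-of-envelope math in this post can be laid out explicitly. Every number here is the post's own guess, not a confirmed spec:

```python
# happy medium's back-of-envelope; all figures are the post's guesses.
gtx480_tdp = 230                 # "Lets just say its 230 watts"
gtx470_tdp = gtx480_tdp - 35     # "shave off ~ 30/40 watts"
dual_470_tdp = gtx470_tdp + 100  # GTX 295 precedent: second GPU added ~100 W
print(gtx470_tdp, dual_470_tdp)  # 195 295 -> just under the 300 W spec limit
```

On those assumptions a dual gtx 470 would squeak in under the 300 W cap; shift any input by 10 W and it doesn't.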
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
I found the article where I read 225 watts.

Quote:

Power demands are really interesting: as we reported months ago, NV100/GT300 boards targeted 225W TDP and that target has been achieved. The boards pack 8-pin [150W] and 6-pin [75W] PEG [PCI Express Graphics] connectors but as expected, you should either use a single 8-pin or dual 6-pin to power the card. While this will be true for Quadro and Tesla-based parts, we expect that nVidia's partners will find a way to use both 8-pin and 6-pin power connectors to enable overclocking. Given the achieved performance on these OEM-spec-limited parts, we would not be surprised to see significantly overclocked consumer cards that as usual, will launch prior to commercial lineup


http://www.brightsideofnews.com/new...mi-is-less-powerful-than-geforce-gtx-285.aspx
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
I found the article where I read 225 watts.

Quote:

Power demands are really interesting: as we reported months ago, NV100/GT300 boards targeted 225W TDP and that target has been achieved. The boards pack 8-pin [150W] and 6-pin [75W] PEG [PCI Express Graphics] connectors but as expected, you should either use a single 8-pin or dual 6-pin to power the card. While this will be true for Quadro and Tesla-based parts, we expect that nVidia's partners will find a way to use both 8-pin and 6-pin power connectors to enable overclocking. Given the achieved performance on these OEM-spec-limited parts, we would not be surprised to see significantly overclocked consumer cards that as usual, will launch prior to commercial lineup


http://www.brightsideofnews.com/new...mi-is-less-powerful-than-geforce-gtx-285.aspx

I would be amazed if anyone is willing (from a legal perspective) to sell cards that are out of spec, unless it is coupled with a 'certified' case/psu/mobo package that will manage that legal liability issue.

You may see cards from Nvidia like the 5970, which comply with spec but have the potential to be overclocked beyond it, but I just can't see anyone being willing to take the significant legal risks associated with selling a card that is out of spec. You are just begging to end up on the wrong end of a lawsuit ranging all the way from a class action over damaged components all the way to someone's house/business burning down and/or someone being killed...and remember that at least in Australia company directors can have personal civil and criminal (in some circumstances) liability for the actions of the company they are responsible for...

The only way I can see it happening is if the spec is revised, or as part of a 'certified' case and psu package.

Just my 2c at the end of the day ;)
 

bobsmith1492

Diamond Member
Feb 21, 2004
3,875
3
81
Efficiency will be determined by the biggest chip with the most horsepower per clock. Power consumption is a function of voltage squared, but only scales linearly with the total number of transistors. So, with more processing blocks available, the clock speed can be scaled back accordingly; the operating voltage can then be reduced, which in turn results in lower total power consumption.

The best bet given the constraints of the current generation process technology and total power envelope is to put four or more processors on a board and super-under-volt them all. Granted that would cost a ton so it probably won't happen.
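The voltage-squared argument above can be made concrete. A rough sketch, assuming dynamic power scales as P ∝ f·V² and that the illustrative 70% clock / 85% voltage figures (my numbers, not anything from Nvidia or ATI) are achievable:

```python
# CMOS dynamic power scales roughly as P ∝ f * V^2.
# The scale factors below are illustrative assumptions, not real Fermi specs.
def relative_power(clock_scale, voltage_scale):
    """Power of a down-clocked/down-volted chip relative to stock."""
    return clock_scale * voltage_scale ** 2

# Two GPUs at 70% clock and 85% voltage vs. one GPU at stock:
dual = 2 * relative_power(0.70, 0.85)
print(round(dual, 3))  # ~1.01x the power of a single stock chip
```

That is the crux of the dual-card trade-off: two down-volted chips can fit in roughly one chip's power budget, but only if the workload scales well across both GPUs.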
 

Will Robinson

Golden Member
Dec 19, 2009
1,408
0
0
NVidia can't even produce a single GPU to beat Cypress... why in hell would you expect them to suddenly conjure up something that can beat Hemlock? :rolleyes:
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Speaking of performance, do you guys see the same thing I am seeing?

Every generation the Video card makers have been trying to double performance. In the process TDPs have continued to climb.

4870 has a TDP of 150 watts
5870 has TDP of 188 watts

285 GTX has a TDP of 185 watts
Fermi has a TDP of ?????

Does anyone else see a big possible slowdown coming soon? Will ATI be the first to reduce its doubling efforts and instead focus on synchronizing GPUs for the most cost effective or energy efficient use of the maximum PCI-E 2.0 spec?
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Here's the answer:

AMD will be faster, then Nvidia will be faster, then AMD will be faster, then Nvidia will be faster, and so on, and so forth.