5870 X2 TDP is 376W

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0

"We've heard that ATI is working on an X2 version of Radeon HD 5870 card but the biggest obstacle is the power and how to cool such a card.

We've learned that the current TDP for the X2 is 376W and that the company is working on this issue, as apparently they will have to slow the GPUs down by quite a lot to get rid of the heat.

Even if they use downclocked Radeon 5850 cores that run at 725MHz, the power goes down by only 36W (2x170W) to 340W. The hottest card from ATI so far was the HD 4870 X2, which had a TDP of 286W. To release a Radeon HD 5870 X2 card, ATI should go down to at least 300W, especially due to thermal issues, but who knows, maybe ATI will launch a 300W+ card." source


Yeah, it's Fud, but I can't see ATI lowering GPU clocks rather than using better cooling. That would be stupid.
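
For what it's worth, the arithmetic in the piece is just straight doubling of the single-GPU TDPs. A quick sketch of that math, using only the figures quoted above:

    # Naive doubling of the single-GPU TDPs quoted in the article, i.e. assuming
    # the X2 board saves nothing over two discrete cards (almost certainly pessimistic).
    TDP_5870 = 188    # W, single HD 5870
    TDP_5850 = 170    # W, single HD 5850 (725MHz cores)
    TDP_4870X2 = 286  # W, quoted as ATI's hottest card so far

    x2_rumored = 2 * TDP_5870           # 376W -- the rumored figure
    x2_with_5850_cores = 2 * TDP_5850   # 340W -- only 36W less
    print(x2_rumored, x2_with_5850_cores, x2_rumored - x2_with_5850_cores)
    print(x2_with_5850_cores - TDP_4870X2)  # still 54W above the 4870 X2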
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
376 = 188W x 2. That's way too convenient :). I seriously doubt AMD made a combined card and didn't save a single watt in power consumption.
 

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
Wouldn't it be pointless to lower the clocks and cut the core down to something slower than an HD5850? That would make it around the speed of a GTX285... Double that and you get something a bit faster than a GTX295... which makes it only a bit faster than an HD5870 too - all this effort to get a card only a bit faster than that is kinda pointless imo. Either they figure out how to slap two RV870 cores together in that card or they might as well not bother at all... at least that's what I think.
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
I thought their whole purpose/plan/strategy was this multi-GPU thing. Didn't they see this coming?

But, it could all be FUD, just like all the other stuff circulating right now.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: MrK6
376 = 188W x 2. That's way too convenient :). I seriously doubt AMD made a combined card and didn't save a single watt in power consumption.

Maybe they used two PCBs. Oh noes, not teh ATI sammich card!
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: nitromullet
Originally posted by: MrK6
376 = 188W x 2. That's way too convenient :). I seriously doubt AMD made a combined card and didn't save a single watt in power consumption.

Maybe they used two PCBs. Oh noes, not teh ATI sammich card!

Well, the link I showed of the 4870 X2 using twice the power of the 4870 should indicate the same is likely for the 5870 X2 compared to the 5870. Also, two PCBs make little difference, as shown by the two versions of the GTX 295.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
This would be difficult for air cooling, I imagine, but would this kind of thermal dissipation pose any issue for water cooling?
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
Coming from someone with a 2X setup, I find it hard to believe.

The 2nd card in a multi-GPU setup is not nearly as stressed as the card in the primary slot, meaning it will not draw the same amount of power.

In an X3 setup, you can assume the max power draw would be even less.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: OCGuy
Coming from someone with a 2X setup, I find it hard to believe.

The 2nd card in a multi-GPU setup is not nearly as stressed as the card in the primary slot, meaning it will not draw the same amount of power.

In an X3 setup, you can assume the max power draw would be even less.

Again, look at the results from Xbit, which clearly show a 4870 X2 using double the power of the 4870 under full load. That's not a theory, those are real-world results. Plus, TDP is usually much higher than worst-case actual wattage, so the TDP of the 5870 X2 being about twice as high as the 5870's makes perfect sense.
 

imported_Shaq

Senior member
Sep 24, 2004
731
0
0
Hmmm... I think that's the highest rumored TDP I've ever seen, lol. Is it possible to go to a triple-slot card? Nobody really needs quad-fire anyway, lol.
 

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
When we look at nV's single-PCB design, we see copper-base and aluminium-fin heatsinks which do the job, with a fairly small fin surface area per sink.

I think with a little innovation, ATI can pull it off. Fin area could be increased, the fan could be relocated and not underpowered, etc. Perhaps the traditional X2 layout of all-copper heatsinks with the fan to one side is not the way to go this time? Come on ATI, focus on the cooler...
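
To get a rough feel for what the cooler would be up against, here's a back-of-the-envelope sketch; the ~95C GPU limit and ~45C case-air temperature are my own assumptions, not anything from the article:

    # Rough cooler requirement: theta = (T_max - T_ambient) / P, in C per watt.
    # The temperatures below are assumed illustrative values, not measured data.
    def required_thermal_resistance(power_w, t_max_c=95.0, t_ambient_c=45.0):
        """Total thermal resistance (C/W) the heatsink + fan must achieve."""
        return (t_max_c - t_ambient_c) / power_w

    print(round(required_thermal_resistance(188), 2))  # per GPU at 188W -> ~0.27 C/W per sink
    print(round(required_thermal_resistance(376), 2))  # whole 376W card -> ~0.13 C/W overall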
 

WelshBloke

Lifer
Jan 12, 2005
33,512
11,647
136
Originally posted by: Hauk
When we look at nV's single-PCB design, we see copper-base and aluminium-fin heatsinks which do the job, with a fairly small fin surface area per sink.

I think with a little innovation, ATI can pull it off. Fin area could be increased, the fan could be relocated and not underpowered, etc. Perhaps the traditional X2 layout of all-copper heatsinks with the fan to one side is not the way to go this time? Come on ATI, focus on the cooler...

If this rumor is true, they don't need to concentrate on the cooling; they need to get the wattage down.

376W is totally unacceptable for one component of a consumer PC.

 

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
Originally posted by: WelshBloke
Originally posted by: Hauk
When we look at nV's single-PCB design, we see copper-base and aluminium-fin heatsinks which do the job, with a fairly small fin surface area per sink.

I think with a little innovation, ATI can pull it off. Fin area could be increased, the fan could be relocated and not underpowered, etc. Perhaps the traditional X2 layout of all-copper heatsinks with the fan to one side is not the way to go this time? Come on ATI, focus on the cooler...

If this rumor is true, they don't need to concentrate on the cooling; they need to get the wattage down.

376W is totally unacceptable for one component of a consumer PC.

For the majority, agreed. I'd take on a 376w beastie though :D

 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
4870 TDP = 160W, avg load = 130W; 4870 X2 TDP = 284W, avg load = 263W. 5870 TDP = 188W, avg load ~150W; 5870 X2 TDP = ~300W-340W, avg load is going to be ~300W. If they want it to be PCI-E certified on a 6-pin + 8-pin, it has to be less than or equal to a 300W pull. You can already make a 4870 X2 spike to ~380 watts in Furmark.
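
To make the 300W ceiling concrete, it's just the PCI-E power budget added up (75W from the x16 slot, 75W from a 6-pin, 150W from an 8-pin):

    # PCI-E power budget for a card fed by the x16 slot plus one 6-pin and one
    # 8-pin PEG connector -- the reason 300W is the certification ceiling here.
    PCIE_SLOT_W = 75   # x16 slot
    PEG_6PIN_W = 75    # 6-pin auxiliary connector
    PEG_8PIN_W = 150   # 8-pin auxiliary connector

    budget = PCIE_SLOT_W + PEG_6PIN_W + PEG_8PIN_W
    print(budget)        # 300W total
    print(340 - budget)  # a 340W TDP would overshoot that budget by 40W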

Fud.
 

Jacen

Member
Feb 21, 2009
177
0
0
I've been told through the grapevine that it is 300W max load, not 376W. Not going to call Fuad's bluff quite yet, but he is usually only right like half of the time at best.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: Jacen
I've been told through the grapevine that it is 300W max load, not 376W. Not going to call Fuad's bluff quite yet, but he is usually only right like half of the time at best.

Well, then max load would have to be around 150 watts on the 5870 if true.

 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Originally posted by: toyota
Originally posted by: Jacen
I've been told through the grapevine that it is 300W max load, not 376W. Not going to call Fuad's bluff quite yet, but he is usually only right like half of the time at best.

Well, then max load would have to be around 150 watts on the 5870 if true.

Just the average load wattage would have to be around 150W. A 4870 can pull as much as 210 watts by itself, and they had no trouble X2'ing it and averaging a 260W load...

http://tpucdn.com/reviews/MSI/...ages/power_maximum.gif

http://www.xbitlabs.com/images...sfire/4870cf_power.gif
 

Forumpanda

Member
Apr 8, 2009
181
0
0
Originally posted by: toyota
Originally posted by: OCGuy
Coming from someone with a 2X setup, I find it hard to believe.

The 2nd card in a multi-GPU setup is not nearly as stressed as the card in the primary slot, meaning it will not draw the same amount of power.

In an X3 setup, you can assume the max power draw would be even less.

Again, look at the results from Xbit, which clearly show a 4870 X2 using double the power of the 4870 under full load. That's not a theory, those are real-world results. Plus, TDP is usually much higher than worst-case actual wattage, so the TDP of the 5870 X2 being about twice as high as the 5870's makes perfect sense.
But the problem with such a 'test' is that it doesn't measure average load in games.
Peak load in some specific test or Furmark is a poor indicator of actual real-world load.

Be it a CPU or a GPU, just because you can run some specialized code through it that forces a certain peak watt number out of it does not mean that number is really the amount of power the component would draw in real-world scenarios.

For example, it could be possible that ATI will go the Intel route and simply cap the peak load. Show me a test measuring average gaming load and I'll be convinced.


It is also a shame all power consumption 'measurements' are apples to oranges.
Let's say we have card A and card B. If card A is twice as fast as card B but uses less power to draw each frame (each equal task of work), then when both cards are limited to 60fps (apples to apples), card A would use less power.
However, all 'power consumption tests' focus on making the card do the maximum amount of work, so they would find that card A uses more power. Hardly a surprise that when a card does twice the work it uses more power than a card that does less work.

While the numbers they come up with have some use, they are largely not as relevant as the other number - for GPUs, think playing games that do not max out the card, or the power/point ratio for stuff like F@H.

Likewise, the same problem showed up in all the i7 vs. PII reviews: measuring power consumption while encoding a movie and finding that the PII uses less power (when loaded) is hardly a very interesting result if it also takes twice as long, because over the entire task it ends up consuming more energy (even when idle power for the i7 is included).

So my hope is that AT and other review sites will eventually realize this and give me a power consumption number I can actually use, instead of making me do 30 minutes of math to figure out which is best.
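
As a tiny worked example of the i7 vs. PII point above (the wattages and times below are made up purely to illustrate, not measurements of any real part):

    # Energy to finish a fixed task = load power * time busy, plus idle power for
    # whatever is left of the comparison window. All numbers here are invented.
    def task_energy_wh(load_watts, task_hours, idle_watts=0.0, window_hours=None):
        active = load_watts * task_hours
        if window_hours is None:
            return active
        return active + idle_watts * (window_hours - task_hours)

    # Fast chip: higher draw under load, but done in half the time, then idles.
    fast = task_energy_wh(load_watts=130, task_hours=0.5, idle_watts=20, window_hours=1.0)
    # Slow chip: lower draw under load, but busy for the whole hour.
    slow = task_energy_wh(load_watts=95, task_hours=1.0, idle_watts=20, window_hours=1.0)
    print(fast, slow)  # 75.0 vs 95.0 Wh -- the 'lower power' chip costs more energy overall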
 

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
Originally posted by: Jacen
I've been told through the grapevine that it is 300W max load, not 376W. Not going to call Fuad's bluff quite yet, but he is usually only right like half of the time at best.

Yeah, that seems more in line. Fud prolly just doubled the single-card number to come up with a story. Good topic though. Can't wait to see the X2 part...

 

Puffnstuff

Lifer
Mar 9, 2005
16,256
4,930
136
Well, if heat is a problem they could always use Peltiers with an external power source so they would not draw from the PC's PSU.
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
You people who say the power consumption of this card is unacceptable need a reality check. Look at the work this thing can put out. What the hell do you want? It's like asking a V10 Viper not to consume a lot of gas when you put the pedal to the metal.

I for one DO NOT CARE if the power consumption is a little high, as long as it is within reason for the performance I'm getting. People don't balk at buying 850-1000 watt power supplies. Hmmm, why? Probably because they are expecting to buy POWER-HUNGRY hardware.

If you are buying an HD5870 X2 or whatever this thing will be called, I'm sure it isn't going in a netbook-grade PC.
 

theAnimal

Diamond Member
Mar 18, 2003
3,828
23
76
Originally posted by: Tempered81
Just the average load wattage would have to be around 150W. A 4870 can pull as much as 210 watts by itself, and they had no trouble X2'ing it and averaging a 260W load...

The highest figure for the 4870 is 187W, running Furmark, from here.