Antilles dual card packs two Caymans


Arkadrel

Diamond Member
Oct 19, 2010
The thing is, Nvidia are quite... loose with their TDP listings:

Nvidia says the 480 uses 250 watts (TDP), but it really draws 320 watts.
AMD says the 6870 uses 151 watts (TDP), but it really draws 163 watts.

These are from TechPowerUp's measurements of the card alone under 100% load.

Nvidia's claim is off by 70 watts. <.<
AMD's claim is off by 12 watts.


So even if AMD ends up with a 6970 using 250 watts, it'll probably STILL use less than a 480 or 580 does, because Nvidia lists an average-load TDP (subjective, depending on how they define "average") instead of the real maximum the card can draw (which AMD does, I think).
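For what it's worth, here is a quick back-of-the-envelope check of those gaps. It is only a sketch using the figures quoted above (the claimed TDPs and TechPowerUp's peak-load readings), not independently verified numbers:

```python
# Quick check of the claimed-vs-measured gaps quoted above. The figures
# are the ones cited in this post (TechPowerUp peak-load numbers), used
# as illustrations rather than verified measurements.
cards = {
    "GTX 480": (250, 320),  # (claimed TDP in W, measured peak in W)
    "HD 6870": (151, 163),
}

for name, (claimed_tdp, measured_peak) in cards.items():
    gap = measured_peak - claimed_tdp
    overshoot = 100 * gap / claimed_tdp
    print(f"{name}: {gap} W over the rated TDP ({overshoot:.0f}% overshoot)")
```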
 

RussianSensation

Elite Member
Sep 5, 2003
The thing is, Nvidia are quite... loose with their TDP listings:

Nvidia says the 480 uses 250 watts (TDP), but it really draws 320 watts.
AMD says the 6870 uses 151 watts (TDP), but it really draws 163 watts.

Granted, Furmark has nothing to do with real-world gaming power consumption for any graphics card. However, 257 watts is still 7 W higher than their real-world TDP estimate :p

I can't wait until every reviewer drops Furmark and starts using a variety of actual games. Right now a ton of people think they need 750W power supplies. Even AMD laughs at reviewers using Furmark tests ("One was when running OCCT/FurMark, members of AMD's 'power virus' list by virtue of the fact that they put a card under a greater load than AMD believes to be realistically possible.") This is the same myth as Prime95 being a realistic representation of CPU power consumption under load.

For example, an HD 4870 X2 (TDP of 286 W) consumes 373 W with Furmark. Furmark is pretty much useless for real-world power consumption measurements. It's a theoretical worst-case lab test that has no real-world implications for gaming.

The TDP is typically not the most power the chip could ever draw, such as by a power virus (Furmark), but rather the maximum power that it would draw when running real applications.
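On the "750W power supply" point above, here is a hedged sizing sketch. It is a rough illustration only: the GPU figures are the GTX 480 numbers quoted in this thread, while the rest-of-system allowance and the headroom factor are assumptions.

```python
# Hedged sketch of how sizing a PSU off a Furmark number, rather than a
# gaming number, inflates the recommendation. The GPU figures are the
# GTX 480 values quoted in this thread; the 230 W CPU-plus-platform
# allowance and the 25% headroom factor are assumptions.
rest_of_system_w = 230    # assumed CPU, board, RAM, drives, fans
gpu_gaming_w = 250        # rated TDP / real-game peak, per the thread
gpu_furmark_w = 320       # Furmark peak, per the thread
headroom = 1.25           # keep the PSU ~25% above the expected peak

for label, gpu_w in (("gaming", gpu_gaming_w), ("Furmark", gpu_furmark_w)):
    recommended = (rest_of_system_w + gpu_w) * headroom
    print(f"Sizing off the {label} number: ~{recommended:.0f} W PSU")
# -> ~600 W vs. ~690 W: the Furmark figure is what pushes people toward 750 W units
```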
 

happy medium

Lifer
Jun 8, 2003
I think that depends on the price. :)

Also, we're assuming a LOT in thinking that Cayman will use as much power as a GTX 580. We know AMD is planning a dual-GPU card; that will likely be our 300-watt part. I am willing to bet that Cayman is much closer to a 225-watt TDP.

I heard the GTX 580 will use 20% less than a GTX 480, and the 6970 about 235 watts.
That seems pretty close to me?
 

Arkadrel

Diamond Member
Oct 19, 2010
Not if, under Furmark, a 480 pulls 320 watts vs. whatever a 6970 will use (not really sure what that will be). AMD usually lists its Furmark-ish numbers... while Nvidia uses some subjective average load.

Unless you have two identical systems running the exact same benchmark for the same length of time, using an average-load TDP doesn't work, because you can't know what anyone classifies as "average load". That is why listing the maximum possible draw makes more sense (even if it's unlikely either card reaches that outside of things like Crysis or Furmark).

My point is... it's not really fair to put an average-load figure on your box when your competitors aren't doing that. And you can't really use it for anything, because it's subjective how stressed the GPU has to be before you call it average load.
 

Keysplayr

Elite Member
Jan 16, 2003
Not if, under Furmark, a 480 pulls 320 watts vs. whatever a 6970 will use (not really sure what that will be). AMD usually lists its Furmark-ish numbers... while Nvidia uses some subjective average load.

Unless you have two identical systems running the exact same benchmark for the same length of time, using an average-load TDP doesn't work, because you can't know what anyone classifies as "average load". That is why listing the maximum possible draw makes more sense (even if it's unlikely either card reaches that outside of things like Crysis or Furmark).

My point is... it's not really fair to put an average-load figure on your box when your competitors aren't doing that. And you can't really use it for anything, because it's subjective how stressed the GPU has to be before you call it average load.

If you were worried about power consumption with the 5xxx series and 4xx series, then you should be less so with the new series as the difference will apparently be smaller.
 

betasub

Platinum Member
Mar 22, 2006
^ True. As both companies know, get your review samples out on schedule, and the sales will roll in later.
 

SlowSpyder

Lifer
Jan 12, 2005
I heard the GTX 580 will use 20% less than a GTX 480, and the 6970 about 235 watts.
That seems pretty close to me?


I thought 244 watts was the TDP being reported for the GTX 580? If that number is true, and the GTX 480 has a 250-watt TDP, I would assume its power use is quite close. It very well may be 20% less power per FPS for the GTX 580 compared to the GTX 480.

Likewise, Cayman may use more overall power but still end up with better performance per watt.

We're speculating a lot, though. Launch is supposed to be coming up for both camps pretty soon, so at least then we can all argue/discuss real numbers. :)
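As a rough sanity check of how "20% less power per FPS" can coexist with nearly identical board TDPs: the 250 W / 244 W figures and the ~20% performance uplift below are just the rumored numbers from this thread, used purely as placeholders.

```python
# Rough sanity check: a near-identical board TDP plus higher performance
# can still mean noticeably less power per frame. The 250 W / 244 W TDPs
# and the ~20% performance uplift are rumored figures from this thread,
# used here purely as placeholders.
gtx480_tdp_w = 250
gtx580_tdp_w = 244
assumed_perf_uplift = 1.20   # assume the 580 is ~20% faster at equal settings

power_per_perf_480 = gtx480_tdp_w / 1.0
power_per_perf_580 = gtx580_tdp_w / assumed_perf_uplift

saving = 1 - power_per_perf_580 / power_per_perf_480
print(f"Power per unit of performance drops by about {saving:.0%}")
# -> roughly 19%, even though the absolute TDPs differ by only 6 W
```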
 

RussianSensation

Elite Member
Sep 5, 2003
Although I think all that matters is that it gets out the door 2 weeks + before Xmas, when everyone is in the spending mood.

I think AMD was extremely smart with their launch strategy. It was far more logical to first release the HD6850/6870 cards, which occupy the very popular $180-240 price bracket (one the GTX 460 basically had all to itself), than to push $350+ HD6950/70/90 cards. Without a single killer app/game for the PC this holiday season, it will be far easier to sell 5-10 HD68xx-series cards for every one HD6970. AMD probably knows this very well.

NV, OTOH, is doing the opposite. JHH is still acting like a child, only concerned about outright performance. Instead of concentrating on the GTX560/570, which could have stolen the show, he is pushing a $500 GTX580 for a performance crown that he will lose anyway once the 6990 launches.
 

VirtualLarry

No Lifer
Aug 25, 2001
Granted, Furmark has nothing to do with real-world gaming power consumption for any graphics card. However, 257 watts is still 7 W higher than their real-world TDP estimate :p

I can't wait until every reviewer drops Furmark and starts using a variety of actual games. Right now a ton of people think they need 750W power supplies. Even AMD laughs at reviewers using Furmark tests ("One was when running OCCT/FurMark, members of AMD's 'power virus' list by virtue of the fact that they put a card under a greater load than AMD believes to be realistically possible.") This is the same myth as Prime95 being a realistic representation of CPU power consumption under load.

For example, an HD 4870 X2 (TDP of 286 W) consumes 373 W with Furmark. Furmark is pretty much useless for real-world power consumption measurements. It's a theoretical worst-case lab test that has no real-world implications for gaming.

The TDP is typically not the most power the chip could ever draw, such as by a power virus (Furmark), but rather the maximum power that it would draw when running real applications.

GPU computing is a "real application", and in terms of load and power draw it follows Furmark more closely than any game does. I think that using Furmark and Prime95 is in fact very accurate and totally justified if you want to see a worst-case measurement.

Any card that cannot stand up to Furmark is broken, IMHO.
 

Arkadrel

Diamond Member
Oct 19, 2010
I'd much rather see worst-case power draw than some subjective "avg load" TDP numbers.
Furmark does that, which is why a listed TDP should come from running Furmark and watching how high the numbers go.


Watch this video (the only difference between the two setups is the cards):
4x480 (~1,400 watts) vs. 4x5870 (~900 watts)
http://www.youtube.com/watch?v=r-i0WMaYRAM
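As a very rough way to turn those wall readings into per-card numbers: the system baseline below is an assumption for illustration, and PSU efficiency losses are ignored.

```python
# Very rough per-card estimate from the wall readings quoted above. The
# 200 W "rest of system" baseline is an assumption for illustration; the
# ~1,400 W and ~900 W totals are the approximate figures from the video.
def per_card_watts(system_draw_w, n_cards, baseline_w=200):
    """Subtract an assumed system baseline and split the rest evenly."""
    return (system_draw_w - baseline_w) / n_cards

print(per_card_watts(1400, 4))   # ~300 W per GTX 480 under that load
print(per_card_watts(900, 4))    # ~175 W per HD 5870
```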
 

apoppin

Lifer
Mar 9, 2000
alienbabeltech.com
I'd much rather see worst-case power draw than some subjective "avg load" TDP numbers.
Furmark does that, which is why a listed TDP should come from running Furmark and watching how high the numbers go.

To those who think Furmark is a bad way to measure... Crysis can push an overclocked 480 over 400 watts too (in the Crysis benchmark, 4x480 (OC) in SLI used over 1,600 watts, i.e. 400+ watts per card). So it's not "just" Furmark that causes power draw like that... obviously games can come very close to those numbers too when the GPU is fully stressed.
It is dangerous. It is like a virus that overdraws the circuit and it *can* damage your PSU, MB and/or video card.

Furmark shows *nothing* related to anything you will *ever* encounter in gaming; power draw in any game never comes close to anything like Furmark's ridiculous test.
- so what good really is Furmark?
:confused:

Any card that cannot stand up to Furmark is broken, IMHO.
What nonsense. Furmark is a broken torture test that shows absolutely nothing related to real-world usage. I am not using it in my testing any longer.
 

Daedalus685

Golden Member
Nov 12, 2009
It is dangerous. It is like a virus that overdraws the circuit and it *can* damage your PSU, MB and/or video card.

Furmark shows *nothing* related to anything you will *ever* encounter in gaming; power draw in any game never comes close to anything like Furmark's ridiculous test.
- so what good really is Furmark?
:confused:

The "just in case the next star craft menu screen is broken too" test? :D

Though Larry has a point if one is using these things for Folding@home. Nice to know anyway, though simply running F@H as the test itself might be a touch more intelligent.
 

blastingcap

Diamond Member
Sep 16, 2010
I think AMD was extremely smart with their launch strategy. It was far more logical to first release the HD6850/6870 cards, which occupy the very popular $180-240 price bracket (one the GTX 460 basically had all to itself), than to push $350+ HD6950/70/90 cards. Without a single killer app/game for the PC this holiday season, it will be far easier to sell 5-10 HD68xx-series cards for every one HD6970. AMD probably knows this very well.

NV, OTOH, is doing the opposite. JHH is still acting like a child, only concerned about outright performance. Instead of concentrating on the GTX560/570, which could have stolen the show, he is pushing a $500 GTX580 for a performance crown that he will lose anyway once the 6990 launches.

To be fair, we don't know FOR SURE why NV is doing it the way they are doing. Maybe GTX560 (if it exists) is taking longer for whatever reason. But I do agree that NV is probably rushing out the GTX580 to reinforce its perf crown so that NV has something to talk about on its financial conference call on Nov. 11.

From what I've heard, AMD pushed Barts out first a) to fill the market gap between Cypress and Juniper and better fend off the GTX460 attack, and b) because Cayman is an unproven architecture and they wanted more time to stomp out the bugs and such.
 
Feb 19, 2009
Furmark and Crysis stress GPUs similarly. Furmark a bit more, but it's better to make sure your GPU can take the toughest punishment and still breeze through it than to encounter some new software or game in the future that stresses it like Furmark does and have your card go into meltdown.

IMO, a hardware review site should max-stress-test cards so that consumers can have confidence they are purchasing a product that won't overheat or die on them.
 

Arkadrel

Diamond Member
Oct 19, 2010
IMO, a hardware review site should max-stress-test cards so that consumers can have confidence they are purchasing a product that won't overheat or die on them.


That in itself is a valid reason for max stress testing, I think.
 

RussianSensation

Elite Member
Sep 5, 2003
GPU computing is a "real application", and in terms of load and power draw it follows Furmark more closely than any game does. I think that using Furmark and Prime95 is in fact very accurate and totally justified if you want to see a worst-case measurement.

Any card that cannot stand up to Furmark is broken, IMHO.

I have been participating in distributed/GPGPU computing for 8 years with an AXP 1600+, a P4 2.6C @ 3.0 GHz, an E6400 @ 3.4 GHz, a Q6600 @ 3.4 GHz, a Core i7 860 @ 3.9 GHz, many laptops over the years, a Radeon 4890, etc. in SETI@Home and Milkyway@Home. I can guarantee 100% that GPGPU applications come nowhere near the results you will get with Furmark. I know because I test maximum CPU/GPU temperatures on every new platform I have ever bought. Furmark loads every component of the video card, which no real-world application can do. On the other hand, just because your CPU is Prime95-stable doesn't mean it's 100% stable in all applications either. Then again, Prime95 is not as intense as OCCT or LinX.

I ran a Radeon 4890 24/7 in Milkyway@home for 6 months straight. Max temp was 82°C, vs. 90°C with Furmark at the same fan speed.

That in itself is a valid reason for max stress testing, I think.

Yes. But the discussion here is about power, not overclocking limits, and about whether Furmark reflects real-world applications when estimating power usage -- Furmark is great for quickly finding overclocking limits, but for GPU power consumption measurements it is worthless. Even AMD has stated that Furmark is a power virus with no real-world implications for power consumption -- and that's from a manufacturer that has been making video cards for nearly 20 years....

Furmark and Crysis stress GPUs similarly.

Not even close!! Furmark puts more load on the card than any game ever would. If you want to use Furmark for overclock stress testing, go ahead, because it will help you find your overclock limits quicker. However, the discussion we have here is about POWER, not overclocking.

Would you test a Bugatti Veyron's tires at 290 mph? No, you wouldn't, because the car can never drive that fast. Furmark's purpose is not to test components for real-world usage, but to test beyond real-world usage -- hence a theoretical worst-case scenario.

It is dangerous. It is like a virus that overdraws the circuit and it *can* damage your PSU, MB and/or video card.

What nonsense. Furmark is a broken torture test that shows absolutely nothing related to real-world usage. I am not using it in my testing any longer.

Respect. :thumbsup: It can still be a great tool for quickly finding overclocking artifacts, but not for power consumption measurements.
 

apoppin

Lifer
Mar 9, 2000
alienbabeltech.com
Well, I am going to use FurMark one more time. Having a defeat mechanism for it built into a video card is a good reason to stop using it to measure power draw.

However, since I already tested the GTX 480, I should test the GTX 580 with FurMark... and I think I found a workaround for Nvidia's drivers identifying it as such.
 

T2k

Golden Member
Feb 24, 2004
I'm not sure why people would doubt that Antilles would be 2 Cayman.

Because of the <300W slides for Cayman?

That just means the GPU will fit in the PCI-E envelope, which isn't surprising.

If you guys remember, the first 6870 slides also said <225W or <250W (not sure which), and it ended up much lower.

Those figures seem to be just an indicator of what power connectors the card will have.

Right on - it makes perfect sense to repeat the 5970-trick, this time clocking down 2x 6970s, of course.
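For reference, the connector-based ceilings quoted above line up with the PCI Express power-delivery limits (75 W from the slot, 75 W per 6-pin plug, 150 W per 8-pin plug); the configurations below are just illustrative.

```python
# The connector-based ceilings quoted above line up with the PCI Express
# power-delivery limits: 75 W from the slot, 75 W per 6-pin plug, and
# 150 W per 8-pin plug. The example configurations are illustrative.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

configs = {
    "slot + 6-pin + 6-pin": SLOT_W + 2 * SIX_PIN_W,            # 225 W
    "slot + 6-pin + 8-pin": SLOT_W + SIX_PIN_W + EIGHT_PIN_W,  # 300 W
}
for name, ceiling in configs.items():
    print(f"{name}: up to {ceiling} W within spec")
```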
 

Arkadrel

Diamond Member
Oct 19, 2010
Antilles would be good even with just 2x 6870s running at normal clock speeds on one card.

Having 2x 6970s on one card... you're talking about two GPUs speculated to be at 580 performance. It would be a pretty sick graphics card.
 

T2k

Golden Member
Feb 24, 2004
The thing is, Nvidia are quite... loose with their TDP listings:

Why do you think that, after the initial press releases, literally no OEM picked up their Tegra for smartphones? Yep, you are correct: because they actually test this stuff, unlike a gamer, and even an extra watt makes a huge difference in thermals, battery life, etc.
It's very typical Nvidia: let's lie, nobody checks it anyway... well, those integrators did check. AFAIK nobody uses Nvidia anywhere power consumption is critical - that's really not NV's forte, to put it nicely...
 

Arkadrel

Diamond Member
Oct 19, 2010
I wouldn't agree 100%... the 5970 is in general faster, unless you're gaming at 1280x1024 or 1680x1050 or something. At 1920x and 2560x resolutions the 5970 is in general a few % faster. The 5970 is about a year old... and uses 2x 334 mm^2 (668 mm^2) of silicon vs. the 580's 520 mm^2.

The 580 might be cheaper to make than the 5970... so it's a win for Nvidia.
Pretty sure AMD will stop making 5970s soon, if they haven't already.

Still, nice to see an Nvidia single GPU beat a dual-GPU card from a year ago.
 