The 6970 runs at an 880MHz core clock with 1536 SPs and a 250W TDP; the US launch date is the 14th.


Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Yes, I did say 190 watts for games in a later post, though.
AMD seems to have adopted Nvidia's criticized style of power usage as well.

I'm not trying to mislead.

I don't see them adopting the NV method of TDP anywhere. The Fudzilla article you linked shows them giving three power use numbers.
Most AMD cards, and NV cards in fact, don't actually hit their TDP when gaming, so TDP sits above typical power use.
The only card that 'breaks' this rule is the GTX 480, which has a vastly understated TDP figure that doesn't even accurately reflect gaming loads.

If NV wanted to display (for example) the GTX 470's power numbers, they could list them as 30W/200W/220W for idle/typical/max; instead they list a 215W (IIRC) TDP.

Similarly, AMD could do 16/160/188 for the HD 5870; instead they list just 188W.
Now with the HD 6970 the rumour is that they are listing three power use scenarios, which helps them counter NV's false claim for the GTX 480 TDP (but only the GTX 480), which vastly underrates that card. They could simply list the average like NV kind of did with the GTX 480, but instead they are giving a fuller picture.
It also helps (IMO) since it gives a more accurate picture of where power use is likely to be.
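To make that concrete, here's a rough back-of-envelope sketch (Python) using only the example figures above; they're ballpark numbers from this post, not official specs.

```python
# Rough sketch using the example figures cited above (ballpark numbers,
# not official specs): a single TDP figure vs. an idle/typical/max breakdown.
cards = {
    # name: (idle W, typical gaming W, max W, single listed TDP W)
    "GTX 470": (30, 200, 220, 215),
    "HD 5870": (16, 160, 188, 188),
}

for name, (idle, typical, maximum, listed_tdp) in cards.items():
    headroom = listed_tdp - typical  # how far the listed TDP sits above typical gaming draw
    print(f"{name}: idle {idle}W / typical {typical}W / max {maximum}W; "
          f"single TDP {listed_tdp}W (~{headroom}W above typical)")
```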
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,250
136
Quote from the article:

"We can confirm this story today."

Please comment on the article, not whether you think it's true or not.

"Cayman Radeon HD 6970 uses the VLIW4 shader architecture, something that Bobo explains in detail here. We can confirm this story today."

Guess you didn't notice that THIS is a hyperlink to the story that the bolded quote of yours refers to.

Quotations taken out of context often mean the wrong thing... Or is that the objective here?

How can a person comment on an article without expressing an opinion?
 

tincart

Senior member
Apr 15, 2010
630
1
0
I am positive they've got the PDF, though.

I have tried to express this in the past, but I will re-phrase it in order to be as clear as possible.

Being subjectively positive that something is true, without any attendant evidence, does not make it true nor does it provide justification for anyone else to believe you.
 

Nox51

Senior member
Jul 4, 2009
376
20
81
Thank you for bringing this train wreck of a thread to my morning reading. After yesterday's thread proved that Fud makes up complete and utter BS, we now have these facts and figures that we're supposed to believe are the real deal because he says so.

I suggest that a re-evaluation of your system of beliefs and logic is in order.
 

Stoneburner

Diamond Member
May 29, 2003
3,491
0
76
Now I've seen it all.

Happy medium just said, "Comment on the article please, not whether you think it's true."

Why do feelings get stronger and rumors more intense the closer we get to actual answers? Read a book, play a game, wait one week, and we'll all know the truth.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Yes, I did say 190 watts for games in a later post, though.
AMD seems to have adopted Nvidia's criticized style of power usage as well.

I'm not trying to mislead.


I don't think people had an issue with Nvidia using a max gaming load or a typical max load. The problem was that they did not seem to include a true max load, what the board would really pull during extreme situations. AMD has a gaming load (which is still well below Nvidia's ~250 watts) as well as a max power use number.
 

Aristotelian

Golden Member
Jan 30, 2010
1,246
11
76
Wow.

Let's make this simple.

Performance crowns don't win the game, because winning the game for Nvidia and AMD means making money. The only time a 6970 would be a failure (for us consumers) is if it were slower than a 580 but cost more (and even then, that's contingent on general perception, not messageboard enthusiasts). If the 6970 is 5% slower than a 580 and 100 bucks cheaper, it could not possibly be considered a failure of a card from our point of view. Surely the same posters who were hammering 'bang for the buck' in the 460's day aren't going to go back on that now? Some posters here are utterly delusional in their interpretations of this world.
 

Sickamore

Senior member
Aug 10, 2010
368
0
0
Here's what I believe is going to happen: AMD will come out with the same performance as NV but will offer it for less, forcing Nvidia to bring down its high-priced card. Graphics card buyers will see that they can get the same performance for cheaper. It's the exact same thing Nvidia did to AMD with the 460, and AMD will do the same. It's tough to say AMD is going to come out on top in performance with the new cards it's offering in a few days. The performance of the 5-series from NV is just a shocker.
 

tannat

Member
Jun 5, 2010
111
0
0
Well, it's Fuad; he usually doesn't get it right even if he has the whitepaper tattooed on his forehead...

But when was the last time ATI/AMD put out a single chip to fight Nvidia's top chip? Why are we so sure they plan to do it now?

It's all about perf/price versus perf/cost and perf/mm².

If the GTX 5xx is an improvement over the HD 6xxx in this regard, then it is partly a win for Nvidia. It will probably give them some well-earned breathing room.
 

Aristotelian

Golden Member
Jan 30, 2010
1,246
11
76
Here's what I believe is going to happen: AMD will come out with the same performance as NV but will offer it for less, forcing Nvidia to bring down its high-priced card. Graphics card buyers will see that they can get the same performance for cheaper. It's the exact same thing Nvidia did to AMD with the 460, and AMD will do the same. It's tough to say AMD is going to come out on top in performance with the new cards it's offering in a few days. The performance of the 5-series from NV is just a shocker.

They don't need to be the fastest; they just need to be priced according to performance to sell, something Nvidia started doing with the 460, if you recall.

If the performance of the 5XX series from Nvidia 'shocks you', you must be pretty new to this hardware arena. Nobody else is 'shocked'. So far in this thread you've posted that Nvidia has AMD by the balls and that the performance of the 5XX series is 'shocking'. You're spouting drivel - wake up.
 

Janooo

Golden Member
Aug 22, 2005
1,067
13
81
The 6970 and 580 will trade some blows.
In the end, on average, the 580 will be slightly ahead.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
I don't see them adopting the NV method of TDP anywhere. The Fudzilla article you linked shows them giving three power use numbers.
Most AMD cards, and NV cards in fact, don't actually hit their TDP when gaming, so TDP sits above typical power use.
The only card that 'breaks' this rule is the GTX 480, which has a vastly understated TDP figure that doesn't even accurately reflect gaming loads.

If NV wanted to display (for example) the GTX 470's power numbers, they could list them as 30W/200W/220W for idle/typical/max; instead they list a 215W (IIRC) TDP.

Similarly, AMD could do 16/160/188 for the HD 5870; instead they list just 188W.
Now with the HD 6970 the rumour is that they are listing three power use scenarios, which helps them counter NV's false claim for the GTX 480 TDP (but only the GTX 480), which vastly underrates that card. They could simply list the average like NV kind of did with the GTX 480, but instead they are giving a fuller picture.
It also helps (IMO) since it gives a more accurate picture of where power use is likely to be.

Thanks for that informative post.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
[attached image: performance chart]


http://bbs.expreview.com/forum-26-1.html

[attached images: leaked slides]


http://forum.beyond3d.com/showpost.php?p=1501587&postcount=5906
 

ader098

Member
Mar 9, 2010
99
3
66
http://www.overclock.net/11547459-post1.html


Source: http://www.chiphell.com/thread-145436-1-1.html
(Use Google Translate; it is in Chinese.)

The source is not in a news format, but someone claims to have tested an in-house 6970, so take this (dis)information with care. As the NDA expires on the 13th, everything will be clear soon.

Translation of the main points: (windfire's comment)
a. the 6970's power consumption is lower than the 580's
(due to point h?)

b. idle power consumption is OK

c. stock core and memory clocks are fairly high already, beating the 580 slightly in performance
(After the release of the 580, AMD recognized that at such a late stage it was impractical to make substantial changes to raise the 6970's performance to compete with the 580. The easiest/most economical way was to raise the default clocks higher?)

d. cooling is just an extension of the 68xx series, i.e. garbage
(the cooler was good enough for the 6970's original clocks, but at the higher factory clocks it becomes just mediocre)

e. temperatures are not so desirable
(due to c. and d. above)

f. the card is long (longer than the 5870) and heavy

g. idle temp = 45°C with the card in an NZXT Phantom with all fans on and ambient temps in the teens, i.e. 1x °C

h. has 6+6 pin power connectors (not 6+8) (this is good news)
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Interesting graphs. Assuming ATI has boosted the scores a bit to look good (very likely they are using "default" IQ, which is lower than Nvidia's but boosts fps), it looks like the 6970 is about 10% faster than the GTX 570 and about 10% slower than the GTX 580 (using the fairer HQ settings).

One would assume the 6950 will be a fair amount slower than the GTX 570, as it has been more seriously cut back (the 570 has 94% of the 580's shaders, while the 6950 is only meant to have 80% of the 6970's shaders).
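As a crude sanity check, here's a naive sketch that assumes performance scales linearly with shader count (a big simplification that ignores clocks, memory bandwidth, ROPs and so on), using only the fractions above; the 6950 figure is still a rumour.

```python
# Naive sanity check: assume performance scales linearly with shader count
# (a big simplification that ignores clocks, memory bandwidth, ROPs, etc.).
# The 94% figure is from known GTX 570/580 specs; the 80% figure for the
# 6950 is only a rumour at this point.
gtx570_share = 0.94   # GTX 570 keeps ~94% of the GTX 580's shaders
hd6950_share = 0.80   # HD 6950 rumoured to keep ~80% of the HD 6970's shaders

# If the 6970 roughly ties the GTX 580, this crude model puts the 6950
# about this many percentage points behind the GTX 570:
gap = (gtx570_share - hd6950_share) * 100
print(f"Estimated 6950 deficit vs GTX 570: ~{gap:.0f} percentage points")
```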

On the plus side for ATI, since this is a new architecture I'd expect ATI to be able to improve performance more than Nvidia going forward.

End result: the 6970 will compete with the GTX 580, but Nvidia is going to have the edge with the GTX 570 vs the 6950. Software (drivers and features) is going to be better with Nvidia too, because despite what the fanboys may claim, it always is. Power usage will be similar. Not enough for ATI to equal Nvidia in sales, but enough to compete.

There's still the odd unknown: fan noise, for example, matters quite a bit, and Nvidia did a good job here. With ATI I'm not so sure, as they probably had to raise clocks to the max to compete with the unexpectedly fast 580, so the cooling is probably being pushed to its limit.
 
Last edited:
Feb 19, 2009
10,457
10
76
There is something really fishy going on... I suspect we are in for a last-minute "SURPRISE!" from AMD.

For the time being, I'll take the released slides with a huge grain of salt.

Edit: Regarding the Chiphell leak above, the guy says the 69xx series can't run with the older Cat 10.11 drivers; it crashes in a lot of games and performs poorly.
 
Last edited:
Sep 19, 2009
85
0
0
So it is going to be 35% faster than Barts? I really doubt it.

Oh, and then the HD 6950 is about as fast as Cypress.

If true, it had better have a hell of a performance/$ and performance/W.
 

RaistlinZ

Diamond Member
Oct 15, 2001
7,470
9
91
I'm feeling underwhelmed by these early charts. Bring on the 6990 and GTX 595 already! I need something to get excited about.

What I'm really interested to see is 2x 6970 vs. 2x 580 performance.
 

Avalon

Diamond Member
Jul 16, 2001
7,571
178
106
Thanks for your input.

You're quoting the wrong guy; I didn't write that. But you're welcome :)

Also, from the look of those charts, it almost seems like the only surprise we're going to get from AMD is that these cards are still using VLIW5. It happened with Barts, when everyone was convinced we had moved to 4D; I don't see why it can't happen with Cayman. I did the math back when Barts was being rumored, and an increase in shader count plus a move to 4D typically equated to around 50%+ more shader power. The graphs showing Cayman to be 35% faster than Barts correlate directly with it having almost exactly 35% more shaders and an increase in fixed-function units.

So, either we're still looking at VLIW5, or I don't believe the charts.
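For anyone who wants to sanity-check that shader math, here's a quick sketch using the SP counts floating around this thread (Barts/HD 6870 at 1120 SPs, Cayman rumored at 1536 SPs); the Cayman number is still a rumor, not a confirmed spec.

```python
# Quick check of the shader math above, using the SP counts floating around
# this thread: Barts (HD 6870) = 1120 SPs, Cayman rumored at 1536 SPs.
barts_sps = 1120
cayman_sps_rumored = 1536

extra_shaders = cayman_sps_rumored / barts_sps - 1
print(f"Cayman has ~{extra_shaders:.0%} more shaders than Barts")
# ~37% more shaders, in the same ballpark as the ~35% performance gap in the
# leaked charts -- roughly what you'd expect from a straight VLIW5 scale-up
# rather than an extra efficiency jump from a move to 4D/VLIW4.
```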
 
Last edited: