AMD HD7*** series info


WMD

Senior member
Apr 13, 2011
476
0
0
Spec comparison: https://docs.google.com/spreadsheet...LTJKX0h5RGVCdXc&single=true&gid=0&output=html
Performance: http://www.techpowerup.com/reviews/Powercolor/HD_6850_SCS3_Passive/27.html

Assuming the specs are accurate, we can pretty much compare the 7870 to the 5850 using the current 6970 as a bridge. The 7870 should be approximately 5-10% faster than the 6970, and the 6970 is currently approximately 40% faster than the 5850, so the 7870 should end up about 50% faster than the 5850.
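To make the compounding explicit, here is a tiny sketch (Python, using only the rough figures from the estimate above, not measurements):

```python
# Illustrative only: relative speedups compound multiplicatively, not additively.
gain_6970_over_5850 = 1.40          # 6970 assumed ~40% faster than the 5850
gain_7870_over_6970 = (1.05, 1.10)  # rumored 7870: 5-10% faster than the 6970

low, high = (gain_6970_over_5850 * g for g in gain_7870_over_6970)
print(f"Estimated 7870 over 5850: +{(low - 1) * 100:.0f}% to +{(high - 1) * 100:.0f}%")
# -> roughly +47% to +54%, i.e. "about 50% faster"
```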

If those specs are true, even going from a 5850 to a 7870 may not be worth the upgrade. The 7870 has the same specs as a 6970, which performs more like a slightly overclocked 5870 in non-tessellated benchmarks, and people with reference 5850s easily clock beyond 5870 speeds.
 
Feb 19, 2009
10,457
10
76
Does the maths even work like that? Percentages multiply, they aren't additive.

IF the 7870 is 10% faster than a 6970.
So if 6970 gets 100 fps, 7870 gets 110fps.

The 6970 is 50% faster than a 5850 in modern dx11 games. No point including older dx9 games.
So 5850 gets ~66fps compared to 6970 100fps.

66 fps vs 110 fps = ??

Confusing
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Does the maths even work like that? Percentages multiply, they aren't additive.

IF the 7870 is 10% faster than a 6970.
So if 6970 gets 100 fps, 7870 gets 110fps.

The 6970 is 50% faster than a 5850 in modern dx11 games. No point including older dx9 games.
So 5850 gets ~66fps compared to 6970 100fps.

66 fps vs 110 fps = ??

Confusing

Of course the math works in my example. I just rounded to the nearest 5 or 0, as we are dealing with approximations, estimations, and a performance range anyway.

But for your example it works.

1.5 * 1.1 = 1.65 or 65% faster
110 fps / 66.6... fps = 1.65 or 65% faster.
66.6... fps * 1.5 [6970] * 1.1 [7870] = 110 fps.

If those specs are true, even going from a 5850 to a 7870 may not be worth the upgrade. The 7870 has the same specs as a 6970, which performs more like a slightly overclocked 5870 in non-tessellated benchmarks, and people with reference 5850s easily clock beyond 5870 speeds.
Overclocking always changes the math.
 
Last edited:

Saico

Member
Jul 6, 2011
53
0
0
Guys,
Let me be blunt here, THERE IS NO XDR2 IN SI/HD7000/GCN. Trust me on this, the spec list floating is complete bull, and you can tell by who is re-posting it and who is not. Some people know, and they are being VERY quiet on the subject.
This is not meant to knock XDR2 and/or Rambus, it is just a statement about what is in and what is not in the next GPU.

-Charlie

I hope he didn't break NDA on this.
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
Depends on price, but this is pretty much what I've been waiting for. The people who need >GTX 580 speed are also looking at astronomically expensive display setups, huge power bills, etc... These people are dumping loads of cash into their computers.

People at 1920x1200 or 1080p resolutions don't need anything faster than current cards for very high quality; they just need cheaper and lower power... at least until software starts catching up.

I never really expected my 5770 to be usable for as long as it has been. At the rate things have slowed to, a 5850 was a pretty good midrange buy for a good 2 years, and if purchased at launch, before AMD pumped up the price, it's really not far from being price competitive with current offerings at similar performance.

It's possible a 7850 will have me happy for 3 years?!? That's just wack given how fast I was swapping cards in and out in the 9700 pro --> 7900gt timeframe.

I'm seriously considering jumping to something beyond my 5850 when BF3 hits, but I have a feeling it will handle BF3 well, especially if EA/DICE partners with AMD and not Nvidia. That would mean the tessellation modes won't purposely kill off below-top-end Nvidia parts and, in turn, ALL AMD GPUs.
 

WMD

Senior member
Apr 13, 2011
476
0
0
XDR2 is costly next gen stuff. AMD will stick to tried and proven 256bit GDDR5 which has worked fantastically well for them the past 4 years.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Does the maths even work like that? Percentages multiply, they aren't additive.

IF the 7870 is 10% faster than a 6970.
So if 6970 gets 100 fps, 7870 gets 110fps.

The 6970 is 50% faster than a 5850 in modern dx11 games. No point including older dx9 games.
So 5850 gets ~66fps compared to 6970 100fps.

66 fps vs 110 fps = ??

Confusing

He said the HD6970 is ~40% faster than the HD5850 and that the HD7870 could be 10% faster than the HD6970; Edit: so the HD7870 will be ~50% faster than the HD5850.

If the HD5850 gets 66 FPS, then the HD6970 will be 40% faster than the HD5850 at 66 FPS + 40% = 92.4 FPS, and the HD7870 will be 50% faster than the HD5850: 66 FPS + 50% = 99 FPS.

Or

If the HD6970 gets 100 FPS, then the HD5850 will be at ~71.43 FPS and the HD7870 will be at 110 FPS (10% faster).

Confusing ?? :confused: i know ;)

Edit 2: 66 fps vs 110 fps: 110 fps is 66.66% faster than 66 fps.
 
Last edited:

toyota

Lifer
Apr 15, 2001
12,957
1
0
XDR2 is costly next gen stuff. AMD will stick to tried and proven 256bit GDDR5 which has worked fantastically well for them the past 4 years.
8000MHz-and-above XDR2 has been out at least 5 years, so surely costs have become reasonable by now. And where else can they go with GDDR5 on a 256-bit bus? The fastest available is 7000MHz, so even if they ran it at full speed that would only mean a 27% increase in bandwidth, and realistically they would run it at least 200-300MHz below that, resulting in less than a 25% bandwidth increase for a next-gen product.
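For reference, a rough peak-bandwidth sketch for a 256-bit bus (treating the "MHz" figures above as effective GDDR5 data rates; back-of-envelope only):

```python
# Rough GDDR5 peak bandwidth on a 256-bit bus; rates are effective data rates in MT/s.
def bandwidth_gbps(effective_rate_mtps: float, bus_width_bits: int = 256) -> float:
    """Peak bandwidth in GB/s: data rate times bus width in bytes."""
    return effective_rate_mtps * 1e6 * (bus_width_bits / 8) / 1e9

current = bandwidth_gbps(5500)  # ~176 GB/s, roughly what the 6970 ships with
maxed   = bandwidth_gbps(7000)  # ~224 GB/s, fastest GDDR5 mentioned above
print(f"{current:.0f} GB/s -> {maxed:.0f} GB/s, +{(maxed / current - 1) * 100:.0f}%")  # ~27%
```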
 

Mopetar

Diamond Member
Jan 31, 2011
8,494
7,751
136
XDR2 is costly next gen stuff. AMD will stick to tried and proven 256bit GDDR5 which has worked fantastically well for them the past 4 years.

They dove into using GDDR5 once upon a time, so if they need the added performance, there's no reason why they won't move to XDR2 as long as the cost isn't too high or some other factor prevents them from doing so.
 

WMD

Senior member
Apr 13, 2011
476
0
0
8000MHz-and-above XDR2 has been out at least 5 years, so surely costs have become reasonable by now. And where else can they go with GDDR5 on a 256-bit bus? The fastest available is 7000MHz, so even if they ran it at full speed that would only mean a 27% increase in bandwidth, and realistically they would run it at least 200-300MHz below that, resulting in less than a 25% bandwidth increase for a next-gen product.

I was thinking along those lines some time ago, that memory bandwidth would be a bottleneck. But things are not as simple as that. The 5870 has only 30% more memory bandwidth than the 4870 but is almost 2X as fast. Tests have already shown that the card is not as bandwidth starved as many would have believed. If they run Samsung's 7Gb/s chips for the next generation, that will be an increase of 40% in memory bandwidth even if they don't clock it to the max.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I was thinking along those lines some time ago, that memory bandwidth would be a bottleneck. But things are not as simple as that. The 5870 has only 30% more memory bandwidth than the 4870 but is almost 2X as fast. Tests have already shown that the card is not as bandwidth starved as many would have believed. If they run Samsung's 7Gb/s chips for the next generation, that will be an increase of 40% in memory bandwidth even if they don't clock it to the max.
What? 40%? They are using 6000MHz memory now and running it at 5500MHz. Even 7000MHz memory running at full speed would only be 27% faster than 5500MHz.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
The 6970 is 50% faster than a 5850 in modern dx11 games. No point including older dx9 games.

Not even close!!! :eek: I can't even believe this was taken seriously in this thread.

DX11 @ 1920x1080 = HD6970 is 34% faster than a stock HD5850.

HD6970 is 11-12% faster in DX11 over HD5870 (unless you are benchmarking Unigine Heaven).

Of course the HD6970 hardly overclocks beyond 880MHz on the reference design, while the HD5850 has 25-30% overclocking headroom. It only takes 850MHz on the HD5850 to match an HD5870. And beyond that, you are only closing the gap on the HD6970.

You may want to take a look at this review where an overclocked HD5850 whoops the stock HD5870.

If the HD7870 is only 10% faster than an HD6970, then for those with overclocked HD5850s @ 850MHz or above, that's not much of an upgrade, certainly not worth spending $200-250 for another 20-25% more performance. And if your HD5850 has higher clock speeds than 850MHz, the gap is even smaller. Of course you can also overclock the HD7870, but I would say the minimum worthwhile upgrade for HD5850 users will be the HD7950, unless you think dropping $200+ for an extra 25% in GPU performance is worthwhile over a 2-year-old videocard... you can't be serious?
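Putting those estimates together (nothing more than the rough percentages above):

```python
# Back-of-envelope using the rough percentages above (estimates, not benchmarks).
oc_5850 = 1.0              # overclocked 5850 @ ~850MHz, taken as roughly a stock 5870
hd6970  = oc_5850 * 1.115  # 6970 assumed 11-12% faster than a 5870 in DX11
hd7870  = hd6970 * 1.10    # rumored 7870 assumed 10% faster than a 6970

print(f"Rumored 7870 over an overclocked 5850: about +{(hd7870 - 1) * 100:.0f}%")  # ~+23%
```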
 
Last edited:

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
8000MHz-and-above XDR2 has been out at least 5 years, so surely costs have become reasonable by now. And where else can they go with GDDR5 on a 256-bit bus? The fastest available is 7000MHz, so even if they ran it at full speed that would only mean a 27% increase in bandwidth, and realistically they would run it at least 200-300MHz below that, resulting in less than a 25% bandwidth increase for a next-gen product.

I don't think current high-end cards are bandwidth starved, and even if the memory bandwidth is only a modest 25% increase in next-gen parts, I still don't think they'll be bandwidth starved in most situations when new high-end parts come out.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Of course the HD6970 hardly overclocks beyond 880MHz on the reference design, while the HD5850 has 25-30% overclocking headroom. It only takes 850MHz on the HD5850 to match an HD5870. And beyond that, you are only closing the gap on the HD6970.

Just out of curiosity, where do you get this? I bought a 6970 a week after its release and it overclocked to 950/1450 without a blink of an eye, default voltage. I've run it at that clock nearly 24/7 since I got it with no issues in games. Have a couple buds with similar lack of problems. Only thing is I have to manually set the fan, since CCC's fan throttling is garbage.

I've seen stories at overclock.net of people getting to 1000 on the GPU without any problems, and I'm pretty sure mine could push further if I tried. Are others having issues (just curious)? Maybe it's a problem with a specific brand? Hmm.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Just out of curiosity, where do you get this? I bought a 6970 a week after its release and it overclocked to 950/1450 without a blink of an eye, default voltage. I've run it at that clock nearly 24/7 since I got it with no issues in games.

950 vs. 880 is only an 8% overclock. Of course HD6970 won't scale linearly. In fact its scaling is pretty horrendous.

http://www.xbitlabs.com/articles/graphics/display/msi-vt3d-radeon-hd6970_6.html#sect3

So basically you may get up to 8% faster performance with a 950MHz 6970, but far less on average. An HD5850 @ 850MHz is already ~ a 5870. Basically, there is no way upgrading from an overclocked 5850 to a $200+ 7870 is going to be worth it unless the 7870 has 20-30% overclocking headroom.

Of course, we don't really know the actual specs of the 7870 (so just conjecture).
 

WMD

Senior member
Apr 13, 2011
476
0
0
Just out of curiosity, where do you get this? I bought a 6970 a week after its release and it overclocked to 950/1450 without a blink of an eye, default voltage. I've run it at that clock nearly 24/7 since I got it with no issues in games. Have a couple buds with similar lack of problems. Only thing is I have to manually set the fan, since CCC's fan throttling is garbage.

I've seen stories at overclock.net of people getting to 1000 on the GPU without any problems, and I'm pretty sure mine could push further if I tried. Are others having issues (just curious)? Maybe it's a problem with a specific brand? Hmm.

That is a great overclock. How much do you get for the 3DMark Vantage GPU score at stock and overclocked?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
A more recent review: July 2011

http://www.guru3d.com/article/his-radeon-6970-iceq-mix-review/18

Crysis 2 and Metro.

Around 50%.

First of all, in Metro 2033 the performance is unacceptable on the 5850/5870/6970 in the benches you linked:

Metro 2033
5850 = 19
5870 = 23
6970 = 28 (still unplayable)

^ You need at least HD6870s in CF to break 40 fps avg, which is still barely acceptable imo for an FPS. So there is simply no way that a 70MHz-faster-clocked HD6970 (i.e., 7870) is going to make this game more playable for an HD5850 user at 1920x1200 at the settings they used.

In Crysis 2, there is no 5850 linked, but there is the 5870:
5870 = 32
6970 = 36 (+12.5%)
:hmm:

^ Both are still way too slow. It's unlikely the HD7870 is going to be getting 50-60 fps here. In other words, the HD7870 will still not be good enough to make Crysis 2 playable at 1920x1080 if it's a slightly refreshed 6970.

Plus, if you are going to strictly discuss DX11 performance, then why in the world would an HD5850 user upgrade to an HD7870 for DX11 performance when NV has a huge lead in Crysis 2? If you are going to discuss the merits of upgrading solely for DX11 with Tessellation in games like Metro 2033/Crysis 2, the most logical scenario is to either get an HD7950 (which hopefully has fixed AMD's Tessellation deficit courtesy of GCN architecture), OR wait for GTX600 series which will probably double HD6970's Tessellation performance.

Secondly, no one here said anything about a stock HD5850. Notice the HD6970 is only 21% faster than an HD5870 in Metro 2033, yet the game is still horribly slow. And in Crysis 2, you gain 4 fps... wow!

Bottom line is, if HD7870 is a refreshed 6970 with 70mhz higher GPU clock speeds, it won't make any game more playable by a lot more than 1 filter stage (i.e., 2AA --> 4AA or 4AA --> 8AA). Sorry, but that's simply not worth the $200 price for most people with an overclocked HD5850.
 
Last edited:
Feb 19, 2009
10,457
10
76
I'm not discussing the merits of it; it was to work out how the maths on percentages should be done properly. :)

Your point is very true. It doesn't make sense to release a 28nm mid-range part that only performs near 6970 speeds, since the 6xxx series was a compromise. The true 28nm mid-range should really be compared to what the 6970 would have been if it had been made on 32nm.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Bottom line is, if HD7870 is a refreshed 6970 with 70mhz higher GPU clock speeds, it won't make any game more playable by a lot more than 1 filter stage (i.e., 2AA --> 4AA or 4AA --> 8AA). Sorry, but that's simply not worth the $200 price for most people with an overclocked HD5850.

Edit: I guess I'll have to agree with this. I'm hoping AMD really pulls ahead with Tahiti XT though; they'll have a pretty big lead on Nvidia if so. The future of PC gaming is scary right now: PC game sales are lackluster, and the industry is driven by AAA titles, which forces developers to make cross-platform games or not get published. Piracy for single-player games is also ridiculously high. Guess that's probably why Nvidia is so enamored with their mobile strategy now.
 
Last edited:

WMD

Senior member
Apr 13, 2011
476
0
0
A more recent review: July 2011

http://www.guru3d.com/article/his-radeon-6970-iceq-mix-review/18

Crysis 2 and Metro.

Around 50%.

Metro 2033 at highest settings with ADOF strongly skews the results with its >1GB VRAM requirement. Not to mention at that setting it's unplayable for any single card. For best tessellation performance go Nvidia. In most other recent games the gain from a 5850-to-6970 upgrade is closer to 25-30%.

Crysis 2 Dx11 30%
http://gamegpu.ru/Action-/-FPS-/-TPS/Crysis-2-v-rezhime-DirectX-11-test-GPU.html

Deus Ex 27% gain
http://gamegpu.ru/Action-/-FPS-/-TPS/Deus-Ex-Human-Revolution-test-GPU.html

BF3 30% gain
http://gamegpu.ru/Action-/-FPS-/-TPS/Battlefield-3-Alpha-test-GPU.html

Witcher 2 28% gain
http://gamegpu.ru/RPG/Rollevye/The-Witcher-2-Assassins-of-Kings-versii-1.2-tect-GPU.html

Dirt2 30% gain
http://www.tweakpc.de/hardware/test...d_6970_hd_6950/benchmarks.php?benchmark=dirt2

BFBC2 23% gain
http://www.tweakpc.de/hardware/test...d_6970_hd_6950/benchmarks.php?benchmark=bfbc2

In non-DX11 games the difference is even less, at 23%:
http://www.tweakpc.de/hardware/test..._hd_6970_hd_6950/benchmarks.php?benchmark=pfd

Also consider that an overclocked 5850 scores about the same as a stock 6970 on 3DMark Vantage GPU score. Such an upgrade may not be worth the money for a 10-20% performance gain.
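Averaging the listed gains (a rough sketch; only as reliable as the individual benchmarks linked above):

```python
# Rough average of the 6970-over-5850 gains listed above, in percent.
dx11_gains = {"Crysis 2": 30, "Deus Ex": 27, "BF3": 30,
              "Witcher 2": 28, "Dirt 2": 30, "BFBC2": 23}

avg = sum(dx11_gains.values()) / len(dx11_gains)
print(f"Average gain across the listed games: ~{avg:.0f}%")  # ~28%
```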
 
Last edited:

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
8000MHz-and-above XDR2 has been out at least 5 years, so surely costs have become reasonable by now. And where else can they go with GDDR5 on a 256-bit bus? The fastest available is 7000MHz, so even if they ran it at full speed that would only mean a 27% increase in bandwidth, and realistically they would run it at least 200-300MHz below that, resulting in less than a 25% bandwidth increase for a next-gen product.

Is there anything around today that is using XDR2?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I'm not discussing the merits of it; it was to work out how the maths on percentages should be done properly. :)

Your point is very true. It doesn't make sense to release a 28nm mid-range part that only performs near 6970 speeds, since the 6xxx series was a compromise. The true 28nm mid-range should really be compared to what the 6970 would have been if it had been made on 32nm.

Pricing may play a huge role. If they manage to release a 120W HD7870 at $239, that's not a great upgrade for 5850 users, but a huge upgrade for 4870 users who didn't quite want to spend $370 on the 6970 when it launched. Also, those prices would fall quickly with rebates, and we could be seeing $199 for HD6970+ performance with half the power consumption. That would be pretty good for a mid-range card. However, I am much more interested in what the 7950/7970 will offer, since I want to see how good GCN is.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
If the HD7870 is ~30% faster than the HD5850, that's nice; it is not the killer performance some would expect, but remember that the HD5770 was slower than the HD4870/90.

The HD7870/50 will have lower power usage and the same or better OC headroom, but the price has to be lower than $200.

The HD7850 at $150 will be a much better upgrade for HD57xx/67xx owners.

HD5850/70 owners had better go with the HD7950 at ~$250-300.