Possible hd8000 specs and die sizes

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
http://uk.hardware.info/news/31864/possible-specs-amd-radeon-hd-8000-surface

If these rumors are to be believed, then Sea Islands will likely be more than the small tweaks and slight performance bump that Cayman was over Cypress. 2560 cores (a 25% bump), probably a slight improvement in perf/watt... we could be looking at some seriously powerful GPUs on the same process node. Also, I can't find where it was said, but the rumored die size of the highest-end HD 8000 GPU is ~410mm^2, the biggest GPU made by AMD/ATI in a long, long time.

Bring on the GPUs! I'm hoping for some early price wars.
 

Mopetar

Diamond Member
Jan 31, 2011
8,517
7,777
136
I would imagine that they managed to get the power draw under more control if they're going to add that much additional hardware (also 50% more ROPs) and it would also suggest that they're more comfortable with the new process if they're going to be building bigger chips.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
The question remains whether these leaked specifications can be trusted. The website refers to ATI graphics cards, and that brand name has not been used for more than two years since AMD acquired it. Also, the information about the mid-range HD 8800 resembles specifications that were leaked in September. Lastly, the Spanish version of BitDreams is unknown to us, and as far as we know not really an authority when it comes to hardware news and information. The chance that these specifications are accurate is about 50/50, we'd say. In any case, thanks to Fred Brabander for tipping us off.

It's not time for silly season again already.
 

BD231

Lifer
Feb 26, 2001
10,568
138
106
Certainly blows the hell out of those 15% figures. It'd be nice if AMD forced Nvidia to stop slacking in the gaming market.
 

nforce4max

Member
Oct 5, 2012
88
0
0
It is nice to see the Red team finally putting out a true mega chip for the first time since the days of the 2900 XT, and if the clocks hold up to be true as well, it could be a hugely popular choice in the high end. As for me, with my small budget I will be aiming for the 8770 or 8850 if the price of first-gen GCN isn't low enough.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
A 294mm^2 die on a cheap PCB for $500; surely they can do much better. Let's hope so, at least.

They don't seem to care about the single-GPU performance crown anymore; cutting out the top end and giving us a sub-200-watt part kinda blows.

Wow, I have heard it all. People here were touting small AMD die sizes as features for the past several years, as if die size mattered to end users at all. Now people are pissed at how small Nvidia's dies are, again as if it matters at all. I guess it's just hip to hate Nvidia chip layouts. :rolleyes:

It wasn't until October that AMD definitively gained the outright single-GPU performance crown. HardOCP just tested GTX 680 SLI vs. HD 7970 GHz Edition CFX and said the SLI setup provides the best experience. Nvidia is not slacking at all. If they were, all the marketing in the world wouldn't matter.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
The chance that these specifications are accurate is about 50/50 we'd say

Yea... it can be accurate or inaccurate, one out of two... LOL

AMD doesn't make these fancy 192/448-bit buses, and I don't think they will change that this time. Even if they did, the memory capacity would be different: if a 128-bit card has 2GB, then a 192-bit one will have 3GB.
 

The Alias

Senior member
Aug 22, 2012
646
58
91
Nvidia is not at all slacking. If they were, all the marketing in the world wouldn't matter.
Yes the hell it does! Haven't you read R.s's ridiculously detailed posts illustrating what happened in the 5800, 280, and 480 generations, where in each gen Nvidia was slacking severely in a truly relevant metric yet still garnered more sales?
Even in this gen, there were only three months out of almost twelve where the 680 was a better choice than the 7970 (and even then the 7970 had bitcoin mining to still make it a worthwhile purchase), and guess who still sold more. The reason why is marketing. Come on, man, you know better than to think that.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
Over 400mm^2 in AMD's precarious financial situation? Sadly, I don't think so. I would love a single card to replace my setup without giving up top-end performance.
 
Feb 19, 2009
10,457
10
76
They should have done this a long time ago; top-dog status commands top pricing, and margins for the entire lineup. Making your brand line up with "the best" in consumers' minds is what Intel and NV have done, and even when they failed to deliver, as history recalls, they still sold heaps of inferior products for good $$. In this sense, they can't fail even when they do.

AMD is synonymous with budget and value, not "the best" by a long shot, and they suffer for this with lower sales volumes and much thinner margins.
 
Feb 19, 2009
10,457
10
76
Bring on the GPUs! I'm hoping for some early price wars.

All great and good, but would you ever buy an AMD GPU? If it was significantly faster, with comparable pricing to NV, would you buy it? Or would you wait a while hoping NV reduces their prices further to match the perf/$, and then buy the NV GPU?
 

hjalti8

Member
Apr 9, 2012
100
0
76
Wow, I have heard it all. People here were touting small AMD die sizes as features for the past several years, as if die size mattered to end users at all. Now people are pissed at how small Nvidia's dies are, again as if it matters at all. I guess it's just hip to hate Nvidia chip layouts. :rolleyes:

Of course it matters. A larger die usually means more performance, and more overclocking headroom due to lower stock clocks.
 

Mopetar

Diamond Member
Jan 31, 2011
8,517
7,777
136
All great and good, but would you ever buy an AMD GPU? If it was significantly faster, with comparable pricing to NV, would you buy it? Or would you wait a while hoping NV reduces their prices further to match the perf/$, and then buy the NV GPU?

Brand loyalty?

Some people have had good experiences with some brands and like to continue doing business with them, even if they don't provide the objectively best deal available at the moment.
 

The Alias

Senior member
Aug 22, 2012
646
58
91
Brand loyalty?

Some people have had good experiences with some brands and like to continue doing business with them, even if they don't provide the objectively best deal available at the moment.
Doing that generation after generation, while the other side consistently offers better value, is quite frankly stupid. Look where that has gotten us: the entire 600 series lineup does not support voltage modification, and the VRM and cooling design on the top-end GTX 680 and 670 is horrible! Brand loyalty has allowed such faulty design to flourish and thrive, and it will continue to as long as you and the others who go along with the okey-doke allow it.
 

Tuna-Fish

Golden Member
Mar 4, 2011
1,688
2,581
136
Yea... it can be accurate or inaccurate, one out of two... LOL

Yep. I think this is total bs. However...

AMD doesn't make these fancy 192/448-bit buses, and I don't think they will change that this time. Even if they did, the memory capacity would be different: if a 128-bit card has 2GB, then a 192-bit one will have 3GB.

As far as I understand it, right now the GCN architecture would allow any bus width that is divisible by 64 bits. There's no reason they couldn't do it; they just haven't built an SKU with those specs.
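For illustration, that 64-bit-channel scaling can be sketched as below. `capacities` is a hypothetical helper, and the 1GB-per-channel figure is just an assumption chosen to match the 128-bit/2GB example quoted above, not a claim about any real SKU:

```python
# Hypothetical sketch: GCN memory controllers are organized as 64-bit
# channels, so any bus width divisible by 64 is buildable. If every
# channel carries the same amount of memory, capacity scales linearly
# with bus width.
def capacities(gb_per_channel=1.0, max_channels=8):
    """Map bus width (bits) -> total memory (GB) for each channel count."""
    return {ch * 64: ch * gb_per_channel for ch in range(2, max_channels + 1)}

# Matches the quoted example: 128-bit -> 2GB implies 192-bit -> 3GB,
# and a 448-bit bus (7 channels) is just as legal as 384-bit (6).
print(capacities()[128], capacities()[192])
```

The same table also shows why the "fancy" 192-bit and 448-bit widths are architecturally unremarkable: they are simply 3-channel and 7-channel configurations.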

Wow, I have heard it all. People here were touting small AMD die sizes as features for the past several years, as if die size mattered to end users at all. Now people are pissed at how small Nvidia's dies are, again as if it matters at all. I guess it's just hip to hate Nvidia chip layouts.
:rolleyes:

You do realize that it's not the same people saying that? The people who wanted more efficient chips with smaller dies are now pretty much happy with the GTX 680, but the people who used to like the large-die GPUs from nV have been let down. nV is "slacking" because it's well known that they could trivially build a GPU that is ~50% or more faster than GK104. We know this because they have done so, with GK110. The choice not to market it to gamers is entirely a marketing one, and it leaves a lot of people sour.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
All great and good, but would you ever buy an AMD GPU? If it was significantly faster, with comparable pricing to NV, would you buy it? Or would you wait a while hoping NV reduces their prices further to match the perf/$, and then buy the NV GPU?

I've owned a Radeon in the past, if that is what you are trying to get at. But to answer your question directly, I would not be against buying an AMD card. I would, however, probably not consider purchasing an AMD card at all if their financial situation continues its course.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
nV is "slacking" because it's well known that they could trivially build a GPU that is ~50% or more faster than GK104. We know this because they have done so, with GK110. The choice not to market it to gamers is entirely a marketing one, and it leaves a lot of people sour.
I hate that NV decided not to give us their big dies and instead threw us scraps in the form of overclocked mid-range cards. If that wasn't bad enough, they even forbade us from playing with their cards by blocking voltage control, and people still bought them in droves, even informed people on this forum. Maybe if people didn't pay top dollar for their mid-range cards, it would send a clear message to release the big dies to consumers. Marketing trumps all. I, for one, won't pay $500 for an overclocked mid-range card with blocked voltage control. If it were $300 and clocked at 750MHz, with overclocking potential similar to the GTX 460 or 7950, then it would be a great deal. Why people allow NV to screw them over is beyond me. And they even feel they got a great deal :D
 

Will Robinson

Golden Member
Dec 19, 2009
1,408
0
0
I've owned a Radeon in the past, if that is what you are trying to get at. But to answer your question directly, I would not be against buying an AMD card. I would, however, probably not consider purchasing an AMD card at all if their financial situation continues its course.

Huh?....wut?o_O
Don't worry....AA and AF still work even if they have a bad quarter...:D
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
It wasn't until October that AMD definitively gained outright single gpu performance.

You keep saying this over and over, and it's incorrect. AMD definitively took the single-GPU performance crown in June 2012. Please go read 10-15 reviews from when the HD 7970 GHz launched. I am not going to repost them here again for the nth time. You can start with the HD 7970 GHz reviews around June 21-22. Also, you conveniently ignored the 1200MHz Sapphire TOXIC that trounced the fastest GTX 680 a long time ago. With the latest drivers, AMD extended the lead, not finally gained it. If you look at recent reviews and compare them to June 2012, the gap has actually widened in favour of the HD 7970 GHz, especially at 2560x1600, where the lead is now more than 10%. Additionally, it's impossible to even talk about the GTX 680 being on the same level as an HD 7970 GHz ever since the Asus Matrix 7970 came out. I mean, it's now miles apart.

As far as those specs go, they are fake. There are major errors in that chart. Here are just a few:

1) HD8950 = 1050MHz x 2304 SPs x 2 ops = 4.84 Tflops SP vs. 4.5 in the chart.

2) The double-precision specs are all wrong. The HD8970 has 5.38 Tflops SP, but they list DP as 1.6 Tflops. That's impossible since GCN has 1/4-rate DP. Even if you use 1/3, it doesn't work.

3) HD8970 memory bandwidth = 6000MHz x 384-bit bus / 8 = 288GB/sec vs. 322GB/sec in the chart.

4) 140 TMUs for the HD8950. That's impossible: each CU is equipped with 4 texture units and 64 stream processors, so to get 2304 SPs for the HD8950 you need 36 CUs, which means 144 TMUs.

5) A 50W power increase separates the HD8870/8950/8970 despite a gigantic gap in specs between the 8870 and 8950. This makes no sense: their clock speeds are so close, yet the power-consumption penalty is the same between each SKU step. It's not logical.

6) An HD8990 at 375W? AMD couldn't get the HD7990 out with two 925MHz HD7970s. Suddenly they are going to launch an HD8970 x2, when a single HD8970 has an even higher TDP than an HD7970? Right...

I am not going to bother looking for more errors. These specs are rubbish. Some of those #s may be right but only by coincidence. The person who put that chart together doesn't understand how GCN architecture even works since even basic spec ratios don't align with how GCN is put together.
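The spot checks above take only a few lines of arithmetic to reproduce. A minimal sketch, assuming the GCN ratios cited in the post (2 FLOPs per clock per SP, 64 SPs and 4 TMUs per CU) and the usual bandwidth formula (effective memory clock times bus width in bits, divided by 8):

```python
def sp_tflops(core_mhz, shaders):
    # Single-precision throughput: clock * SP count * 2 ops per clock.
    return core_mhz * shaders * 2 / 1e6

def bandwidth_gbs(mem_mhz_effective, bus_bits):
    # Effective memory clock (MHz) * bus width (bits) / 8 -> GB/s.
    return mem_mhz_effective * bus_bits / 8 / 1000

def tmus(shaders):
    # Each GCN CU pairs 64 stream processors with 4 texture units.
    assert shaders % 64 == 0, "GCN SP counts come in multiples of 64"
    return (shaders // 64) * 4

# "HD8950" as listed: 1050MHz, 2304 SPs -> ~4.84 Tflops, not 4.5.
print(sp_tflops(1050, 2304))
# "HD8970": 6000MHz effective memory, 384-bit bus -> 288GB/s, not 322.
print(bandwidth_gbs(6000, 384))
# 2304 SPs -> 36 CUs -> 144 TMUs, not 140.
print(tmus(2304))
```

Any leaked spec table whose numbers fail these ratio checks was not derived from the architecture, which is the point being made above.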

I would, however, probably not consider purchasing an AMD card at all if their financial situation continues its course.

That makes a lot of sense considering you keep your GPUs for 3-5 years before upgrading. :thumbsup:

Your post would be a lot more genuine if you said you won't leave NV due to PhysX, TXAA, 3D vision, NV Inspector/driver preferences, or just brand loyalty. You realize who AMD's largest shareholder is? AMD can always tap Mubadala for emergency funds. I guess from now on when we are buying trivial items that cost $300-400 we should do thorough due diligence on the company's balance sheet, income statement, build a DCF, consult industry and strategy experts on the company's going concern, double check its credit rating, perform a probability analysis on the chance of it failing in the next 1-5 years, etc. Sounds like a genuinely worthwhile analysis to hedge against a $400 GPU "investment" even though we'll probably upgrade in the next 2 years anyway to the next best thing for next gen games.
 