[Rumor] R9 300 series will be manufactured in 20nm!

Feb 19, 2009
10,457
10
76
A $199 Tonga XT with 2048 cores and 1GHz clocks would wipe out the GTX 960. It's a risky strategy, though, since consumers, for the first time in decades, are forgoing the massive performance advantages AMD offers in favour of perf/watt. So it's possible this strategy will backfire badly, because even a card 50-60% faster than the 960 4GB doesn't sell as well...

Efficiency is the new focus.

To the market, an R9 290 with faster-than-Tonga performance for $240 isn't a bargain compared to a 960. That's how much buyers value efficiency.

Until AMD can match NV on efficiency, they will be forced to sell faster parts for less.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Efficiency is the new focus.

To the market, an R9 290 with faster-than-Tonga performance for $240 isn't a bargain compared to a 960. That's how much buyers value efficiency.

Until AMD can match NV on efficiency, they will be forced to sell faster parts for less.

And that is OK. I think someone with a strong background in finance should be making decisions at AMD. Look at Nintendo. They completely flopped with the Wii U, but they sell every single console and game at a profit. That means Nintendo is actually making $ selling each console and its corresponding games, while MS's division keeps bleeding $ with Xbox. Sometimes it's better to have low market share and no popularity but sell products at a profit, so you survive until the next generation, when you'll have a chance to regroup.

Even if AMD's market share goes to 5%, as long as their GPU division is making profits, that's better than having 35% market share and losing millions. I think AMD really just needs to survive until 1H 2017, because I don't see their financial woes changing at all over the next 4 quarters. AMD has stated that they will have 2 new revenue streams with a combined value of $1 billion. I believe these are design wins with MediaTek and new Nintendo console products. However, those won't even show up until late 2016.

Right now, why waste hundreds of millions of dollars redesigning the entire GCN stack on the ancient 28nm node when AMD knows the real new generation is the 14nm/16nm node? Would you redesign the whole stack knowing how short-lived this generation will be? Nope. NV is in a different position since they worked on Maxwell for 3-4 years, which means GM206, GM204 and GM200 all received the attention they deserved. I can't possibly see AMD doing that level of SKU redesign this round.

They should just focus on the R9 380/380X/390/390X and re-badge everything below: throw out the R9 270 as an R9 360 for $99, the R9 270X as an R9 360X at $119, the R9 285 as an R9 370 at $149, a 2048 SP Tonga XT 4GB as an R9 370X at $199, and be done with it. I would love to be wrong, but I don't see AMD doing anything dramatic on the low end.

Considering just HOW late AMD is with this round, I think they'd better preserve as much cash flow as possible for 14nm/16nm against Pascal. I pretty much call this generation a write-off for them. They are almost a year late to the 750/750 Ti, soon will be a year late to the 970/980, and are 6 months late to the 960. Might as well just sell whatever they can at a profit and establish a strong image with the R9 380-390 series. Maybe they can even throw out an R9 280X 3GB for $149 and wreak total havoc on the entire market at that level.
 
Last edited:

smackababy

Lifer
Oct 30, 2008
27,024
79
86
Sorry guys, can't give out my sources, but they confirm R9 300 series will be more stronk than over 9000 of anything team green can come out with!
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,476
136
And that is OK. I think someone with a strong background in finance should be making decisions at AMD. Look at Nintendo. They completely flopped with the Wii U, but they sell every single console and game at a profit. That means Nintendo is actually making $ selling each console and its corresponding games, while MS's division keeps bleeding $ with Xbox. Sometimes it's better to have low market share and no popularity but sell products at a profit, so you survive until the next generation, when you'll have a chance to regroup.

Even if AMD's market share goes to 5%, as long as their GPU division is making profits, that's better than having 35% market share and losing millions. I think AMD really just needs to survive until 1H 2017, because I don't see their financial woes changing at all over the next 4 quarters. AMD has stated that they will have 2 new revenue streams with a combined value of $1 billion. I believe these are design wins with MediaTek and new Nintendo console products. However, those won't even show up until late 2016.

Right now, why waste hundreds of millions of dollars redesigning the entire GCN stack on the ancient 28nm node when AMD knows the real new generation is the 14nm/16nm node? Would you redesign the whole stack knowing how short-lived this generation will be? Nope. NV is in a different position since they worked on Maxwell for 3-4 years, which means GM206, GM204 and GM200 all received the attention they deserved. I can't possibly see AMD doing that level of SKU redesign this round.

They should just focus on the R9 380/380X/390/390X and re-badge everything below: throw out the R9 270 as an R9 360 for $99, the R9 270X as an R9 360X at $119, the R9 285 as an R9 370 at $149, a 2048 SP Tonga XT 4GB as an R9 370X at $199, and be done with it. I would love to be wrong, but I don't see AMD doing anything dramatic on the low end.

Considering just HOW late AMD is with this round, I think they'd better preserve as much cash flow as possible for 14nm/16nm against Pascal. I pretty much call this generation a write-off for them. They are almost a year late to the 750/750 Ti, soon will be a year late to the 970/980, and are 6 months late to the 960. Might as well just sell whatever they can at a profit and establish a strong image with the R9 380-390 series. Maybe they can even throw out an R9 280X 3GB for $149 and wreak total havoc on the entire market at that level.

Complete rubbish. If AMD is losing money at 25% market share, how the heck are they going to make any money at 5%? AMD needs a new GPU stack to gain back market share and stop the bleeding. AMD is profitable as long as they have a market share of 35-40%, so they will strive to provide competitive products and get back to 35-40% share. AMD will focus on the notebook GPU market, where they have the lowest share. I am sure that the R9 380X, aka Fiji XT, will sport HBM, as the benefit of HBM is most needed in notebook GPUs, where AMD is power constrained. Nvidia, with their superior memory bandwidth efficiency, have been able to avoid the need for HBM with GM204.

Btw, your theory of rebadges is horrible. Tonga is a proof-of-concept GPU. It's not a balanced GPU, as it does not improve perf/sq mm over Tahiti, which launched 32 months earlier, and it's not a cost-effective GPU at 360 sq mm. Selling a 360 sq mm GPU at USD 149-199 is not good for margins when the competitor has a 230 sq mm GPU selling in the same price range. AMD designed Tonga to serve 2 purposes: as a test bed for architectural improvements, and to be able to sell a 256-bit GPU with Tahiti-like performance for Apple's notebook requirements, as PCI-E MXM is restricted to GPUs with a 256-bit memory bus.

http://en.wikipedia.org/wiki/Mobile_PCI_Express_Module#2nd_generation_configurations_.28MXM_3.29

AMD needs new chips specifically targeted at good perf/watt and perf/sq mm at the right price points. That's why I strongly believe AMD needs new chips across the stack. More importantly, these chips should be competitive with Maxwell at perf/sq mm, as that's how AMD can improve margins.

R9 390X Bermuda XT - 4096 SP, 4 shader engines, 4 raster engines, 4 tessellation engines, 64 or 128 ROPs, 8 ACE, 8 GB HBM, <= 550 sq mm
R9 380X Fiji XT - 3072 SP, 4 shader engines, 4 raster engines, 4 tessellation engines, 64 ROPs, 4 or 8 ACE, 4 GB HBM, <= 400 sq mm
R9 370X Trinidad XT - 1536 SP, 2 shader engines, 2 raster engines, 2 tessellation engines, 32 ROPs, 2 ACE, 4 GB HBM, <= 250 sq mm

This GPU stack would allow AMD to realize good margins, and I expect it to be their next-gen GPU stack.

Btw, the majority of GPU volume in 2016 will be on the 28nm node. Both GPU vendors are unlikely to ship their first 16/14nm GPUs before mid-2016. Even then, the 16/14nm ramp will take many quarters, and the overall volume of 28nm GPUs sold in 2016 will be much higher than 16/14nm; I am guessing the ratio will be 3:1 for the full year. For H1 2016, 28nm will be almost 90-100% of GPU volume, and only by Q1 2017 do I expect 16/14nm GPUs to overtake 28nm GPUs in volume. So you want AMD to bleed for another 18 months with an outdated GPU stack? What an illogical statement.

You have to realize that there is a race for 16/14nm capacity and not everyone is going to get sufficient volume in 2016.

http://www.anandtech.com/show/9180/...results-strong-q2-but-lower-forecast-for-2015

"Qualcomm actually had negative cash flow for the quarter. The already mentioned fine paid to China accounted for some of it, and Qualcomm also performed a prepayment of $950 million to secure long-term capacity from one of their suppliers. "

Neither Nvidia nor AMD can afford to pay that kind of money to secure long-term leading-edge capacity :biggrin: Apple and Qualcomm will get the lion's share of TSMC/Samsung 16/14nm wafers for 2016; the rest, like AMD, Nvidia and MediaTek, will be fighting for leftovers. In fact, TSMC stated in one of their earnings calls last year that 16FF and 16FF+ combined will overtake 28nm capacity only in 2017.
 
Last edited:

at80eighty

Senior member
Jun 28, 2004
458
5
81
guess OP wasn't invited to the super secret AMD intergalactic webinar - truth is it is going to be powered by cryocooled nanosized quasars. My imaginary source can beat up your imaginary source
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
But imagine if the R9 370X is a 270X re-badge and sells for $109-119; that's 43% faster than the $120 GTX 750 Ti. That means if you want a budget gaming card, you will not be able to beat this type of value.

The reason why the GTX 750 Ti is so popular is that it has very low power requirements and doesn't need a PCIe power connector. Therefore, you can slap it in any crappy OEM system and get acceptable performance in most games. AMD can't match that with Pitcairn as-is. They would need an architectural respin and/or die shrink to 20nm to do it. Pitcairn isn't selling well now, and won't sell well at $109-$119. Besides, it's not clear that a price point that low would even be profitable for AMD. Keep in mind that GM107 is a 148 sq. mm chip, while Pitcairn is 212 sq. mm. Also, Pitcairn has a memory bus twice as wide.
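
To put that die-size gap into rough numbers, here is a quick back-of-the-envelope sketch using the standard dies-per-wafer approximation. The 300 mm wafer size is assumed, and yield, scribe lines and actual wafer pricing are ignored, so treat the output as illustrative only:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """First-order dies-per-wafer estimate; ignores yield, scribe lines and edge exclusion."""
    radius = wafer_diameter_mm / 2.0
    return int(math.pi * radius ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

# Die sizes quoted in the post: GM107 ~148 sq. mm, Pitcairn ~212 sq. mm
for name, area in [("GM107", 148.0), ("Pitcairn", 212.0)]:
    print(f"{name}: roughly {dies_per_wafer(area)} candidate dies per 300 mm wafer")
```

On those assumptions the 148 sq. mm GM107 comes out to roughly 45% more candidate dies per wafer than the 212 sq. mm Pitcairn, and that is before the narrower memory bus (cheaper board and memory) is even counted.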

One can argue that because NV's sub-$300 cards are so weak in performance, AMD can almost get away with straight-up re-badging everything from Pitcairn to Tonga XT. A $199 Tonga XT with 2048 cores and 1GHz clocks would wipe out the GTX 960. It's a risky strategy, though, since consumers, for the first time in decades, are forgoing the massive performance advantages AMD offers in favour of perf/watt. So it's possible this strategy will backfire badly, because even a card 50-60% faster than the 960 4GB doesn't sell as well...

AMD will be crucified by the review sites for rebadging this many items, and rightly so. If they do it, it's basically an admission of failure - that they can't compete and have to blow out their inventory at fire-sale prices. We've seen this death spiral on the CPU side, and it would be very discouraging to see it on the GPU side of things as well.

At the very least, they need a new chip to replace Pitcairn - ideally one that can run without a supplemental power connector.
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
Got to love all the sarcastic comments here.
Nobody has anything that can disprove it, not even an argument, yet people are certain 20nm won't happen.

Way to be closed-minded.
If AMD can make smaller and more efficient chips on 20nm, that's a way to make up for the rebranding that's supposed to happen on everything below the R9 390.
Right now AMD has nothing to counter Maxwell with on mobile, because Tonga is a 125W chip. If they can manufacture that chip on 20nm and up the clocks, suddenly it doesn't look so bad.

It may be a way of using fewer resources to beat/match Maxwell, instead of spending thousands of man-hours of engineering and money on a new architecture on 28nm, while prepping for 16nm next year, which is when GCN 2.0 will be released.
 
Last edited:

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
Oh please, stop it with the stupid examples. I said nobody has any arguments for why 20nm will never happen, which means they have no basis for their certainty that 20nm will never happen.

Hive mind and unreflective outbursts, that's what it is.
 
Last edited:

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
Oh please, stop it with the stupid examples. I said nobody has any arguments for why 20nm will never happen, which means they have no basis for their certainty that 20nm will never happen.

Hive mind and unreflective outbursts, that's what it is.

We know what you said. You, however, don't seem to understand that you are using a logical fallacy.
 
Aug 11, 2008
10,451
642
126
That's not how the burden of proof works. This isn't Sunday school.

Exactly. If someone posts a claim that goes against the generally accepted information (that there will be no 20nm dGPUs), the burden of proof is on the one making the claim to cite credible sources proving the contrary.

Not "well I have a top secret, inside source, but I wont tell you who it is. It is up to you to prove me wrong." Whether the information is correct or not, it is not surprising that a post of this tone is met with questioning and sarcasm.
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
Oh please, stop it with the stupid examples. I said nobody has any arguments for why 20nm will never happen, which means they have no basis for their certainty that 20nm will never happen.

No basis? This is not really correct. There have been several articles posted indicating that 20nm GPUs are not going to be released, and that the process is only suitable for small, low-power chips.
Fudzilla: 20nm node broken for GPUs
Fudzilla: 20nm GPUs not happening
ExtremeTech: AMD, Nvidia both skipping 20nm GPUs [...]
Digital Trends: AMD may skip 20nm production, head straight to 16nm FinFET

On the other hand, there was a public statement by Lisa Su indicating that AMD would be making some kind of 20nm products at some point in 2015. But this could be anything - die-shrunk cat cores, for example.

I did a search for 20nm APU rumors, and all of them seem to come from 2014. In contrast, the "20nm not happening" reports are mostly from 2015. If a console APU can be done on 20nm, then a discrete GPU probably can, too. But the newer reports indicate that AMD experimented with 20nm and wasn't happy with the process. So the older rumors may have correctly indicated that they tried, and the new rumors may also be correct in indicating that they couldn't make it work. I just hope they didn't throw too much time and effort at a failed die shrink at the expense of architectural improvements that may be the only way forward if they're stuck on 28nm.

One thing is for sure: letting the GPU product line stagnate for 18 more months is not an option. AMD will be a joke by then, a punchline, if they come to market with nothing and try to ride the Pitcairn horse into late 2016.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Oh please, stop it with the stupid examples. I said nobody has any arguments for why 20nm will never happen, which means they have no basis for their certainty that 20nm will never happen.

Hive mind and unreflective outbursts, that's what it is.

I will try to explain, as briefly as possible, why it won't happen:

The only foundry that has a 20nm process right now is TSMC, and that is 20nm SoC, meaning it is tailored for low-power devices.
The performance/power curve of 20nm SoC is not better than 28nm HP at high frequencies; 20nm SoC was made as an upgrade option over 28nm LP.
So it is pointless to make a 20nm GPU, because the process will not give higher perf/watt, only a die-area reduction, and it comes with higher cost (re-design, lower yields, higher wafer cost).
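
A rough sketch of the cost half of that argument. Every input below (wafer prices, defect densities, the assumed area shrink) is a made-up placeholder rather than real foundry data; the point is only that a smaller die does not automatically mean a cheaper die once wafer cost and yield move against you:

```python
import math

def dies_per_wafer(area_mm2: float, wafer_diameter_mm: float = 300.0) -> float:
    """First-order dies-per-wafer estimate; ignores scribe lines and edge exclusion."""
    radius = wafer_diameter_mm / 2.0
    return (math.pi * radius ** 2 / area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2.0 * area_mm2))

def poisson_yield(area_mm2: float, defects_per_cm2: float) -> float:
    """Simple Poisson yield model: Y = exp(-A * D0), with the area converted to cm^2."""
    return math.exp(-(area_mm2 / 100.0) * defects_per_cm2)

def cost_per_good_die(area_mm2: float, wafer_cost_usd: float, defects_per_cm2: float) -> float:
    return wafer_cost_usd / (dies_per_wafer(area_mm2) * poisson_yield(area_mm2, defects_per_cm2))

# Placeholder inputs only -- not actual TSMC numbers.
cost_28nm = cost_per_good_die(360.0, wafer_cost_usd=4000.0, defects_per_cm2=0.10)
cost_20nm = cost_per_good_die(360.0 * 0.60, wafer_cost_usd=6000.0, defects_per_cm2=0.25)
print(f"28nm, 360 sq. mm die:  ~${cost_28nm:.0f} per good die")
print(f"20nm, ~216 sq. mm die: ~${cost_20nm:.0f} per good die")
```

With those placeholder inputs the shrunk 20nm die comes out no cheaper (in fact slightly more expensive) per good die than the 28nm one, which is the cost half of the argument; the perf/watt half depends on the actual 20nm SoC vs 28nm HP process curves and is not something a toy model can capture.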
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
Another unfounded rumor from the OP, from crappy sources and sites, that YOU have to disprove, lulz.
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
I've registered just to post this:

http://www.chiphell.com/thread-1196441-1-1.html

They were completely spot on with the performance of the Fiji GPU and the Titan X GPU, and they were also talking about GlobalFoundries' 20nm process for next-gen AMD GPUs.

Looks like they were really well informed.

We don't even know Fiji's performance, and they actually understated GM200's performance. Not that it would have been hard to guess how GM200 would perform, as it was clearly going to be 50% larger than GM204.
 

Glo.

Diamond Member
Apr 25, 2015
5,930
4,991
136
Simplest answer: drivers.

The post was from mid-December last year. The Titan X was released months later.

Everything else was spot on: Fiji is a water-cooled GPU, 65% faster than the R9 290X - if we believe the slides that were leaked to the internet a few months later...
 

digitaldurandal

Golden Member
Dec 3, 2009
1,828
0
76
Simplest answer: drivers.

The post was from mid-December last year. The Titan X was released months later.

Everything else was spot on: Fiji is a water-cooled GPU, 65% faster than the R9 290X - if we believe the slides that were leaked to the internet a few months later...

We don't even know the performance of Fiji yet.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,818
1,553
136
You don't prove negatives.

Funnily enough, this is itself a negative statement. You can't prove existential negatives when the conditions are broad enough that non-existence would be impossible to observe -- I can't prove there isn't a giant flying spaghetti monster somewhere in space, but I can observe that there are no unicorns in my room. Of course, there are many negative mathematical proofs.
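
As one standard illustration of a negative proof (the classic irrationality argument, included here purely as an example of proving non-existence):

```latex
% Classic non-existence proof: there is no rational square root of 2.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
\textbf{Claim.} There are no integers $p, q$ with $q \neq 0$, $\gcd(p, q) = 1$ and $(p/q)^2 = 2$.

\textbf{Proof.} Suppose such $p, q$ exist. Then $p^2 = 2q^2$, so $p^2$ is even and hence $p$ is even;
write $p = 2k$. Substituting gives $4k^2 = 2q^2$, i.e.\ $q^2 = 2k^2$, so $q$ is even as well.
Then $2$ divides $\gcd(p, q)$, contradicting $\gcd(p, q) = 1$. Hence no such $p, q$ exist.
\end{document}
```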

That said, as soon as the OP claimed "Nobody has anything that can disprove it, not even an argument, yet people are certain 20nm won't happen", he started arguing from ignorance, a logical fallacy.
 
Last edited: