Discussion RDNA 2 6nm Refresh


GodisanAtheist

Diamond Member
Nov 16, 2006
6,815
7,171
136

- 6nm would bring 18% higher density (so more dies per wafer) + some performance or efficiency improvements as AMD chooses to tune the dies (performance for desktop, efficiency for laptops, I suppose).
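A quick back-of-the-envelope on what 18% density buys in dies per wafer. This is a sketch with assumed numbers: a ~520 mm² Navi 21-class die on a 300 mm wafer, the whole die scaling perfectly, and the standard edge-loss approximation. Real gains are smaller, since SRAM and analog shrink less than logic:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Gross dies per wafer using the standard edge-loss
    approximation: wafer area / die area, minus a correction
    for partial dies at the wafer edge."""
    r = wafer_diameter_mm / 2
    return math.floor(
        math.pi * r**2 / die_area_mm2
        - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    )

n7 = dies_per_wafer(520)         # ~520 mm^2, roughly Navi 21 on N7
n6 = dies_per_wafer(520 / 1.18)  # same design if all of it shrank 18%
print(n7, n6, f"{(n6 - n7) / n7:.0%}")
```

Under these assumptions the shrink yields roughly a fifth more gross dies per wafer, before accounting for yield.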

My understanding is that 6nm is a clean shrink of 7nm without the complexities and overhead of a full node shrink.

Wonder if AMD will end up running a couple of simultaneous lines: 7nm for their 6000-series mainstream and bulk cards, 6nm for their mid-to-top-end 6000-series S cards, and eventually 5nm for their RDNA3 top-end parts. Shift everything down a tier when 4/3nm or whatever shows up.
 

Ajay

Lifer
Jan 8, 2001
15,454
7,862
136
The MSRP of the 3090 is $1,499.99. The 6900 XT is as good as or better than the 3090. The only area where it really falls behind is ray tracing. IMO, it is a fair price for such a high-end card. You can get cheaper cards; it just depends on your budget. You should get ready, because prices will never come back down to the levels where they are supposed to be.
I haven't traditionally bought top-end cards (not in a long time), so an NV x70/x80-class GPU is typically what I buy. The current prices are jacked through the roof. I do expect cards to keep going up because of advanced lithography costs and the expense of very high-speed GDDR memory (and more of it starting to make a difference). So, if the prices don't drop down to expensive but reasonable levels - I don't know what I'm going to do - stop buying resource-heavy games, I suppose. Nothing against AMD; I just haven't bought one since the original Radeon, which was an absolute disaster - went to NV and never turned back. At least they have competitive cards again. Maybe they'll best NV with RDNA3 - it will be an interesting battle.

Edit: I currently have a 60Hz QHD monitor - that helps keep more resource-intensive games within reach. Hitting 144Hz+ on my 1070 would not be possible.
 

KompuKare

Golden Member
Jul 28, 2009
1,015
930
136
I could see AMD doing this if RDNA3 is late Q4 or Q1, but I wouldn't put money on it. Several prerequisites would need to be met. They need to actually save significant physical chip area (cost), which AFAIK is not a given because of the big cache, which doesn't shrink that easily. The actual design job is not supposed to be hard in this case, but what do I know. It's also a game of competition with NV: they may or may not want to put out new chips that don't have improved RT or better DLSS-like capabilities, which could make them look outdated.
Weren't there rumours that TSMC were keen to push 6nm to their 7nm customers?

While wafer prices are high and have increased, is it possible that TSMC have offered some incentives for customers to adopt 6nm?

I thought the biggest difference between their 7nm and 6nm was that the latter has more EUV layers and hence fewer total steps and shorter lead times. That is, 6nm moves through TSMC's fabs faster, which is potentially better for TSMC and also means more wafers can be finished.
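The cycle-time argument can be sketched with rough numbers. The layer counts and the 1.5 days per mask layer figure below are assumptions for illustration, not TSMC data; the point is that each EUV layer replaces several DUV multi-patterning exposures:

```python
# Illustrative only: layer counts and cycle time per layer are assumptions.
DAYS_PER_MASK_LAYER = 1.5   # rough rule of thumb for fab cycle time

n7_layers = 80              # assumed mask count, DUV multi-patterning
n6_layers = 70              # assumed count once EUV collapses some
                            # multi-pattern layers into single exposures
n7_days = n7_layers * DAYS_PER_MASK_LAYER
n6_days = n6_layers * DAYS_PER_MASK_LAYER
print(f"{n7_days:.0f} vs {n6_days:.0f} days in the fab")
```

Even a modest reduction in mask layers shaves weeks off wafer turnaround, which is extra effective capacity for TSMC without building anything new.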
 

coercitiv

Diamond Member
Jan 24, 2014
6,201
11,903
136
I agree, but my thoughts are they won't return to yesterday's prices.
On this I agree, though I still insist a big portion of the price increase is artificial. I'll expand on the food-supply example so everyone can understand why:

One year, a certain type of crop yielded particularly bad results in Europe. Factories used a mix of local and imported supplies, so naturally they attempted to compensate with more imports and warned distributors of price increases. But who saw the opportunity first? Realizing Europe's crops were really bad, some scalpers decided to bet a lot of money and bought up crops faster than the legitimate actors could. Prices rose sharply as businesses scrambled to keep production going. There was a new middleman in the chain, asking for a big cut or else...

Back in my country, a newspaper was criticizing the government for the ever-present inflation, using prices for goods and services as examples. One such example was... the product described above. The next year Europe got good crops, and prices adjusted "back" to the real level of inflation (not the same price, obviously; the newspaper had a safe bet going).

For semiconductors, barring a breakthrough, which by definition is unforeseen, per-transistor costs seem to have reached a low and are now rising, even if you exclude the other factors in general inflation. Chiplets are a strategy to slow this trend.
Again, I agree, but the current status quo has little to do with this trend. People are paying $200-300+ for cards built on previous nodes - old tech that by now has lower production costs. Look at the GTX 1650, a 3-year-old card: launched at $149, now $300+ in stores. It should be clear that ever-increasing semiconductor R&D costs have no strong correlation with this particular phenomenon; we're debating the global rise of sea levels while witnessing an earthquake-induced tsunami.
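One way to see why chiplets slow the cost trend is a toy Poisson yield model. The defect density and die sizes below are assumed for illustration, not real foundry numbers:

```python
import math

D0 = 0.001  # assumed defect density: 0.001 defects/mm^2 (0.1/cm^2)

def silicon_per_good_die(total_area_mm2, n_chiplets=1):
    """mm^2 of wafer consumed per fully working product under a
    Poisson yield model; defective chiplets are discarded one at
    a time instead of sinking the whole product."""
    a = total_area_mm2 / n_chiplets
    y = math.exp(-D0 * a)        # yield of one chiplet of area a
    return n_chiplets * a / y    # silicon burned per good product

mono = silicon_per_good_die(500)                 # one 500 mm^2 die
split = silicon_per_good_die(500, n_chiplets=4)  # four 125 mm^2 chiplets
print(f"{mono:.0f} vs {split:.0f} mm^2 per good product")
```

A defect kills only a small chiplet instead of a huge monolithic die, so the same 500 mm² of compute burns noticeably less wafer area per working product.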
 
  • Like
Reactions: Lodix

eek2121

Platinum Member
Aug 2, 2005
2,930
4,026
136
I haven't traditionally bought top-end cards (not in a long time), so an NV x70/x80-class GPU is typically what I buy. The current prices are jacked through the roof. I do expect cards to keep going up because of advanced lithography costs and the expense of very high-speed GDDR memory (and more of it starting to make a difference). So, if the prices don't drop down to expensive but reasonable levels - I don't know what I'm going to do - stop buying resource-heavy games, I suppose. Nothing against AMD; I just haven't bought one since the original Radeon, which was an absolute disaster - went to NV and never turned back. At least they have competitive cards again. Maybe they'll best NV with RDNA3 - it will be an interesting battle.

Edit: I currently have a 60Hz QHD monitor - that helps keep more resource-intensive games within reach. Hitting 144Hz+ on my 1070 would not be possible.
x70 and x80 prices are high due to limited supply and tons of demand. I am surprised people aren't snatching up the 6900 XT. I don't normally mine, but when I purchased my 3090 I started mining in my spare time until it was paid off. Even if I hadn't done that, it is a great card. It pushes all my games up to 120-144Hz at 5120x1440 or higher. It is a beast at ML as well.

I am sure the 6900XT is also a great card.
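For anyone curious how "mining until it was paid off" pencils out, here's the arithmetic with purely hypothetical boom-era numbers (the daily revenue, power draw, and electricity price are all assumptions):

```python
# Purely hypothetical 2021-era numbers; not a mining guide.
card_cost = 1500.0                    # $ paid for the card (assumed)
daily_revenue = 6.00                  # $ of coin mined per day (assumed)
daily_power_cost = 0.300 * 24 * 0.12  # 300 W around the clock at $0.12/kWh

payback_days = card_cost / (daily_revenue - daily_power_cost)
print(f"~{payback_days:.0f} days to pay the card off")
```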
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,815
7,171
136
because there isn't much of a difference to a 6800xt except in price.

- Even the prices are very similar. I think it's about a ~$100 difference between the 6800 XT (~$1500) and the 6900 XT (~$1600). Frankly, I'd personally go with the 6900 XT in that case: you're already overspending by a landslide, but suffering a much smaller overall % price increase from MSRP (even if the original MSRP was kind of a joke in the first place, with the 6900 XT not really justifying its $1000 price tag over the 6800 XT's ~$649).
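The percentage argument checks out. Using the MSRPs quoted above and the ~$1500/$1600 street prices:

```python
msrp   = {"6800 XT": 649, "6900 XT": 1000}   # launch MSRPs from the post
street = {"6800 XT": 1500, "6900 XT": 1600}  # rough street prices cited

markup = {card: (street[card] - msrp[card]) / msrp[card] for card in msrp}
for card, m in markup.items():
    print(f"{card}: {m:.0%} over MSRP")
```

So the nominally cheaper card carries roughly double the percentage markup of the halo part.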
 
  • Like
Reactions: Tlh97 and scineram

fleshconsumed

Diamond Member
Feb 21, 2002
6,483
2,352
136
- Even the prices are very similar. I think it's about a ~$100 difference between the 6800 XT (~$1500) and the 6900 XT (~$1600). Frankly, I'd personally go with the 6900 XT in that case: you're already overspending by a landslide, but suffering a much smaller overall % price increase from MSRP (even if the original MSRP was kind of a joke in the first place, with the 6900 XT not really justifying its $1000 price tag over the 6800 XT's ~$649).
Not really - the Asus 6800 XT TUF was available for like a whole day a couple of days ago at $1200, and it does regularly come in stock. I briefly entertained getting it and selling my 3070 FE on eBay but ultimately decided not to, as I'm just not sure it's worth the hassle.
 

Mopetar

Diamond Member
Jan 31, 2011
7,837
5,992
136
Either Intel Arc is bad and there's no need for a refresh, and/or it's simply not worth it if RDNA3 really launches in Q4 with "availability".

Seems like a refresh could be worth it if AMD used it for the mid-range chips since any RDNA 3 GPUs we get this year are going to be the top end parts. Better to get max value out of those 5nm wafers since we already know GPUs aren't as profitable as using those wafers for more CPUs.

If they devoted more 5nm wafers to those top end parts and used the freed up 7/6nm wafers for more midrange products they could kill two birds with one stone. Of course they don't need to do a 6nm refresh either since they can just keep the same 7nm design. Really their main problem has been just having enough product which a 6nm refresh doesn't change much in the grand scheme of things.
 

jpiniero

Lifer
Oct 1, 2010
14,599
5,218
136
Seems like a refresh could be worth it if AMD used it for the mid-range chips since any RDNA 3 GPUs we get this year are going to be the top end parts. Better to get max value out of those 5nm wafers since we already know GPUs aren't as profitable as using those wafers for more CPUs.

Course if they are going to start at a grand+ for RDNA3 GPUs, there's only so many people willing to pay that much for a gaming card.
 
  • Like
Reactions: Tlh97 and Hotrod2go

KompuKare

Golden Member
Jul 28, 2009
1,015
930
136
Really their main problem has been just having enough product which a 6nm refresh doesn't change much in the grand scheme of things.
But wasn't the whole reason TSMC was so keen to push 6nm to 7nm customers that it has fewer steps and raised throughput?
The same design rules (but obviously not the same mask set) make it easier for TSMC's customers, and higher throughput should mean more wafers. All TSMC then had to do was price 6nm a bit lower than 7nm (possibly by raising 7nm wafer prices?).
Now, moving all Navi 2x dies to 6nm might not make sense, but Navi 23 and 22 might, if the plan is for RDNA3 on 5nm to take the high end and for the 6600 and 6700 to be rebranded and spun for the lower end. Unless SKUs move a lot and the 6nm version of Navi 21 ends up being able to supply the 7600 and 7700.
 

Shivansps

Diamond Member
Sep 11, 2013
3,855
1,518
136
A Cezanne/Barcelo refresh on 6nm would make far more sense than RDNA2, considering they are going to keep using these APUs as low-cost options in 2023 and maybe 2024.
 

Saylick

Diamond Member
Sep 10, 2012
3,162
6,385
136

Well well, RDNA3 N31 and N32 on 5nm and 6nm. N33 on 6nm. I'm surprised MI300 is on 6nm as well.
 

LightningZ71

Golden Member
Mar 10, 2017
1,627
1,898
136
Hmm, this and the previous diagram tell me that Navi 3x is going to look a lot like the Ryzen lineup: up to two 5nm dies with all the compute elements, linked to an IO die on a trailing node, while the low-cost one is monolithic on a trailing node.

I guess my question is: does Navi 31 start out using recovered Navi 30 3-complex CCDs, or does it get a bespoke 2-complex die of its own? I guess it'll probably continue to look like it does now, with the 79 using fully functional dies, the 78 using recovered parts, the 77 using fully functional 31 dies, the 76 using recovered ones, and finally the 75 using another repurposed mobile die that's missing a bunch of important stuff and has limited IO.
 

maddie

Diamond Member
Jul 18, 2010
4,740
4,674
136

Well well, RDNA3 N31 and N32 on 5nm and 6nm. N33 on 6nm. I'm surprised MI300 is on 6nm as well.
I'm curious. Does anything happen to an employee who shares info like this?
 

maddie

Diamond Member
Jul 18, 2010
4,740
4,674
136
What do you think?

Leaks are, mostly, intentional. No one is dumb enough to leak info and risk never getting hired again in the industry.
Who's talking about leaks?

Also, never underestimate human stupidity, even from those who should know better.
 

Mopetar

Diamond Member
Jan 31, 2011
7,837
5,992
136
But wasn't the whole reason TSMC was so keen to push 6nm to 7nm customers that it has fewer steps and raised throughput?
The same design rules (but obviously not the same mask set) make it easier for TSMC's customers, and higher throughput should mean more wafers. All TSMC then had to do was price 6nm a bit lower than 7nm (possibly by raising 7nm wafer prices?).

The problem is that to get those advantages they need a new mask set. There's a cost to that, and they may not intend to produce enough additional dies to make the extra cost worth it, as opposed to just continuing to use the existing 7nm masks.
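The mask-amortization trade-off can be sketched with purely illustrative numbers (real mask-set and wafer prices aren't public, so both figures below are assumptions):

```python
# All numbers are illustrative assumptions, not real TSMC pricing.
mask_set_cost = 10_000_000   # one-off cost of a new 6nm mask set, $ (assumed)
saving_per_wafer = 500       # assumed saving per wafer vs staying on 7nm, $

break_even_wafers = mask_set_cost / saving_per_wafer
print(f"~{break_even_wafers:,.0f} wafers just to pay back the masks")
```

If the remaining planned volume for a given die is below that break-even, re-taping it on 6nm loses money no matter how nice the per-wafer savings look.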

Navi 24 is already using 6nm, so it's entirely possible that's the only chip that will see long-term production on the node, with everything else above it getting moved to 5nm as soon as possible. There's probably a good argument for doing an N23 refresh on 6nm, given it's a high-volume part that wouldn't likely see a 5nm replacement for several months after the top cards launch.

I'm expecting that pricing will increase. Unless mining completely collapses before the end of the year and the market is awash in cards, plenty of people will gladly spend $1,000 on a top-tier card if one were actually available at that price. A few lucky souls may have gotten their 3080 or 6800 XT at or close to MSRP, but most people had to pay almost double, if not more. A $1,000 7800 XT might look like a really good deal in comparison to where things have been. For Nvidia cards, that would just be a return to Turing prices, and we already know plenty of people bought those.
 

eek2121

Platinum Member
Aug 2, 2005
2,930
4,026
136
Either Intel Arc is bad and there's no need for a refresh, and/or it's simply not worth it if RDNA3 really launches in Q4 with "availability".
It may launch earlier than Q4.
Seems like a refresh could be worth it if AMD used it for the mid-range chips since any RDNA 3 GPUs we get this year are going to be the top end parts. Better to get max value out of those 5nm wafers since we already know GPUs aren't as profitable as using those wafers for more CPUs.

If they devoted more 5nm wafers to those top end parts and used the freed up 7/6nm wafers for more midrange products they could kill two birds with one stone. Of course they don't need to do a 6nm refresh either since they can just keep the same 7nm design. Really their main problem has been just having enough product which a 6nm refresh doesn't change much in the grand scheme of things.
It sounds like they are doing exactly that. 5nm = “premium”. 6nm for everything else.
Course if they are going to start at a grand+ for RDNA3 GPUs, there's only so many people willing to pay that much for a gaming card.

The number of people willing to drop more than a grand on a high-end card is much higher than you think. Many older gamers are also successful from a career standpoint. $1,000 is less than 10% of my monthly discretionary income, and as gaming is my hobby I have no issue buying a new GPU even if it were only for gaming. Anyone with a decent middle-to-upper-class income can afford a decent high-end GPU every few years.
 
  • Like
Reactions: Tlh97 and Mopetar