Speculation: RDNA2 + CDNA Architectures thread


uzzi38

Platinum Member
Oct 16, 2019
2,626
5,926
146
All die sizes are within 5mm^2. The poster here has been right about some things in the past afaik, and to his credit was the first to say 505mm^2 for Navi21, which other people have backed up. Even still, take the following with a pinch of salt.

Navi21 - 505mm^2

Navi22 - 340mm^2

Navi23 - 240mm^2

Source is the following post: https://www.ptt.cc/bbs/PC_Shopping/M.1588075782.A.C1E.html
 

blckgrffn

Diamond Member
May 1, 2003
9,123
3,064
136
www.teamjuchems.com
Well, technically 2K is 2048x1080, just like 4K is technically 4096x2160. But 1080p is the closest equivalent, just as UHD/3840x2160 is the closest equivalent to 4K.

In other words, whatever website you're trying to link is just wrong.

A different discussion topic than this one.

You can have your own opinion, but 2k monitors are also QHD. The internet at large disagrees with you. Feel free to check it out.
 
Last edited:
  • Like
Reactions: Krteq

sze5003

Lifer
Aug 18, 2012
14,182
625
126
I always thought 2K was 1440p, but then you also have 3440x1440, which I've also seen thrown around as 2K. Then there's 2560x1080, which technically is still 1080p, just wide.
 

blckgrffn

Diamond Member
May 1, 2003
9,123
3,064
136
www.teamjuchems.com
If you want to use a 'K' for 1440p, call it 2.5K

Nah, I won’t. Because that’s made up and no one knows what it means. If you want to be pedantic and use that on the Internet, go for it.

Everyone else knows it refers to resolutions above 1080p and below 4K, most commonly 1440p, because it’s roughly half the pixels of 4K. You’ll love this: sometimes 1K is thrown around to describe 1080p, because it’s a quarter of 4K. It’s used more than 2.5K anyway 😂

It’s possible to be technically correct but wrong in general usage.

You can start a poll thread or something if you want.
 
Last edited:

KompuKare

Golden Member
Jul 28, 2009
1,015
930
136
The RX6700XT is 25-30% faster than a 5700XT, but that's on average. Of course individual games will vary a lot, and it's clear AMD was showing their best cases. Other games will see a much smaller advantage, if any. Essentially the 6600XT will be almost on par with the 5700XT at 1080p performance-wise, having almost the same die size on the same process, while costing $20 less and consuming a lot less power. Not a good bargain, not a complete disaster either. Unfortunately the supply situation is not good, and even components other than the GPU cost way more than last year; GDDR6 VRAM, for instance, is expected to rise in price 8-13% this quarter alone, after several quarters of increases. Of course this is not to say the 6600XT is a better value than a 3060, but as said, it is not completely out of the market either.
Problem is that the 5700XT was hardly a bargain either, compared to previous AMD cards and even compared to the 2060 (and the 2000 series was overpriced even for Nvidia).

Okay, new nodes no longer deliver much if any improvement in cost per transistor (and of course the 5700XT was already on 7nm), but manufacturers also have to offer something, or lots of people like me will just go without. Others will go with consoles.

Anyway, while die costs have gone up and stayed high, a 236mm² die at 0.09 defects per cm² (a figure TSMC quoted a while ago for 7nm) should yield around 190 good dies (and not all of the ~44 defective dies would be entirely useless).
Taking three possible wafer prices ($10k / $15k / $20k, where the first is a bit low and the last is more like 5nm pricing) puts the cost per good die at roughly $53 / $79 / $105.
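
For anyone who wants to check that arithmetic, here's a quick Python sketch. The gross-die formula and Poisson yield model are my own standard approximations (nothing TSMC publishes); a more conservative gross-die count gets you to the ~190 good dies used above.

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300.0):
    """Gross die candidates per 300mm wafer (standard edge-loss approximation)."""
    r = wafer_diameter_mm / 2.0
    return (math.pi * r ** 2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

def poisson_yield(die_area_mm2, defects_per_cm2):
    """Fraction of defect-free dies under a Poisson defect model."""
    return math.exp(-(die_area_mm2 / 100.0) * defects_per_cm2)

area = 236.0  # Navi 23 die size, mm^2
d0 = 0.09     # defects/cm^2, the 7nm figure quoted above

good = dies_per_wafer(area) * poisson_yield(area, d0)  # ~207 with this formula
for wafer_price in (10_000, 15_000, 20_000):  # $/wafer, same guesses as above
    print(f"${wafer_price:,}/wafer -> ${wafer_price / good:.0f} per good die")
```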

So in terms of profit per wafer, Navi 23 is actually better for AMD than Navi 22.

The 128-bit bus should make the rest of the components relatively cheap too.

Low-end parts are supposed to have lower margins and make it up with volume. This part doesn't do that.
 
  • Like
Reactions: Ranulf and maddie

JujuFish

Lifer
Feb 3, 2005
11,003
735
136
Nah, I won’t. Because that’s made up and no one knows what it means. If you want to be pedantic and use that on the Internet, go for it.

Everyone else knows it refers to resolutions above 1080p and below 4K, most commonly 1440p, because it’s roughly half the pixels of 4K. You’ll love this: sometimes 1K is thrown around to describe 1080p, because it’s a quarter of 4K. It’s used more than 2.5K anyway 😂

It’s possible to be technically correct but wrong in general usage.

You can start a poll thread or something if you want.
Except 4K is 4x the pixels of 2K. And 8K is 4x the pixels of 4K. You are wrong and are too embarrassed to admit it.
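
Since everyone is waving numbers around, the raw pixel counts are easy to check. A quick sketch, using the DCI cinema definitions (2048x1080, 4096x2160) alongside the common consumer panels:

```python
# Pixel counts for the resolutions under debate.
resolutions = {
    "DCI 2K":    (2048, 1080),
    "FHD/1080p": (1920, 1080),
    "QHD/1440p": (2560, 1440),
    "UHD":       (3840, 2160),
    "DCI 4K":    (4096, 2160),
    "8K UHD":    (7680, 4320),
}
base = 2048 * 1080  # DCI 2K
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name:>10}: {px:>10,} px ({px / base:.2f}x DCI 2K)")
```

DCI 4K comes out at exactly 4x DCI 2K (and 8K UHD at 4x UHD), while QHD is ~1.67x DCI 2K, i.e. about 42% of 4K: the "exactly 4x" and the "roughly half" claims above both check out, just against different baselines.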
 

blckgrffn

Diamond Member
May 1, 2003
9,123
3,064
136
www.teamjuchems.com
Problem is that the 5700XT was hardly a bargain either, compared to previous AMD cards and even compared to the 2060 (and the 2000 series was overpriced even for Nvidia).

Okay, new nodes no longer deliver much if any improvement in cost per transistor (and of course the 5700XT was already on 7nm), but manufacturers also have to offer something, or lots of people like me will just go without. Others will go with consoles.

Anyway, while die costs have gone up and stayed high, a 236mm² die at 0.09 defects per cm² (a figure TSMC quoted a while ago for 7nm) should yield around 190 good dies (and not all of the ~44 defective dies would be entirely useless).
Taking three possible wafer prices ($10k / $15k / $20k, where the first is a bit low and the last is more like 5nm pricing) puts the cost per good die at roughly $53 / $79 / $105.

So in terms of profit per wafer, Navi 23 is actually better for AMD than Navi 22.

The 128-bit bus should make the rest of the components relatively cheap too.

Low-end parts are supposed to have lower margins and make it up with volume. This part doesn't do that.

In this market there doesn't seem to be a reason to. I mean, in every Newegg Shuffle the 3060 AIB cards are $500-$600.

So this has an MSRP of $379. It will list from $500-$700, I am sure. The value of it vs a 5700xt, which is out of warranty and old, is... (checks eBay)...

The last 10 functional 5700xt cards sold on eBay average $771 as of right now.

So, the price relative to the card it's replacing seems to check out. Cheaper, lower power, faster at 1080p, and we'll see at higher resolutions.

[attachment: eBay sold listings for the 5700xt]

(Obviously truncated from my math, but reasonably representative)

Honestly, I don't think the MSRP even matters. It's like someone in the marketing department shrugged and picked a number they wished they could sell it at, knowing the street price is completely decoupled from MSRP. The least important detail about the launch, IMO. Also, sadly.

Can't buy a 2060 Super for $379 either.

[attachment: eBay sold listings for the 2060 Super]

That first one? That's for parts.
 
Last edited:
  • Like
Reactions: prtskg and Tlh97

ModEl4

Member
Oct 14, 2019
71
33
61
Please don't buy this product unless you can find it at the 3060's street price or less!
The 15% performance lead vs the 3060 is misleading; with the advertised clocks it will be around 10% at 1080p:
3060 Ti 8GB: 114 (+14% faster)
6600 XT 8GB: 100
3060 12GB: 90 (-10% slower)
The price should have been $350 max (see the sketch below):
3060 Ti 8GB: $350 + 14% = $399
6600 XT 8GB: $350
theoretical 3060 8GB: $350 - 10% = $315
The released 3060 12GB at $330 is much better value than a theoretical $315 8GB 3060 model...
All the above without taking into account Nvidia's advantages, which should have lowered the proposed $350 6600XT price even further; this is based only on traditional raster performance differences.
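
A sketch of that pricing logic in Python: peg the 6600 XT at the hypothetical $350 and scale the other cards linearly by the relative 1080p numbers above.

```python
# Price-to-performance parity: price each card proportionally to its
# relative 1080p performance, anchored at a hypothetical $350 6600 XT.
perf = {"3060 Ti 8GB": 114, "6600 XT 8GB": 100, "3060 12GB": 90}
anchor_card, anchor_price = "6600 XT 8GB", 350

for card, score in perf.items():
    fair = anchor_price * score / perf[anchor_card]
    print(f"{card:>12}: perf {score:>3} -> ${fair:.0f}")
# -> $399 / $350 / $315, matching the list above; the post's point is that
#    the real 3060 12GB at $330 is better value than a theoretical $315 8GB model.
```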
Also, the 5700G is launching in August at $359, and after 3 months we will have an i5-12600KF at $237(?) with the same gaming performance AND the same multithreaded performance; and if Intel launches the low-end DG2 this year (probably CES 2022), we will probably have much better gaming performance at $237 + $99 = $336.
With these prices, I wonder what price AMD will set for the 16CU 4GB part; I would like to compare it with the Q2 2016 $199 RX 480 4GB to see what performance (15%) and memory (0) we gained after 5.5 years...
 

Ranulf

Platinum Member
Jul 18, 2001
2,349
1,172
136
The ultimate in 1080p gaming is here again! This time not at $279 for a 5600XT but $379 for a 6600XT! At least it's only 6 months late instead of a year this time.
 

blckgrffn

Diamond Member
May 1, 2003
9,123
3,064
136
www.teamjuchems.com
Computerbase 6600XT review:

7-11% faster than the 5700XT at 50% better efficiency.

Looks like at WQHD it's essentially tied with the 5700xt, at whatever improved efficiency. Which doesn't seem like a bad showing at all; I used my 5700xt for WQHD gaming and was really happy with it.

A mix of High/Very High and the rare medium/low (BL3 smoke effect, looking at you) usually allowed for ~90+ FPS gaming.
 
  • Like
Reactions: lightmanek

GodisanAtheist

Diamond Member
Nov 16, 2006
6,808
7,162
136
Here is another one for the 6600XT. This would have been an absolutely killer generation for AMD, if it were not for the stupid street (and even MSRP) prices on these things. The swing from not being able to catch an ancient 1080 Ti to matching and beating it with the x6xx line is absolutely stunning...

 

Shivansps

Diamond Member
Sep 11, 2013
3,855
1,518
136
Looking at the performance and power consumption, the card is good: it is faster than the 3060, which (normally) would be its direct comparison, and it is faster than its predecessor, the RX5600XT.

As for the price, well, pretty sure everyone knows what I was going to say. My question would be: is anyone actually surprised by the price? The naming alone should tell you to compare it to the RX5600XT, not the RX5700XT; this is not replacing the 5700XT, which makes things a lot worse.
With the GPU market in total disarray, they are free to do whatever they want. And the MSRP increases started way before the pandemic and the mining craziness, back with the jump from Polaris to Navi.

BTW, I just saw that x8 thing. WTH... they give the Cezanne APUs an x16 PCIe 3.0 upgrade so the new APUs end up better with dGPUs than before, but at the same time they push the x8 limit onto the RX6600 cards. This is not even funny anymore.
 
Last edited:
  • Like
Reactions: Tlh97

Shivansps

Diamond Member
Sep 11, 2013
3,855
1,518
136
Bah, so it's the 5500XT drama all over again with x8 PCIe. Lame.

OK, here it comes; pretty sure everyone saw this coming.
It is a lot worse this time: the x8 limit is pushed onto the RX6600XT, whereas last time it was limited to the RX5500XT, which was entry-level. AMD has enforced the PCIe 3.0 limit on A520 and B450 (and, as I said back then, it happened: most OEMs launched revised B450 boards better suited to Ryzen 5000, and at least some of those might have had PCIe 4.0 support without that artificial block). This is BS when you realise that anyone with an A320 + Ryzen 3000 + beta BIOS might have a better RX6600XT experience than someone with an A520/X470/B450 without a beta BIOS. In fact I would like to see someone test that long-term to check whether what AMD said back then was true; I think there is a good enough excuse now.

Not to mention that the "fix" for the USB issues on X570 was... DISABLE PCIe 4.0!!! Was that ever fixed?

Everything they did with the PCIe bus only harms their own processors/GPUs, including Renoir/Cezanne, which now have an x16 PCIe 3.0 link they can't use on this card.

And the reason for the x8 bus is making one die serve both desktop and mobile. They went as cheap as possible, with the 128-bit bus and 32MB of Infinity Cache, while putting a premium MSRP on it at the same time, for a 1080p card. 1080p is not, in any way, shape or form, premium. This is a mining card: 32MH/s @ 55W.
 
Last edited:
  • Like
Reactions: Ranulf

Mopetar

Diamond Member
Jan 31, 2011
7,837
5,991
136
Doesn't the smaller bus hurt the mining potential a lot, though? Sure, it runs efficiently at low clocks on the cores, but Ethereum is bound by memory bandwidth, and the 128-bit bus puts a pretty hard cap on performance.

The 3060 can do around 50 MH/s and costs less. Of course that assumes you've got one that isn't running the crippled drivers Nvidia tried to use to stop miners.

Maybe the equation shifts a bit due to better long term efficiency and therefore profits, but the market price for cards has largely correlated to MH/s.

If AMD could find a way to take the concept of IF even further it would go a long way to curtailing the price gouging on their cards.
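
Mopetar's bandwidth point can be made concrete. Ethash is DAG-bound: each hash does 64 random 128-byte reads, about 8 KiB of memory traffic, so peak hashrate is roughly memory bandwidth divided by 8192 bytes. A sketch, with the bus widths and memory speeds below being the published specs as I understand them:

```python
# Bandwidth-bound Ethash ceiling: ~8 KiB of DAG traffic per hash.
BYTES_PER_HASH = 64 * 128  # 8192

cards = {
    # name: (bus width in bits, memory data rate in Gbps) -- spec assumptions
    "6600 XT": (128, 16),
    "3060":    (192, 15),
}
for name, (bus_bits, gbps) in cards.items():
    bw_gb_s = bus_bits / 8 * gbps               # GB/s
    mh_s = bw_gb_s * 1e9 / BYTES_PER_HASH / 1e6
    print(f"{name}: {bw_gb_s:.0f} GB/s -> ~{mh_s:.0f} MH/s ceiling")
# 6600 XT: 256 GB/s -> ~31 MH/s (right at the 32 MH/s figure);
# 3060:    360 GB/s -> ~44 MH/s (near the ~50 MH/s reported unlocked).
```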
 

Shivansps

Diamond Member
Sep 11, 2013
3,855
1,518
136
Doesn't the smaller bus hurt the mining potential a lot, though? Sure, it runs efficiently at low clocks on the cores, but Ethereum is bound by memory bandwidth, and the 128-bit bus puts a pretty hard cap on performance.

The 3060 can do around 50 MH/s and costs less. Of course that assumes you've got one that isn't running the crippled drivers Nvidia tried to use to stop miners.

Maybe the equation shifts a bit due to better long term efficiency and therefore profits, but the market price for cards has largely correlated to MH/s.

If AMD could find a way to take the concept of IF even further it would go a long way to curtailing the price gouging on their cards.

Yes, the memory bus does affect it, but the 3060 (the unlocked ones with the """leaked""" driver) can't be used in mining rigs: for the unlocked driver to work, the GPU needs to be plugged into at least an x8 PCIe slot; anything lower and the card locks again (rigs use x1 risers). Even with the driver you also need a dummy HDMI plug, but that's not an issue. Although you can still mine altcoins, miners aren't that interested in the 3060 or any other LHR card yet; if RVN prices keep increasing, that will change.

If the card has a good MH/s-per-watt ratio, miners will use it, and if those 32MH/s @ 55W figures are real, this is a very good mining card.
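
If those figures hold, the efficiency math is simple: divide hashrate by power. A sketch; the 6600 XT numbers are from the post above, while the 3060's ~115 W tuned mining draw is my own ballpark assumption:

```python
# Hashrate per watt is what drives long-run mining profit per card.
cards = {
    # name: (MH/s, watts at the memory-tuned mining point)
    "6600 XT": (32, 55),    # figures quoted above
    "3060":    (50, 115),   # ~50 MH/s from the thread; 115 W is a guess
}
for name, (mh_s, watts) in cards.items():
    print(f"{name}: {mh_s / watts:.2f} MH/s per watt")
# 6600 XT ~0.58 vs 3060 ~0.43 -- lower absolute hashrate, but the better
# ratio is exactly what makes a card attractive to miners.
```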