Question Speculation: RDNA3 + CDNA2 Architectures Thread

Page 86

maddie

Diamond Member
Jul 18, 2010
4,881
4,951
136
There was an article about how NVidia is planning to artificially create a shortage by limiting the volume of 4090 chips.
NVIDIA focuses on H100 graphics cards: the news said that the RTX 4090 is out of stock, and the price will rise until next year (mydrivers.com)

With Zen 4 V-Cache, it would be a natural shortage if AMD released it as soon as the first batch was packaged, instead of waiting for the day these chips could launch in full volume.

What AMD could have learned (but did not) from the Zen 3 V-Cache fiasco is that with every day a chip ships after its competition (which was ADL), its relevance shrinks. By the time it finally launched in March-April, Zen 3 V-Cache was nearly irrelevant.

PS: A low-volume (shortage-prone) launch of Zen 3 V-Cache was possible last December. But AMD instead sent those chips to the Microsoft Azure cloud.
The prioritizing of H100 has more to do with the coming China sanctions. They won't be able to sell any more after the cut-off date.
 

H T C

Senior member
Nov 7, 2018
588
427
136
Also, some people forget that in Europe we usually have a sales tax between 18-25%, which is why graphics cards are more expensive than in the US.

Screenshot from 2022-10-23 21-14-57.png

From AMD's Portuguese site: out of stock, though.

Portugal's tax is 23%, which would take the $799 price to $982.77. However, as seen in the pic ... it's slightly more than that ... and it's not the $ to € conversion, since they are virtually at parity ...
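The arithmetic here is just net price times (1 + VAT rate); a quick sketch in Python, assuming the roughly 1:1 $/€ rate mentioned:

```python
# VAT arithmetic sketch: gross price = net price * (1 + VAT rate).
# Assumes a roughly 1:1 $/EUR exchange rate, as noted above.
def with_vat(net_price: float, vat_rate: float) -> float:
    """Return the VAT-inclusive price, rounded to cents."""
    return round(net_price * (1 + vat_rate), 2)

# Portugal's 23% VAT on the $799 MSRP:
print(with_vat(799, 0.23))  # -> 982.77
```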
 

Joe NYC

Platinum Member
Jun 26, 2021
2,537
3,468
106
View attachment 69739

From AMD's Portuguese site: out of stock, though.

Portugal's tax is 23%, which would take the $799 price to $982.77. However, as seen in the pic ... it's slightly more than that ... and it's not the $ to € conversion, since they are virtually at parity ...

Wow, that's a huge difference from the $799 price in the US.

But AMD has been updating US prices very recently, within the last few days. So it is possible AMD will adjust European prices as well, and get more cards in stock, if AMD really still has a large number of N21 chips available.
 

Joe NYC

Platinum Member
Jun 26, 2021
2,537
3,468
106
The prioritizing H100 has more to do with the coming China sanctions. They won't be able to sell anymore after the cut-off date.

NVidia does not have a shortage of wafers; quite the opposite. NVidia was negotiating with TSMC to delay its N5 wafer allocation.

So there is no trade-off between the two product segments (consumer, datacenter). NVidia has more than enough wafers for both.
 
  • Like
Reactions: scineram and Leeea

Joe NYC

Platinum Member
Jun 26, 2021
2,537
3,468
106
Speaking of the amd.com website, I just noticed some nice bundle discounts, going up to $450 off for buying a CPU+GPU bundle.

I think I will be watching amd.com when I am ready to build my new system, which will likely be a 7800X V-Cache + 7800 XT + B650E

1666561629134.png
 

linkgoron

Platinum Member
Mar 9, 2005
2,409
979
136
Companies will always release their best product first, unless it is a total dud (hey, Intel).
Dumb Nvidia, releasing the 4090 and 4080 first.

IMO, the most annoying thing AMD's marketing does is pick gimmicky release dates - for example 7/7 for the 7nm Zen 2 Ryzen and RDNA1, or November 3rd for RDNA3. Just release it!!!

I miss the olden days when launches were launches, and people would discuss whether it was a paper launch or a hard launch. Now everything is just announcements of announcements. You wait for November 3rd, but the cards are only really available weeks later. These days, with AMD and Nvidia selling directly to consumers through their websites, it should be easier to hard launch.
 

maddie

Diamond Member
Jul 18, 2010
4,881
4,951
136
NVidia does not have a shortage of wafers; quite the opposite. NVidia was negotiating with TSMC to delay its N5 wafer allocation.

So there is no trade-off between the two product segments (consumer, datacenter). NVidia has more than enough wafers for both.
Throughput?

When Nvidia wanted to reduce their wafer commitment, do we know over what period of time? To sell these H100 dies they must finish starting the last wafers several months before the sanctions begin, and they would want to fill the full Chinese orders for the lifespan of the generation, so they are probably pulling forward late 2023 and 2024 orders.
 

Yosar

Member
Mar 28, 2019
28
136
106
You just nailed It.
3840-2160.png

No. Simply no. The 4090 was not held back by anything. This is 4K; any modern processor is enough.
If the processor is slower, it's slower for both cards, not just one.

You can even compare data for the same games on both sides. Let's look at them.


Game | TPU (5800X), 6950 XT / 4090 | Computerbase (12900K), 6950 XT / 4090 | 4090 vs 6950 XT difference: TPU / Computerbase
CP2077 | 39 / 71.2 | 37 / 67.5 | 82.5% / 82.4% (perfect)
Deathloop | 72.5 / 143.1 | 60.6 / 121.6 | 97.3% / 100.6% (again almost perfect)
Doom Eternal | 168.5 / 305 | 163.2 / 317.2 | 81.0% / 94.3%
Dying Light 2 | 58.1 / 100.1 | 55.7 / 100.3 | 72.3% / 80.0%
F1 22 | 140.9 / 253 | 130.4 / 217.7 | 79.5% / 66.95%
Forza Horizon 5 | 98.9 / 142.2 | 71.9 / 125.3 | 43.8% / 74.26%


And so on. Look at the FH numbers: with the 5800X the 4090 generated 142.2 frames, more than the 125.3 with the 12900K (probably a difference in settings, but that matters for the graphics card, not the processor).
Some of the data are clearly outliers, so who's right? Well, let's look at PCGH then, because they tested 20 games (with a 12900K), including some of those above.


Doom - 95% (closer to Computerbase)
Dying Light - 67% (definitely closer to TPU)
Forza Horizon - 79% (closer to Computerbase)

So nobody's right.
And in the end, in PCGH's tests the 4090 is 56% faster than the 6950 XT, so again definitely closer to TPU (53%) than to Computerbase (82%). Actually, with a bigger number of games the 4090 seems 'slower'.
Computerbase looks more and more like the outlier (I'm not surprised; they chose games like Shadow Warrior 3 instead of Resident Evil Village or Elden Ring, you cannot take that seriously).

The conclusion is that there is more variance from which outlet ran the tests than from the processor at 4K. And that's why cards/processors should be tested with as many games as possible.
With more games, any outliers have less and less impact on the end result.
17 is clearly not enough.
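The per-game "difference" percentages in the table above are just (4090 FPS / 6950 XT FPS - 1); a small script to reproduce them from the quoted numbers (rounding may differ by a tenth of a percent from the post):

```python
# Recompute the 4090-vs-6950 XT advantage per game from the quoted FPS.
# Pairs are (6950 XT, 4090): TPU used a 5800X, Computerbase a 12900K.
tpu = {
    "CP2077": (39.0, 71.2),
    "Deathloop": (72.5, 143.1),
    "Doom Eternal": (168.5, 305.0),
    "Dying Light 2": (58.1, 100.1),
    "F1 22": (140.9, 253.0),
    "Forza Horizon 5": (98.9, 142.2),
}
computerbase = {
    "CP2077": (37.0, 67.5),
    "Deathloop": (60.6, 121.6),
    "Doom Eternal": (163.2, 317.2),
    "Dying Light 2": (55.7, 100.3),
    "F1 22": (130.4, 217.7),
    "Forza Horizon 5": (71.9, 125.3),
}

def advantage(pair):
    """Percent by which the 4090 beats the 6950 XT."""
    slow, fast = pair
    return round((fast / slow - 1) * 100, 1)

for game in tpu:
    print(f"{game}: {advantage(tpu[game])}% (TPU) vs {advantage(computerbase[game])}% (CB)")
```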
 

biostud

Lifer
Feb 27, 2003
18,700
5,434
136
View attachment 69739

From AMD's Portuguese site: out of stock, though.

Portugal's tax is 23%, which would take the $799 price to $982.77. However, as seen in the pic ... it's slightly more than that ... and it's not the $ to € conversion, since they are virtually at parity ...
Here in Denmark, with 25% sales tax, they start at 842 euro (~$830); they are readily in stock, and you can get a liquid-cooled card for around $860.
 
  • Like
Reactions: Elfear

H T C

Senior member
Nov 7, 2018
588
427
136
Here in Denmark, with 25% sales tax, they start at 842 euro (~$830); they are readily in stock, and you can get a liquid-cooled card for around $860.

So ... the exact same price ...

It seems AMD's Portuguese site hasn't updated its prices yet, judging from the fact that there are cheaper 6900 XTs available:

Screenshot from 2022-10-24 05-57-44.png

Never thought I'd see the day when an ASUS card is the cheapest ... they are usually 100€+ more expensive than the rest ...
 

GodisanAtheist

Diamond Member
Nov 16, 2006
7,166
7,666
136
I think the 4090 was held back by the 5800X they used @TPU, which is why they do not see so much improvement.

- It's wild that a year ago I was considering picking up a relatively cheap 10700K and was directed to all the data showing AMD demolishing that chip with the 5800X.

Here we are a year later, and the 5800X is holding back next-gen GPUs.
 

biostud

Lifer
Feb 27, 2003
18,700
5,434
136
So ... the exact same price ...

It seems AMD's Portuguese site hasn't updated its prices yet, judging from the fact that there are cheaper 6900 XTs available:

View attachment 69764

Never thought I'd see the day when an ASUS card is the cheapest ... they are usually 100€+ more expensive than the rest ...
Except the price in Denmark is 842€ including sales tax. The cards are generally 150-200€ more expensive in Portugal than in Denmark, weird.
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,523
3,037
136
No. Simply no. The 4090 was not held back by anything. This is 4K; any modern processor is enough.
If the processor is slower, it's slower for both cards, not just one.

You can even compare data for the same games on both sides. Let's look at them.


Game | TPU (5800X), 6950 XT / 4090 | Computerbase (12900K), 6950 XT / 4090 | 4090 vs 6950 XT difference: TPU / Computerbase
CP2077 | 39 / 71.2 | 37 / 67.5 | 82.5% / 82.4% (perfect)
Deathloop | 72.5 / 143.1 | 60.6 / 121.6 | 97.3% / 100.6% (again almost perfect)
Doom Eternal | 168.5 / 305 | 163.2 / 317.2 | 81.0% / 94.3%
Dying Light 2 | 58.1 / 100.1 | 55.7 / 100.3 | 72.3% / 80.0%
F1 22 | 140.9 / 253 | 130.4 / 217.7 | 79.5% / 66.95%
Forza Horizon 5 | 98.9 / 142.2 | 71.9 / 125.3 | 43.8% / 74.26%


And so on. Look at the FH numbers: with the 5800X the 4090 generated 142.2 frames, more than the 125.3 with the 12900K (probably a difference in settings, but that matters for the graphics card, not the processor).
Some of the data are clearly outliers, so who's right? Well, let's look at PCGH then, because they tested 20 games (with a 12900K), including some of those above.


Doom - 95% (closer to Computerbase)
Dying Light - 67% (definitely closer to TPU)
Forza Horizon - 79% (closer to Computerbase)

So nobody's right.
And in the end, in PCGH's tests the 4090 is 56% faster than the 6950 XT, so again definitely closer to TPU (53%) than to Computerbase (82%). Actually, with a bigger number of games the 4090 seems 'slower'.
Computerbase looks more and more like the outlier (I'm not surprised; they chose games like Shadow Warrior 3 instead of Resident Evil Village or Elden Ring, you cannot take that seriously).

The conclusion is that there is more variance from which outlet ran the tests than from the processor at 4K. And that's why cards/processors should be tested with as many games as possible.
With more games, any outliers have less and less impact on the end result.
17 is clearly not enough.
That TPU chart (link) clearly shows that the RTX 4090 was held back even at 4K in some games.

I am looking at that comparison table of yours, and I must highlight some things.
Only 2 games had a comparable difference between the reviews, and those are CP2077 and Deathloop, but those games are not limited by the 5800X.
Actually, of those 6 games only Forza Horizon was limited. Just check out my comparison to see what difference using a 12900K makes in those games. It's from the long TPU chart I linked, and you quoted.
Resolution: 4K | CyberPunk 2077 | Deathloop | Doom Eternal | Dying Light 2 | F1 22 | Forza Horizon 5
5800X -> 12900K | +1.8% | +0.3% | +1.4% | +0.3% | -0.6% | +10.6%

Your claim that if the CPU is slower, then it's slower for both GPUs is also not true. If one GPU is ~1.8x faster than the other, then it's not possible for it to be limited by the CPU to the same extent as the weaker GPU.
Here is the proof, although at different resolutions:
cyberpunk-2077-1920-1080.png
cyberpunk-2077-2560-1440.png


The RTX 4090 lost only 5 FPS, or 4%, going from 1080p to 1440p.
The RX 6900 XT lost ~53 FPS, or 38%, going from 1080p to 1440p.
The RTX 3090 Ti lost ~38 FPS, or 27.5%, going from 1080p to 1440p.
Why did the RTX 4090 lose so little FPS compared to the other cards? The only explanation is that it was held back by the CPU a lot more than the weaker GPUs.
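That reasoning can be made concrete with a toy calculation (the FPS inputs below are hypothetical illustrations, not the chart values): a card that is CPU-bound at the lower resolution barely loses FPS when the resolution rises, while a GPU-bound card loses a lot.

```python
# Percent FPS loss when moving from a lower to a higher resolution.
def loss_pct(fps_low_res: float, fps_high_res: float) -> float:
    return round((1 - fps_high_res / fps_low_res) * 100, 1)

# Hypothetical numbers for illustration only:
print(loss_pct(125, 120))  # -> 4.0  (tiny loss: CPU bottleneck at the lower res)
print(loss_pct(140, 87))   # -> 37.9 (large loss: normal GPU-bound scaling)
```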

Just because you don't like ComputerBase using Shadow Warrior 3 doesn't mean we shouldn't take it seriously.
I agree that the higher the number of games tested the better, but then TPU would be the best of the 3 reviews, because they test 25 games; as shown, though, they used a weak CPU.

The best thing would be if they reviewed the upcoming N31 with the fastest gaming CPU, which should be the 13900K.
 
Last edited:
  • Like
Reactions: Stuka87

Leeea

Diamond Member
Apr 3, 2020
3,795
5,549
136
The only explanation is that it was held back by the CPU a lot more than the weaker GPUs.
It is also possible the software was simply not coded to run that quickly: wait timers, sync locks, frame caps, etc.

It is possible the CPU could not transfer frame data to the GPU quickly enough.

Or a GPU bottleneck not related to the CPU, like the scheduler or the memory controller.

In short, it could be the CPU, but there are many other possibilities, some more likely than others.

Example:
In some ways, Intel ARC shows similar scaling, and nobody is suggesting that it is CPU limited. Many people blame the GPU's memory controller.
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,523
3,037
136
It is also possible the software was simply not coded to run that quickly: wait timers, sync locks, frame caps, etc.

It is possible the CPU could not transfer frame data to the GPU quickly enough.

Or a GPU bottleneck not related to the CPU, like the scheduler or the memory controller.

In short, it could be the CPU, but there are many other possibilities, some more likely than others.

Example:
In some ways, Intel ARC shows similar scaling, and nobody is suggesting that it is CPU limited. Many people blame the GPU's memory controller.
That is certainly not the case with CyberPunk 2077.
By using a 12900K instead of a 5800X, the RTX 4090 saw a 31.1% increase in FPS in CP2077 at 1080p. TPU
That would mean this:
cyberpunk-2077-1920-1080.png
 

Timorous

Golden Member
Oct 27, 2008
1,748
3,240
136
That is certainly not the case with CyberPunk 2077.
By using a 12900K instead of a 5800X, the RTX 4090 saw a 31.1% increase in FPS in CP2077 at 1080p. TPU
That would mean this:
View attachment 69766

You are pointing at a single game though.

Given this is a speculation thread, the data we have on hand is either vague (>50% perf/watt) or rumour, so all we can do is use some napkin maths plus historic trends to get a ballpark for how the new products will perform 'on average', and what kind of TBP we think that can be done in. Given the accuracy/precision of the data we have, delving into the nitty gritty of a single gaming example, or getting error bars smaller than +/- 10%, is just not going to happen. Any estimates given here should be assumed to carry pretty huge error bars and heavy caveats for this very reason.
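As a concrete example of that napkin math, under loudly stated assumptions (the ">50% perf/watt" claim is the only "data" point; the baseline index, baseline TBP, and new TBP below are pure guesses):

```python
# Ballpark performance projection from a claimed perf/W uplift.
# Every numeric input is an assumption for illustration; error bars are huge.
def projected_perf(base_perf: float, base_tbp: float,
                   perf_per_watt_gain: float, new_tbp: float) -> float:
    """Scale a performance index by the perf/W uplift and the power budget."""
    return base_perf * (1 + perf_per_watt_gain) * (new_tbp / base_tbp)

# e.g. a baseline index of 100 at 335 W, ">50%" perf/W, hypothetical 350 W TBP:
print(round(projected_perf(100, 335, 0.50, 350)))  # -> 157, give or take a lot
```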
 

Aapje

Golden Member
Mar 21, 2022
1,515
2,065
106
@eek2121

Allegedly, the Ryzen 7950X and 7900X are actually selling relatively well, but sales of the 7700X and 7600X are very low. This indicates that those products are overpriced, or, just as likely, that platform costs for those chips are too high (the higher cost of DDR5 and AM5 motherboards weighs relatively more on those builds).

I actually consider both 13th-gen Intel and Ryzen 7000 to be relatively poor generations right now. 13th-gen Intel is basically an overclock combined with more generous SKUs (more E-cores per tier); its main saving grace is the latter, plus no increase in platform costs. I see neither as a particularly good generation for gamers to upgrade to right now, if you have a decent system and only want to upgrade when you get a really good benefit from it.

RDNA3 will hopefully be cheaper because AMD's costs are lower than NVIDIA's costs. However, they may choose to charge similar pricing to NVIDIA for healthier margins.

There is no way they can charge the same for similar performance. Radeon cards simply don't have the same inherent value as a platform. My expectation is lower prices than Nvidia, but still prices that are too high compared to their actual value. Then prices will fall to the level where AMD can find enough buyers.

The main question is how many cards AMD wants to sell, which mainly depends on how many chips they need for other products. Once they start having excess production volume, it makes sense to lower prices and/or introduce lower end SKUs.

EDIT: Two more quick thoughts here: first, at this point in time you cannot compare DDR4 pricing to DDR5 pricing any more than you can compare Zen 3 to Zen 4 prices, or Alder Lake to Raptor Lake. DDR4 is on clearance. The majority of future sales will be DDR5.

That's mostly irrelevant for someone who is contemplating buying now and not in the future. The benefit for current-gen DDR5 is small, much smaller than the price increase. Clearance prices are good, premium prices for products that will be superseded anyway are bad.
 

moinmoin

Diamond Member
Jun 1, 2017
5,064
8,032
136
Actually it works the other way. It is only more profitable if AMD has costs in, let's say, EUR. And that's only if you don't take into consideration that those costs will also be higher due to inflation. And I doubt there are many costs they pay in EUR.
So a strong $ means only one thing for AMD (or any US corporation): their sales will tank in any market whose currency has weakened relative to the $, because products will be much more expensive there than they were.
And tanking sales means less profit, not more.
Basically, most US corporations now have decisions to make. Will they pretend that $550 for the 7900X (the same as for the 5900X) is a good price in any market outside the US?
As long as the $ is strong, it's not a good price. Of course it's not that simple; you cannot simply lower prices in, say, Europe just like that, because Americans would start buying from Europe instead of from American sellers.
And that's a big no for American corporations.

So it's quite a delicate matter of balance. But if American corporations act like the $ is not strong (too strong for their own good), their sales and profit will tank outside the US (the tax of inflation plus a strong $ is too high there).

And AMD is only an example. It concerns nVidia (I could buy an RTX 4090 here no problem, no queues), Intel, Apple, et cetera.

That's why China has always tried to keep its currency artificially low relative to the $. It makes exports/sales more profitable, not the other way around.
It's a little more complex than this, being an American company doesn't mean all required transactions are done in US$. Instead it's preferred to do all local transactions in the local currency. The interesting question is what currencies are used when crossing specific borders. In AMD's case all dies nowadays are manufactured in Taiwan (by TSMC), and depending on products assembly and testing afaik happens either in Malaysia (by AMD Global Services) or through third parties in Taiwan and China. When to keep local currency and when to exchange it is a complex business of its own.

Except the price in Denmark is 842€ including sales tax. The cards are generally 150-200€ more expensive in Portugal than in Denmark, weird.
Aside from differing sales taxes, the prices may be more a reflection of the respective market sizes (affecting the amount of internal price competition). In Germany the RX 6900 XT is available from 779€:
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,523
3,037
136
You are pointing at a single game though.

Given this is a speculation thread, the data we have on hand is either vague (>50% perf/watt) or rumour, so all we can do is use some napkin maths plus historic trends to get a ballpark for how the new products will perform 'on average', and what kind of TBP we think that can be done in. Given the accuracy/precision of the data we have, delving into the nitty gritty of a single gaming example, or getting error bars smaller than +/- 10%, is just not going to happen. Any estimates given here should be assumed to carry pretty huge error bars and heavy caveats for this very reason.
A link to the TPU chart with 50 games is in my post you quoted. You can see that 31.1% is not even the largest increase there; the highest is in Halo Infinite (48%).

With the rest of your post I agree. :)
 

biostud

Lifer
Feb 27, 2003
18,700
5,434
136
It's a little more complex than this, being an American company doesn't mean all required transactions are done in US$. Instead it's preferred to do all local transactions in the local currency. The interesting question is what currencies are used when crossing specific borders. In AMD's case all dies nowadays are manufactured in Taiwan (by TSMC), and depending on products assembly and testing afaik happens either in Malaysia (by AMD Global Services) or through third parties in Taiwan and China. When to keep local currency and when to exchange it is a complex business of its own.


Aside differing sales taxes the prices may be more of a reflection of the respective market size (affecting the amount of internal price competition). In Germany RX 6900 XT is available from 779€:
Yeah I know, but I didn't know there were such huge differences within the EU. If I lived in Portugal, I would definitely order from another EU country.
 

Aapje

Golden Member
Mar 21, 2022
1,515
2,065
106
Portugal's VAT is 4 percentage points higher than Germany's (23% vs. 19%). That €779 price is €655 without German VAT, which becomes €806 with Portuguese VAT.

What I did notice is that the Dutch and Spanish Amazon have nearly the same price for the cheapest 6900 XT, but that is ~€900. The much cheaper Dutch prices come from more country-bound sellers, so that suggests a lack of competition might be the issue.
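The cross-country arithmetic is: strip one country's VAT, then apply the other's. A sketch with the rates as stated (19% Germany, 23% Portugal); note the €806 above comes from rounding the net price to €655 first, while computing directly gives €805:

```python
# Re-price a VAT-inclusive price from one country's VAT rate to another's.
def convert_vat(gross: float, vat_from: float, vat_to: float) -> int:
    net = gross / (1 + vat_from)       # strip the origin country's VAT
    return round(net * (1 + vat_to))   # apply the destination country's VAT

# German 779 EUR (19% VAT) re-priced at Portugal's 23% VAT:
print(convert_vat(779, 0.19, 0.23))  # -> 805
```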
 
  • Like
Reactions: Tlh97 and biostud