Discussion RDNA4 + CDNA3 Architectures Thread


DisEnchantment

Golden Member
Mar 3, 2017
1,777
6,785
136

With the GFX940 patches in full swing since the first week of March, it is looking like MI300 is not far off!
Usually AMD takes around 3 quarters to get the support into LLVM and amdgpu. Lately, since RDNA2, the window in which they push support for new devices is much reduced, to prevent leaks.
But looking at the flurry of code in LLVM, it is a lot of commits. Maybe because the US Govt is starting to prepare the SW environment for El Capitan (perhaps to avoid a slow bring-up situation like Frontier had, for example).

See here for the GFX940 specific commits
Or Phoronix

There is a lot more if you know whom to follow in the LLVM review chains (before the patches get merged to GitHub), but I am not going to link AMD employees.
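If you want to dig through them yourself, here is a minimal sketch (it assumes a local clone of llvm-project; the path is a placeholder):

```python
# Quick sketch: list GFX940-related commits in a local llvm-project
# checkout. "llvm-project" is a placeholder path; point it at wherever
# you cloned https://github.com/llvm/llvm-project.
import subprocess

log = subprocess.run(
    ["git", "-C", "llvm-project", "log", "--oneline", "-i", "--grep=gfx940"],
    capture_output=True, text=True, check=True,
).stdout
print(log)
```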

I am starting to think MI300 will launch around the same time as Hopper, probably only a couple of months later!
Although I believe Hopper had the problem of not having a host CPU capable of PCIe 5 in the very near future, so it might have gotten pushed back a bit until SPR and Genoa arrive later in 2022.
If PVC slips again, I believe MI300 could launch before it :grimacing:

This is nuts, the MI100/200/300 cadence is impressive.


Previous thread on CDNA2 and RDNA3 here

 

Timorous

Golden Member
Oct 27, 2008
1,966
3,850
136
They do have a full die with 16GB: it's called the 9060 XT 16GB.


Cut parts are mostly NOT salvage, because yields are generally quite good by the time mainstream parts hit mass production. So while there is some salvage among the cut-down parts, they are mostly just disabled for product segmentation and could have been sold as fully enabled parts.

For this die they don't even have a cut-down part at all, so yields must be extremely good, and AMD is really loath to lower their margins with product segmentation.

Way to miss the point. The 8GB SKU is a waste of a fully functional die; those cards sit on shelves and sell at a discount larger than the BOM saving, so they have lower margins to boot.

When you launch products at the same time, it is often due to segmentation. When you launch products at different times, perhaps as a limited run, it is often salvage. Think 3300X, or the 5600X3D and 7600X3D.

There is no cut SKU. That does indicate good yields, and it also indicates they want all dies to be sold as full parts, in which case it makes sense to pair them with 16GB of VRAM to maximise the sell-through and the margin.

When they have stockpiled enough salvage parts, and/or sales of full parts slow enough that it is worth disabling dies, they may release the cut-down SKU. I think a 96-bit 12GB card would be a higher-margin part for AMD and a more desirable, less compromised card for consumers than a cut-down 8GB card.

If AMD did release a 12GB part, then they should stop manufacturing the 8GB XT and just make 16GB versions.

As for the $250 price, that assumes around 6700 XT performance. It is entirely possible such a cut part would be faster than that, closer to the 5060, in which case the sale price could be a bit higher. It is just a very rough estimate, more a guide to what perf/$ could be in the hypothetical; it is not a fully fleshed-out business case with evidence...
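To put rough numbers on the margin argument, a toy sketch (every figure here is made up for illustration, not actual BOM or pricing data):

```python
# Toy illustration of the margin argument; every number here is
# hypothetical, not actual BOM or pricing data.
def margin(price, bom):
    return price - bom

bom_16gb = 300        # hypothetical BOM for the 16GB card
vram_saving = 25      # hypothetical saving from fitting 8GB less VRAM
price_16gb, price_8gb = 430, 350   # hypothetical street prices

m16 = margin(price_16gb, bom_16gb)
m8 = margin(price_8gb, bom_16gb - vram_saving)
print(f"16GB margin: ${m16} vs 8GB margin: ${m8}")
# If the street discount (-$80) exceeds the BOM saving (-$25),
# the 8GB card earns less per fully functional die.
```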
 

Thunder 57

Diamond Member
Aug 19, 2007
3,805
6,407
136
Way to miss the point. The 8GB SKU is a waste of a fully functional die; those cards sit on shelves and sell at a discount larger than the BOM saving, so they have lower margins to boot.

When you launch products at the same time, it is often due to segmentation. When you launch products at different times, perhaps as a limited run, it is often salvage. Think 3300X, or the 5600X3D and 7600X3D.

There is no cut SKU. That does indicate good yields, and it also indicates they want all dies to be sold as full parts, in which case it makes sense to pair them with 16GB of VRAM to maximise the sell-through and the margin.

When they have stockpiled enough salvage parts, and/or sales of full parts slow enough that it is worth disabling dies, they may release the cut-down SKU. I think a 96-bit 12GB card would be a higher-margin part for AMD and a more desirable, less compromised card for consumers than a cut-down 8GB card.

If AMD did release a 12GB part, then they should stop manufacturing the 8GB XT and just make 16GB versions.

As for the $250 price, that assumes around 6700 XT performance. It is entirely possible such a cut part would be faster than that, closer to the 5060, in which case the sale price could be a bit higher. It is just a very rough estimate, more a guide to what perf/$ could be in the hypothetical; it is not a fully fleshed-out business case with evidence...

And you know the BoM how? A 96-bit bus would only limit memory bandwidth even more, and since it uses GDDR6 that might not be enough. I don't think that's the case, but I don't know. Also, these plans were put together some time ago, and keeping the 128-bit bus may have made things easier. They may also not have seen the 8GB predicament coming back then. I think a 10/12GB version would be ideal as well, just like Battlemage or the 6700 (XT). I think they should have learned from Nvidia and just made a 16GB version. But they may have already ordered the 8GB parts and have to sell them rather than scrap them.
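For what the bus cut means in raw numbers, a quick back-of-the-envelope (the 20 Gbps GDDR6 rate is my assumption, based on what the existing 128-bit card ships with):

```python
# Back-of-the-envelope memory bandwidth for the bus widths discussed.
# Bandwidth (GB/s) = bus width in bits / 8 * data rate in Gbps.
# 20 Gbps GDDR6 is an assumption for the memory speed here.
DATA_RATE_GBPS = 20

for bus_bits in (128, 96):
    gbs = bus_bits / 8 * DATA_RATE_GBPS
    print(f"{bus_bits}-bit @ {DATA_RATE_GBPS} Gbps -> {gbs:.0f} GB/s")
# 128-bit -> 320 GB/s; 96-bit -> 240 GB/s, a 25% cut
```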

Naming is getting closer to my old ATI 9700 Pro.

That was a legendary card. I had a 9800 Pro, but only because I upgraded later; it was very similar. The 9500 Pro may have been the best deal, but ATi probably didn't make much money on it, as they quickly replaced it with the 9600 Pro, which was sometimes worse. Takes me down memory lane.
 

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
8,123
3,058
146
Well, that sucks that so many cards are hopping on the melty connector wagon.
 

ToTTenTranz

Senior member
Feb 4, 2021
450
834
136
Well, that sucks that so many cards are hopping on the melty connector wagon.
It's probably not that melty on a ~300W TDP graphics card.

If e.g. 2 of the 6 12V pins lose contact on this card, you're pushing about 6.25A through the other 4 (300W / 12V / 4), which is within spec (9.5A per pin). The problem with the RTX 5090 is that it constantly peaks to 600W, meaning it pushes about 12.5A per pin in the same situation, roughly 30% above spec.

I also don't know if the 9700 Pro shunts all the 12V pins together internally like most RTX 5090 cards do. It probably doesn't, as that's a horrible choice: the card loses all means of detecting a bad contact. Nvidia and the AIBs really dropped the ball on that one.
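For anyone who wants to check the arithmetic, a minimal sketch (the 6-pin count and 9.5A per-pin rating are the figures cited above):

```python
# Minimal sketch of the per-pin current math above. The 6-pin count and
# the 9.5 A per-pin rating are the figures cited in the post; board
# power is the total draw through the connector.
PIN_RATING_A = 9.5
TOTAL_PINS = 6

def per_pin_current(board_power_w, lost_pins=0, volts=12.0):
    """Amps each remaining 12V pin carries if `lost_pins` lose contact."""
    return board_power_w / volts / (TOTAL_PINS - lost_pins)

for name, watts in [("~300 W card", 300), ("RTX 5090 at 600 W", 600)]:
    amps = per_pin_current(watts, lost_pins=2)
    print(f"{name}: {amps:.2f} A per pin over 4 pins "
          f"({amps / PIN_RATING_A:.0%} of the 9.5 A rating)")
# ~300 W card:  6.25 A per pin (66% of rating, within spec)
# 600 W card:  12.50 A per pin (132% of rating, well over spec)
```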
 

jpiniero

Lifer
Oct 1, 2010
16,490
6,983
136

Heartbreaker

Diamond Member
Apr 3, 2006
5,026
6,592
136
I also don't know if the 9700 Pro shunts all the 12V pins together internally like most RTX 5090 cards do. It probably doesn't, as that's a horrible choice: the card loses all means of detecting a bad contact. Nvidia and the AIBs really dropped the ball on that one.

It probably does. It's actually in the spec for the connector that they should all be joined together.
 

ToTTenTranz

Senior member
Feb 4, 2021
450
834
136

It would be either 12 or 24GB. 24GB would have been the better choice IMO, but it would also put the 16GB 9070/XT in an awkward position.



It probably does. It's actually in the spec for the connector that they should all be joined together.
Some AIB models, like the Astral 5090, are going off-spec on this so they can read the current on each pin, and with good reason.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
31,680
31,534
146
Aussie Steve explains his fine wine testing: how some seem to have misunderstood it, and why he is highly confident in his data.

 

Panino Manino

Golden Member
Jan 28, 2017
1,109
1,360
136
Aussie Steve explains his fine wine testing: how some seem to have misunderstood it, and why he is highly confident in his data.


The real FineWine™ was the Steves© we made along the way.


If I remember right, FineWine™ started with the CPUs.
Many years ago someone noticed that their Bulldozer, usually the weaker chip for gaming, was actually smoother than its faster Intel competitors when streaming and playing at the same time.
 

Z O X

Junior Member
Oct 31, 2022
15
10
51
Is there a toggle to switch to a lower res than 4K?
The 9070 XT isn't a 4K GPU, and in such GPU-heavy situations the differences are minimal...