
Discussion RDNA4 + CDNA3 Architectures Thread

Page 207

DisEnchantment

Golden Member
With the GFX940 patches in full swing since the first week of March, it looks like MI300 is not far off!
Usually AMD takes around three quarters to get the support into LLVM and amdgpu. Lately, since RDNA2, the window in which they push support for new devices is much shorter, to prevent leaks.
But looking at the flurry of code in LLVM, it is a lot of commits. Maybe the US Govt is starting to prepare the SW environment for El Capitan early (perhaps to avoid a slow bring-up situation like Frontier's, for example).

See here for the GFX940 specific commits
Or Phoronix

There is a lot more if you know whom to follow in LLVM review chains (before getting merged to github), but I am not going to link AMD employees.
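If you want to track these bring-up commits yourself, the pattern is the same for every new target: search the AMDGPU backend's history for the GFX IP name. A minimal sketch in Python (assuming git and a local llvm-project checkout; the repo path, function name, and default target string are mine, not from the post):

```python
# Sketch: list recent AMDGPU-backend commits mentioning a new GFX target.
import subprocess
from pathlib import Path

def gfx_commits(repo: Path, target: str = "gfx940", limit: int = 20) -> list[str]:
    """Return up to `limit` commit subjects in llvm/lib/Target/AMDGPU
    whose messages mention `target` (case-insensitive), newest first."""
    out = subprocess.run(
        ["git", "-C", str(repo), "log", "--oneline", "-i",
         f"--grep={target}", "--", "llvm/lib/Target/AMDGPU"],
        capture_output=True, text=True, check=True,
    ).stdout
    return out.splitlines()[:limit]
```

The same query works straight from the shell as `git log --oneline -i --grep=gfx940 -- llvm/lib/Target/AMDGPU` inside the checkout.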

I am starting to think MI300 will launch around the same time as Hopper, probably only a couple of months later!
Although I believe Hopper had the problem of there being no host CPU capable of PCIe 5 in the very near future, so it might have been pushed back a bit until SPR and Genoa arrive later in 2022.
If PVC slips again, I believe MI300 could launch before it :grimacing:

This is nuts; the MI100/200/300 cadence is impressive.


Previous thread on CDNA2 and RDNA3 here

 
Is that for real? Another 6500XT
 
It absolutely can be an issue even when you have an integrated GPU with encode support. Local streaming is a thing, and at the moment you cannot use the iGPU to encode while using the discrete GPU for gaming on the host with Moonlight and Sunshine.
AMD is just a disaster on GPU man
 
I still think that AMD needed a card to fight toe-to-toe with the Nvidia halo card. At least match its maximum 4K performance; 8K is for the rich.
 
I still think that AMD needed a card to fight toe-to-toe with the Nvidia halo card. At least match its maximum 4K performance; 8K is for the rich.

I think they tried to do that but failed. I mean the chiplet strategy: they are probably lying about focusing on the midrange having been the overall planned strategy for a long time, and were instead just forced into it after their plan to compete with chiplets failed.
 
AMD should have lied that the MSRP of the 6500 XT was $99

Crazy to launch at the price it actually sold for on the market
Yeah, that would have helped the reception a lot. Though it's still a bad graphics card; it has issues at 1080p medium in some games.
 
To be fair, the main issues with the 6500XT were the 4GB VRAM, the four PCIe lanes, and the price. The missing encode/decode engine was just the cherry on top.

-It was really all the 4GB of RAM. There is an 8GB version from Sapphire that runs fine for its specs, sometimes twice as fast as the stock version.

That card couldn't exist at launch though, thanks to mining: the 8GB of VRAM would have sent the price up, and the card would probably have ended up in the mines with 8GB of RAM and a lean power profile.

No reason we couldn't get that card today though.

I'd love to see a 24 CU, 8GB, clocked-to-the-moon entry-level card for $100-120. It would sell like hotcakes.
 
I think they tried to do that but failed. I mean the chiplet strategy: they are probably lying about focusing on the midrange having been the overall planned strategy for a long time, and were instead just forced into it after their plan to compete with chiplets failed.

The only time they really tried to do this was with the HD 2900 XT back in 2007.
 
-It was really all the 4GB of RAM. There is an 8GB version from Sapphire that runs fine for its specs, sometimes twice as fast as the stock version.

That card couldn't exist at launch though, thanks to mining: the 8GB of VRAM would have sent the price up, and the card would probably have ended up in the mines with 8GB of RAM and a lean power profile.

No reason we couldn't get that card today though.

I'd love to see a 24 CU, 8GB, clocked-to-the-moon entry-level card for $100-120. It would sell like hotcakes.
A wider PCIe bus also would have increased performance. The combination of the two is what caused the terrible performance of the 6500 XT.

The fact is, you can't compromise on both VRAM capacity and PCIe bandwidth. To a degree you can get away with handicapping one or the other, but doing both is disastrous for performance. It also creates a situation where the 6500 XT is overly sensitive to exceeding the VRAM buffer, even by a little bit, whereas products such as the GTX 1650 Super have a lot more leeway.
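To put rough numbers on why compromising on both is so punishing: once a game exceeds the 4GB buffer, spilled assets stream over the PCIe link instead of local VRAM. The memory and link specs below are the 6500 XT's public ones (64-bit bus, 18 Gbps GDDR6, PCIe 4.0 ×4); the script is just back-of-envelope arithmetic, including the PCIe 3.0 board case where the ×4 link halves again.

```python
# Back-of-envelope: bandwidth left once a game exceeds the 4GB VRAM buffer.

GDDR6_GBPS_PER_PIN = 18   # 6500 XT memory speed, Gb/s per pin
BUS_WIDTH_BITS = 64       # 6500 XT memory bus width

# Local VRAM bandwidth in GB/s
vram_bw = GDDR6_GBPS_PER_PIN * BUS_WIDTH_BITS / 8   # 144 GB/s

# PCIe per-lane throughput in GB/s, after 128b/130b line coding
PCIE4_LANE = 16 * 128 / 130 / 8   # ~1.97 GB/s (16 GT/s)
PCIE3_LANE = 8 * 128 / 130 / 8    # ~0.98 GB/s (8 GT/s)

gen4_x4 = 4 * PCIE4_LANE
gen3_x4 = 4 * PCIE3_LANE

print(f"VRAM:        {vram_bw:6.1f} GB/s")
print(f"PCIe 4.0 x4: {gen4_x4:6.1f} GB/s ({vram_bw / gen4_x4:.0f}x slower than VRAM)")
print(f"PCIe 3.0 x4: {gen3_x4:6.1f} GB/s ({vram_bw / gen3_x4:.0f}x slower than VRAM)")
```

So any spill drops you from VRAM speed onto a link roughly 18x slower (or ~37x on a gen3 board), which is why the 6500 XT falls off a cliff the moment it exceeds its buffer while a GTX 1650 Super on a full ×16 link degrades far more gracefully.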
 
Intel is too poor and QC too incompetent. 0/2.

But then it wouldn't be "my team" embarrassing itself.
Yes, I know it's an irrational feeling. It's like football teams; I can't help it, since I "discovered" I was on AMD's side back in the mid-1990s.

If AMD has really given up, then give up for real.
Stick to doing "gaming" GPUs only for the iGPUs, and at most launch one or two discrete GPUs per generation: one at the "sweet spot" of the mainstream and the other a low-end part for old PCs. If they don't believe they can make real money by being truly competitive, they should either do something like this or cut their losses and exit this market.
 
exiting this market.
A full exit is always too expensive to pull off.
Stick doing "gaming" GPUs only for the IGPs, and at most launch only one or two discrete GPUs for generation. One the "sweet spot" mainstream and the other the low-end for old PCs
Well, they're making three: one for the poor, one for the middle class, and a kilobuck offering on top of those. But no halo card or any fancy chiplet wondertech devices.
 
Is that for real? Another 6500XT
No, that's a good idea. Media engines take donkeyloads of die space, and what use are they, really? Quality encoding needs to be done with software encoders, and game streaming and game recording belong on the CPU's integrated media engine because it's more efficient there.

It's inefficient to pay for the media-engine footprint twice, and most AM5 CPUs have a media engine (iGPU) now (the main exception is the 7500F, with the 8700F and 8400F in tow, but those don't sell that much, I think).

So this is 100% a smart thing to do. PCIe ×8 interfaces, too.

(But it probably will be used against AMD in the guerrilla marketing / volunteer shilling comments that tell you how paying more for the same performance on Nvidia cards is "the more you save", Huang math or something.)
 