News AMD FSR 4 Redstone path tracing software uses ML2CODE framework to run on any GPU

This sounds unbelievably bad for AMD.

It's AMD basically admitting they weren't really planning on releasing an RDNA3/RDNA2-supported version of FSR4 (planned obsolescence), but then out of sheer incompetence they leaked it; people now know it exists, so they might as well just make it public.
What AMD is saying is they want a friends-with-benefits kind of arrangement, without any commitments whatsoever.

Official Redstone for RDNA 3 is a lot of commitment. They are happy with unofficial support for RDNA 3.
 
Panther Lake vs. Medusa Point will not be favorable to AMD.
I don't even get Medusa Point's GPU strategy, to be honest.
An 8CU RDNA3.5 iGPU is too big to use as a value-driven 2D GPU for office work, but in 2027 it's also too weak to use for compute and gaming (not to mention the lack of FSR4 and Redstone).

Perhaps AMD thought Intel was going bankrupt, or frozen in place for the rest of the decade, and that the general public was going to be content with 8 CUs of RDNA3.5 for compute and low-power gaming all the way up to 2029.
 
An 8CU RDNA3.5 iGPU is too big to use as a value-driven 2D GPU for office work, but in 2027 it's also too weak to use for compute and gaming (not to mention the lack of FSR4 and Redstone).
it's the exact same config as KRK1. Good 'nuff for the mainstream.
Perhaps AMD thought Intel was going bankrupt, or frozen in place for the rest of the decade, and that the general public was going to be content with 8 CUs of RDNA3.5 for compute and low-power gaming all the way up to 2029.
Graphics does not matter in mobile.
CPU perf and BL do, and that's what mds1 gets you.
 
it's the exact same config as KRK1. Good 'nuff for the mainstream.

What's "mainstream"?
KRK1 is a Q1 2025 chip. Medusa Point will be 2 years later at best.

AMD's RDNA3 on SoCs is on track to become the next Intel Gen9 meme.


Graphics does not matter in mobile.

You should send an e-mail to all those OEMs putting millions of GeForce dGPUs into their laptops, to warn them that they're needlessly spending all this money on something that doesn't matter.
 
What's "mainstream"?
$800-ish SPP client and all the general commercial (think Thinkpad T14's and friends) laptops.
Medusa Point will be 2 years later at best.
well yeah that's the cadence.
AMD's RDNA3 on SoCs is on track to become the next Intel Gen9 meme.
Good news! Gen9 was the Intel golden age in mobile. They owned the place and milked it relentlessly.
You should send an e-mail to all those OEMs putting millions of Geforce dGPUs into their laptops, to warn them that they're needlessly spending all this money on something that doesn't matter.
dGFX laptops have healthy margins, but as a % of the laptop TAM they're a tiny bunch.
iGPUs themselves never mattered at all. Otherwise Kaveri would've been a market smash hit. too bad!
 
Good news! Gen9 was the Intel golden age in mobile. They owned the place and milked it relentlessly.

They "owned the place" because they had absolutely terrible competition in the CPU department. OTOH, all the 14nm+++++++plusplus successors with Gen9 led the market to forget Intel even had GPUs of its own, or to just mock Intel every time it tried to talk about them.


Intel is still paying dearly for all that "milking" even today. Looking at the state of Intel today and how much money they're losing just to become a footnote in GPU talk, it's safe to say it was a terrible strategy that AMD shouldn't follow.


iGPUs themselves never mattered at all. Otherwise Kaveri would've been a market smash hit. too bad!
Kaveri couldn't hit graphics performance estimates because Elpida went bankrupt before they could sell AMD any substantial amount of the GDDR5M modules the chip was supposed to use in laptops.
AMD engineered Kaveri for ~80GB/s total bandwidth from 128bit GDDR5M and got stuck with 34GB/s from laptop DDR3.

They probably had a whole roadmap of Fusion APUs with big iGPUs with GDDR5M that went down the drain because of Elpida's downfall.
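As a rough sanity check on those bandwidth figures, peak theoretical bandwidth is just bus width times transfer rate. The 5 GT/s GDDR5M rate below is inferred from the ~80 GB/s figure above, not stated in the thread; a minimal sketch:

```python
def bandwidth_gbs(bus_width_bits: int, transfer_rate_mts: float) -> float:
    """Peak theoretical bandwidth in GB/s: bytes per transfer
    times transfers per second."""
    return bus_width_bits / 8 * transfer_rate_mts * 1e6 / 1e9

# 128-bit GDDR5M at ~5 GT/s, the config Kaveri was reportedly designed around
gddr5m = bandwidth_gbs(128, 5000)   # 80.0 GB/s

# 128-bit (dual-channel) DDR3-2133, what laptop Kaveri actually got
ddr3 = bandwidth_gbs(128, 2133)     # ~34.1 GB/s

print(f"GDDR5M: {gddr5m:.1f} GB/s, DDR3: {ddr3:.1f} GB/s")
```

The numbers line up with the post: losing GDDR5M cut the chip's feedable bandwidth to well under half of what the iGPU was sized for.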
 
They "owned the place" because they had absolutely terrible competition in the CPU department.
It's called winning pal. They had the best CPUs and the best BL.
Intel is still paying dearly for all that "milking" even today
No they ain't, that was great product management.
Intel's problem is cultural decay wrt core IP (especially CPU IP).
Looking at the state of Intel today and how much money they're losing just to become a footnote in GPU talk, it's safe to say it was a terrible strategy that AMD shouldn't follow.
Intel GPGPU catastrophe is a self-inflicted wound.
They just couldn't decide whether they wanted a Nervana roadmap, an actual GPGPU roadmap, or a Habana roadmap.
Kaveri couldn't hit graphics performance estimates because Elpida went bankrupt before they could sell AMD any substantial amount of the GDDR5M modules the chip was supposed to use in laptops.
It still utterly annihilated Intel on PPA (and general perf).
But no one cared. iGPs just don't really matter.
 
No they ain't, that was great product management.

Calling Intel's 14++++++ era "great product management" is probably the weirdest piece of tech history revisionism I've seen so far.


It still utterly annihilated Intel on PPA (and general perf).
And got utterly annihilated by any tiny Nvidia dGPU because it choked on memory bandwidth, so the extra area for the larger iGPU was wasted (= burning money). Kaveri's graphics performance on DDR3 could be had with an iGPU half as wide.
 
Calling Intel's 14++++++ era "great product management" is probably the weirdest piece of tech history revisionism I've seen so far.
Their foundry exploded, but they still managed to field a solidly competitive mobile and desktop lineup.
It was all solid work. It's only later that the troubles emerge (you can visibly see the decay of their CPU IP quality).
And got utterly annihilated by any tiny Nvidia dGPU because it choked on memory bandwidth, so the extra area for the larger iGPU was wasted (= burning money).
Again, it crushed Intel.
Did it net AMD any major design wins? No. Because iGFX really really really does not matter. Your laptop SoCs are defined by CPU perf and BL.
 
I agree that for most people, battery life and CPU efficiency are important in laptops, especially premium laptops.

But Panther Lake WILL lead in that as well over Gorgon Point. The better iGPU is just a bonus for those laptops.
 
Obviously I am not owed any features that weren't promised when I bought my 7900 XTX, but what does or doesn't happen with FSR4 will definitely factor into the decision for my next upgrade.
 
I don't even get Medusa Point's GPU strategy, to be honest.
8CU RDNA3.5 iGPU is too big to use as value-driven 2D GPU for office work, but in 2027 it's also too weak to use in compute and gaming (not to mention lack of FSR4 and Redstone).
Isn't Medusa Point going to use RDNA 3.5+, which has better "AI"/ML support (similar to RDNA 4) than RDNA 3.5? If that's the case, official FSR 4 support could well come with it (even if they don't turn the unofficial RDNA 3/3.5 support into official support).

Edit: If that's not the case, then Medusa Point's iGPU would indeed seem average, kinda stagnant. XeSS XMX (image quality and frame-gen capability) is a lot ahead of FSR 3/3.1.

And got utterly annihilated by any tiny Nvidia dGPU
Yes, precisely, and its CPU was kinda terrible too against Haswell, esp. in ST perf and IPC. MT perf wasn't that good or consistent either. It fared worse in laptops (way worse ST perf and relatively bad efficiency/battery life) than on desktop.
 