Just curious, since you are quite vocal about AMD putting too few CUs on its APUs: would you prefer AMD putting logic into its APUs that can accelerate something DLSS-like, instead of increasing the CU count?
That depends on quality vs performance vs adoption rate. A DLSS-like implementation can make a 720p game upscaled to 1080p look equal to or better than the same game rendered without upscaling at 900p, with higher fps, and it avoids having to use lower-than-native screen resolutions. That's great for an APU.
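To put rough numbers on why the render resolution matters so much for fps (my own back-of-the-envelope math, not from any benchmark): raster workload scales roughly with the number of pixels rendered, so the ratios between the common APU resolutions tell most of the story.

```python
# Rough pixel-count comparison: shader/raster cost scales roughly
# with pixels rendered, so these ratios approximate the workload
# saved by dropping the render resolution below 1080p.
resolutions = {
    "720p": (1280, 720),
    "900p": (1600, 900),
    "1080p": (1920, 1080),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px:,} pixels, {px / base:.0%} of 1080p")
# 720p is ~44% of the pixels of 1080p, 900p is ~69%
```

So rendering at 720p and upscaling to 1080p cuts the per-frame pixel work by more than half, which is exactly the headroom an APU is missing.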
Bear with me on this: what is always important is to use the same Windows screen resolution as your game resolution, because that allows you to use borderless mode. And if the game does not support using a different renderer resolution, you are forced into fullscreen when the iGPU (or GPU) can't provide enough fps.
So what happens right now with an APU? They run games at 720p, 900p, and sometimes 1080p, but most games run at 900p or 720p, as 1080p has a large fps impact on APUs. But no one has 720p or 900p screens anymore (actually, they are still around in my country, but you know what I mean). So with a 1080p monitor + APU you are forced into fullscreen in most games, because dropping the Windows resolution below 1080p is a no-go. AMD has a hardware upscaler that upscales 720p/900p fullscreen games to your Windows resolution. Not everyone knows it's there, and it's disabled by default, but it works well, and it solves all the issues of running below native resolution on some monitors, including blurry images and weird noises.
Personally I hate using fullscreen, but that's just my opinion. So, what could a DLSS-like solution do for an APU? Well, if Picasso had it, you could play games with a screen resolution of 1080p (or 1440p, because if I also do work on the PC I like having a higher resolution), in borderless mode, with a renderer resolution of 720p, visual quality of 900p (or higher), and higher fps than using 900p. Those are the pros.
The cons would be... adoption rate: if it's like DLSS, where maybe 10-15% of new games use it, it's worthless. Memory bandwidth: how heavy is it vs just more raster perf? Size: how big would this logic be? And finally quality: it needs to produce image quality that looks better than one rendered at a higher resolution, with less fps impact.
So I would need to know all of that to be sure whether it's better to have that or just more raw power. At any rate, software upscalers are used in a lot of games these days, and they help A LOT on low-end hardware like APUs, or when you are trying to run a game at a higher resolution than you should in order to keep it borderless. DLSS works just like that, only it produces way better results. So I really don't understand the hate just because Nvidia is pushing its use in RT games.