The question is: how much are games (and other apps) actually using the GPU? The same is true of Apple's GPU. Both had to build the hardware before games would start to target it, but will we see that support actually grow? It's been very slow so far.
Software support is huge. It's why the Switch and Steam Deck punch well above their weight. But it also shows that there's more to making a device worth using than just performance/specs or features.
It's exciting to finally see these types of chips. Frankly, though, I'm disappointed, because I feel we could and should have had products like these several years earlier. Apple just did the very logical thing (the lowest of low-hanging fruit) and embarrassed the market, which had been refusing to make such chips: consumers don't really need a dGPU outside of gaming, and even for that, many could get by with an integrated GPU if you just made it strong enough.

AMD could and should have been building chips like these years ago (they already were for consoles), and they would have been popular. We likely would have had viable Steam boxes, which would have made the later move to a portable easier and better. It would have helped AMD immensely: both in showing up Intel (instead, AMD waited until Intel started talking about making such chips), and in keeping Nvidia from dominating as much, since I think console-like chips would have drawn a lot of development focus. There are posts of mine from years ago arguing that Zen could have really changed things if we'd gotten premium APUs from it at the outset.

The saddest part, though, is that these chips are arriving when games' use of the resources available looks about the worst it's been. The last time it was this bad, it pushed AMD to make Mantle, which led to DX12 and Vulkan. Further disappointing on that front: I think this is what AMD wanted to do back in the early 2010s (it was a big part of why they developed Mantle), but they were starved for resources, and then the construction cores weren't competitive (even though their design was actually aimed at enabling large mixed CPU/GPU chips).
I'm wondering if there's any chance we might see dGPUs come out of this from others. That would be interesting. Even Apple could benefit (3D modeling for artists and CAD, especially if they build 3D scanning into the Vision Pro: imagine being able to look at real objects, laser-scan them into models, and then adjust them on the fly). And it could get really interesting if anyone builds dGPUs that go very heavy on ray-tracing hardware (maybe leaving raster to a large integrated GPU).