Why wouldn't it be close to fusion?
The hardware is just executing an abstraction.
If GMA had supported OpenCL (we might also count the former OpenGL compute stuff), this would still have been like a CPU and a GPU connected via QPI instead of AGP/PCIe, with the memory controller even sitting on the GPU. They got closer together, but the real differences only appeared when, for example, memory pages were shared between the different compute hardware instances. I think one main goal of Fusion/HSA was to reduce the overhead of the "old-world" way of doing CPU-GPU computing, as depicted here. If GMA had had OpenCL support, it might even count as package-level "Fusion" according to the linked roadmap.
The 1.2B is probably way off. But say it's $100M for design and masks in the same process vs. $10 per unit for an interposer; one has to be very careful about which path to take.
For a company that makes no profit, $100M is a huge amount and exposes it to threats and risk. Risk is money. That's why you pay for insurance, or why huge corporations invest in derivatives: to minimize risk.
Think about this: Intel, in Otellini style, goes for the economic impact and reaction. They know AMD has invested $100M in some huge APU for a specific segment. What do you think their response is?
Indeed, it's also a matter of costs. So if AMD in your example doesn't plan to sell more than 10M units over the next few years, the interposer might be the better choice. But aside from whether that's likely or not, there might be other issues, like limited interposer production or packaging capacity.
I think whether or not to invest $100M depends a lot on a product's prospects, not only on the available money.
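The break-even arithmetic in this exchange can be sketched with the thread's own illustrative numbers ($100M one-time design/mask cost vs. a $10 per-unit interposer adder; both figures are the posters' rough guesses, not real cost data, and the model ignores any other per-unit cost differences):

```python
# Rough break-even model for the monolithic-die vs. interposer trade-off.
# All numbers are the thread's illustrative guesses, not real cost data.

MONOLITHIC_NRE = 100_000_000   # one-time design + mask cost, USD
INTERPOSER_ADDER = 10          # extra cost per unit for the interposer, USD

def cheaper_option(units: int) -> str:
    """Return which path has the lower total cost at a given sales volume,
    assuming identical per-unit costs apart from the interposer adder."""
    monolithic_total = MONOLITHIC_NRE           # NRE amortized over all units
    interposer_total = INTERPOSER_ADDER * units
    return "monolithic" if monolithic_total < interposer_total else "interposer"

break_even = MONOLITHIC_NRE // INTERPOSER_ADDER
print(break_even)                   # 10000000 units
print(cheaper_option(5_000_000))    # interposer
print(cheaper_option(20_000_000))   # monolithic
```

With these toy numbers the crossover sits at exactly 10M units, which is why the projected sales volume, not just the available cash, decides the path.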
If given the choice between a cookie-cutter system with an APU and a raw CPU+dGPU at the same cost and largely the same performance (for the sake of argument, let's say the APU is slightly faster), which one would the average layman choose? That's what mindshare stands for. You can actually witness this in the laptop market, where bottom-tier dGPUs are coupled with mainstream -U chips, delivering similar performance to Iris -U chips at similar cost and worse power consumption. Sometimes it gets so bad that even the mainstream GT2 iGPU has the same or better performance as the dGPU, and the latter does nothing but waste power. Want to take a shot at which version sells more units?
That APU vs. CPU+dGPU discussion has been repeated many times. And the layman might buy a system with "Quad-core processor" and "Radeon graphics" stickers on it. Does he care that it's an iGPU system? Why did AMD sell APUs all these years? System builders/OEMs might prefer to put one component on the board (plus RAM) instead of two.
What would the average layman do, in your opinion? And is it black and white, or can we talk percentages? I don't want to hear that 50.1% would buy discrete GPUs, so the argument has been won; this is no election. 49.9% can still be enough to do business.
I don't care about stupid laptop configurations. There are also good ones; nobody needs to buy the bad designs. How many systems are sold that pair Iris graphics with a 2-3x faster dGPU?