First of all, your calculation looks odd: what defect density did you use? I used 0.14 defects/cm², which gives about 90% yield for the Zen 4 CCD, and that sounds reasonable. Did you use 300 mm wafers, as is the industry standard?
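For reference, here is a minimal sketch of the Poisson yield model behind that number, assuming a Zen 4 CCD area of roughly 70 mm² (the exact die area is my assumption):

```python
import math

def poisson_yield(defect_density_per_cm2: float, die_area_mm2: float) -> float:
    """Poisson yield model: fraction of dies with zero defects."""
    die_area_cm2 = die_area_mm2 / 100.0
    return math.exp(-defect_density_per_cm2 * die_area_cm2)

# Assumed inputs: 0.14 defects/cm^2, ~70 mm^2 for the Zen 4 CCD
print(f"{poisson_yield(0.14, 70):.1%}")  # ~90.7%
```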
So, going by the numbers from my previous post and applying your wafer costs, it comes out to 67 USD vs. 20.87 USD, a difference of about 46 USD, or roughly 320%. Factoring in your 14 USD for the IOD, there are still about 32 USD left per unit; after packaging, still around 20.
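The per-unit figures follow from the usual cost-per-good-die calculation; a rough sketch below, where the wafer price and candidate-die count are purely illustrative placeholders, not the actual values from the earlier posts:

```python
def cost_per_good_die(wafer_price_usd: float, candidate_dies: int, yield_rate: float) -> float:
    """Variable silicon cost per good die: wafer price spread over the good dies only."""
    good_dies = candidate_dies * yield_rate
    return wafer_price_usd / good_dies

# Placeholder inputs -- not the real wafer price or die count.
print(cost_per_good_die(wafer_price_usd=10_000, candidate_dies=700, yield_rate=0.90))  # ~15.87 USD
```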
And if you think that 20 bucks per unit in pure variable production cost is not significant, then I honestly don't know what to say. These costs set the lowest price at which it is even worth it for AMD to produce and sell a unit. All I am saying is that, to me, Raphael is a more likely basis for future low-cost desktop SKUs than Phoenix Point.
What I wrote was dies per 300 mm wafer, both good and bad ones; that's why I wrote "Just an example".
If you want to say that it's flawed, then yes, it is, but so is yours, because you count only the good dies. Where did you leave the bad ones? Many of them are still usable as cut-down versions; I think at least half of them can be reused.
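To put the "reuse roughly half of the bad dies" point in numbers, here is a rough sketch of how salvaged cut-down dies lower the effective cost per sellable die (all inputs are placeholders):

```python
def cost_per_sellable_die(wafer_price_usd: float, candidate_dies: int,
                          yield_rate: float, salvage_fraction: float) -> float:
    """Spread the wafer price over good dies plus salvaged (cut-down) defective dies."""
    good_dies = candidate_dies * yield_rate
    salvaged = candidate_dies * (1.0 - yield_rate) * salvage_fraction
    return wafer_price_usd / (good_dies + salvaged)

# Placeholder inputs; salvage_fraction=0.5 reflects "at least half can be reused".
print(cost_per_sellable_die(10_000, 700, 0.90, 0.5))  # ~15.04 USD vs. ~15.87 with no salvage
```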
I honestly don't understand why you are nitpicking over a $10-20 difference in production cost. It's still no more than $60, while the APU would sell for >$200. AMD would still have very healthy margins on it.
BTW, it's not like I said Phoenix Point has to be cheaper than a comparable Raphael. Why should it be?
Phoenix has a 12-CU RDNA3 IGP, which should perform similarly to or better than a GTX 1650 or RX 6400 4GB, and those cards cost ~€175-190 in my country.
I don't think it would be a problem even if Phoenix Point cost $50 more than a comparable Raphael; that would easily cover the higher production cost.
It would still be a lot cheaper than Raphael + a GTX 1650 (or RX 6400).
If someone doesn't need the strong IGP in Phoenix Point, or wants an even faster dGPU, they can just buy the cheaper Raphael or Intel + a dGPU if needed.