Igor mentions 60W for GDDR6X, yet he also wrote 2.5W/module, and that would mean 2.5 * 12 = 30W.

I think the memory controller is not included in that 60W, and that's probably another ~10-20W, so 70-80W in total.
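A quick sanity check on how those figures add up, using only the numbers quoted above (the controller overhead is my guess, not a measurement):

```python
# Rough GDDR6X power budget from the figures above (estimates, not measurements)
modules = 12                 # module count from the 2.5W * 12 figure
per_module_w = 2.5           # Igor's per-module estimate
igor_subsystem_w = 60        # Igor's headline figure
controller_w = (10, 20)      # assumed memory-controller overhead

print(f"Sum of modules alone: {modules * per_module_w:.0f} W")        # 30 W
print(f"60W figure + controller: {igor_subsystem_w + controller_w[0]}-"
      f"{igor_subsystem_w + controller_w[1]} W")                      # 70-80 W
```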
As a continuation of my older post.
Link
I still question whether using HBM wouldn't be better than MCD+GDDR6.
It would consume less power than GDDR6.
It was estimated at 20W for 16GB HBM2 + 10W for the controller, or 30W in total.
Link
With RDNA3 you also have to include the fan-out in the power consumption, so I think ~80-90W is not unreasonable for MCD+GDDR6.
You would save 50-60W, and that is a lot, especially for mobile.
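To make the comparison explicit, here's a minimal sketch putting the two estimates side by side (all numbers are the rough estimates above):

```python
# Side-by-side of the two memory-subsystem power estimates above (rough estimates only)
mcd_gddr6_w = (80, 90)   # MCD + GDDR6 + fan-out estimate
hbm2_w = 30              # 16GB HBM2 + controller estimate from the linked post

print(f"Estimated savings with HBM: {mcd_gddr6_w[0] - hbm2_w}-{mcd_gddr6_w[1] - hbm2_w} W")  # 50-60 W
```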
The cost of 8GB HBM2 was $175 ($150 memory + $25 interposer) vs. $52-68 ($6.5-8.5 per module) for GDDR5 at the time.
Link
A huge difference, certainly, but this was in 2017.
In 2019 you could supposedly get 16GB of HBM2 (4 stacks) for $120, so $145 with the interposer, if that didn't get cheaper.
Link
The cost of GDDR6 in 2019 was $10.79-11.69 for 12-14Gbps 1GB modules if you bought 2,000 units. That would put it at $173-187 for 16GB, but big customers get up to a 40% discount, so only $104-112.
Link
Currently we already have 2GB modules, but at Digi-Key the price is $26.22 per unit, so $210 for 16GB, or $105 with a 50% discount.
Link
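Here's the 16GB GDDR6 arithmetic spelled out (list prices from the links above; the 40%/50% discounts are assumptions):

```python
# 16GB of GDDR6 from the prices above (list prices; the discounts are assumed)
# 2019: 1GB 12-14Gbps modules at $10.79-11.69 (2,000-unit pricing)
cost_2019 = (16 * 10.79, 16 * 11.69)                       # ~$173-187
cost_2019_disc = tuple(round(c * 0.6) for c in cost_2019)  # 40% discount -> ~$104-112

# Current: 2GB modules at $26.22 each (Digi-Key)
cost_now = 8 * 26.22          # ~$210
cost_now_disc = cost_now / 2  # assumed 50% discount -> ~$105

print(cost_2019, cost_2019_disc, round(cost_now), round(cost_now_disc))
```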
The RX 7900 XTX has 24GB of VRAM, and that would mean **$26.22 * 12 for VRAM + 6 * $6.2 for the MCDs; we are already at $352 just for the chips. If the discount is 40-50%, then $157-189 for VRAM + $37.2 for the MCDs, for a total of $194-226.**
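The corrected 7900 XTX calculation as a sketch (the $26.22 module price and $6.2 MCD figure are from above; the 40-50% discount is an assumption):

```python
# RX 7900 XTX chip cost for VRAM + MCDs, using the numbers above (the discount is assumed)
vram_list = 12 * 26.22                                           # 12x 2GB GDDR6 modules -> ~$315
mcd_cost = 6 * 6.2                                               # 6 MCDs -> $37.2

total_list = vram_list + mcd_cost                                # ~$352 at list price
vram_disc = (round(vram_list * 0.5), round(vram_list * 0.6))     # 40-50% discount -> $157-189
total_disc = (vram_disc[0] + mcd_cost, vram_disc[1] + mcd_cost)  # ~$194-226

print(round(total_list), [round(t) for t in total_disc])         # 352 [194, 226]
```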
For HBM you could either choose HBM2E with 4 stacks (32GB) for a total of 1843 GB/s (+92% over N31), or HBM3 with 2 stacks (32GB) for a total of 1639 GB/s (+71% over N31).
That should be enough to feed N31.
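The bandwidth math behind those percentages, as a small sketch (the per-stack pin speeds of 3.6Gbps for HBM2E and 6.4Gbps for HBM3 are my assumptions; they reproduce the +92%/+71% figures above against N31's 960 GB/s):

```python
# Bandwidth check vs. N31's 960 GB/s (assumed pin speeds: HBM2E 3.6Gbps, HBM3 6.4Gbps; 1024-bit per stack)
n31 = 960                         # GB/s, RX 7900 XTX memory bandwidth

hbm2e = 4 * 3.6 * 1024 / 8        # 4 stacks -> ~1843 GB/s
hbm3  = 2 * 6.4 * 1024 / 8        # 2 stacks -> ~1640 GB/s

print(f"HBM2E: {hbm2e:.0f} GB/s ({hbm2e / n31 - 1:+.0%} over N31)")   # ~+92%
print(f"HBM3:  {hbm3:.0f} GB/s ({hbm3 / n31 - 1:+.0%} over N31)")     # ~+71%
```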
My conclusion is that HBM would consume less power and cost less to make. I would be surprised if 2 stacks (32GB) of HBM3 cost more today than what 4 stacks (16GB) cost in 2019.
P.S. Does anyone have a subscription to TechInsights? They do Bill of Materials analyses for GPUs.
Edit: I wrongly calculated the cost of 24GB of VRAM, so I fixed it and the corrected numbers are bolded.