I don't know why they would be that mad, though. It's got essentially the same BoM as the 6800XT, which launched at a $650 MSRP that certainly had some profit margin baked in, and given how long the chip has been in production I would expect yields to be solid. It still commands a small premium over the 6800XT as well. My guess is AMD is "fine" with ~$600-$700 for the 6950XT, in the same way someone says they're "fine" when they're not.
No doubt a similarly performing 7800XT would be more efficient out of the box and carry a better margin (though maybe that's debatable), but it's not like AMD is totally MIA from the segment. Nvidia has pushed pricing high enough that you can imagine some at AMD wondering why they would pay to develop a new $600 part at all, when what they have is "fine" and they continue to "win" at prices below $600, where surely many sales are being made in the retail channel. By "winning" I mean their products are already highly competitive on all fronts, and they are probably wooing as many GPU-curious buyers as they can.
Sure, it's no EPYC or even a FireGL chip, but it's not like TSMC 7nm is the hot node they want to be using for those products anyway.
With regard to the 4070, the real sin is that the RDNA 2 parts are old, not that they are too slow or, with even a modicum of tuning, too power hungry. I am certain the power-consumption complaints are mostly lip service at this point, given how expansive CPU and GPU power budgets have grown in the last few years; anyone who really cares about power consumption is going to be capping frame rates, etc. to keep their power bill in check.
What I would like to see (for grins) is a power-usage graph at, say, a 90 fps cap at 1440p high settings on these cards. Everyone would argue about what those settings should be, I get it, but running uncapped FPS and measuring power draw is silly given how trivial it is to trim it down. I suppose it's like every test being run at "Ultra" settings, when we know there is likely something in that "Ultra" preset that kneecaps frame rates while being nearly imperceptible in game.
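For what it's worth, the mechanics of why a frame cap saves power are simple: instead of immediately starting the next frame, the game (or driver) sleeps out the remainder of each frame interval, and the GPU idles during that gap. A minimal sketch of the idea (function names are my own, not any real driver API):

```python
import time

def frame_limited_loop(render_frame, fps_cap, n_frames):
    """Call render_frame n_frames times, sleeping after each call so the
    loop never runs faster than fps_cap frames per second. The sleep is
    the whole trick: the hardware sits idle instead of burning power on
    frames you can't see anyway."""
    target = 1.0 / fps_cap  # minimum seconds each frame must take
    frame_times = []
    for _ in range(n_frames):
        start = time.perf_counter()
        render_frame()  # stand-in for the actual GPU work
        elapsed = time.perf_counter() - start
        if elapsed < target:
            time.sleep(target - elapsed)  # idle out the rest of the interval
        frame_times.append(time.perf_counter() - start)
    return frame_times
```

If the card can render a frame in 5 ms but you cap at 90 fps (~11 ms per frame), it spends more than half of every interval idle, which is exactly the power savings an uncapped benchmark never shows.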