Idontcare has done a ton of tests with all sorts of thermal pastes etc. and IIRC he believes that Intel didn't cheap out as much as you might think.
Yeah, the TIM itself that Intel uses is every bit as good as NT-H1 (a premium TIM in its own right).
The problem with using the TIM instead of the solder is the overall height of the gap between the CPU die and the underside of the IHS.
If Intel had filled that gap with solder instead of TIM, the gap wouldn't be a problem. But filling that large a gap with any non-solder TIM is going to cause the thermal bottlenecking we see with IB; it's not a matter of expensive versus cheap TIM at that point.
Delidding pays off because we reduce the gap in the process of replacing the stock TIM.
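To put some rough numbers on that, here's a quick sketch of how the thermal resistance of the interface layer scales with its thickness (R = t / (k*A)). The die area, conductivities, and gap thicknesses below are assumed round figures for illustration, not measured values:

```python
# Back-of-envelope: thermal resistance of the TIM layer scales linearly
# with its thickness, R = t / (k * A). All numbers are illustrative
# assumptions, not measured Ivy Bridge values.

def layer_resistance_k_per_w(thickness_m, conductivity_w_mk, area_m2):
    """Conductive thermal resistance of a flat layer: R = t / (k * A)."""
    return thickness_m / (conductivity_w_mk * area_m2)

DIE_AREA = 160e-6   # ~160 mm^2 die, expressed in m^2 (assumed)
K_PASTE  = 8.0      # W/(m*K), decent non-metallic paste (assumed)
K_SOLDER = 50.0     # W/(m*K), indium solder (assumed)

scenarios = {
    "stock TIM, thick gap (~200 um)": (200e-6, K_PASTE),
    "delidded, thin TIM (~25 um)":    (25e-6,  K_PASTE),
    "soldered IHS (~100 um indium)":  (100e-6, K_SOLDER),
}

for name, (t, k) in scenarios.items():
    r = layer_resistance_k_per_w(t, k, DIE_AREA)
    # Temperature rise across just this layer at 100 W of die power.
    print(f"{name}: {r:.3f} K/W -> {r * 100:.1f} K at 100 W")
```

With numbers in that ballpark, the thick paste-filled gap costs on the order of 15 K at 100 W, while a thin paste layer or a soldered joint costs only a couple of K, which is why the gap height matters more than the paste quality.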
I think someone (IDC) mentioned the possibility that the thermal paste accommodates extra flex, the sort of thing that plagued Nvidia in the bumpgate incident.
It's way more frustrating on the other team, where their CPUs could suck down 200W despite being spec'd with a 125W TDP.
As an engineer, that is far and away my top concern with delidding. There are mechanical forces in play because of thermal expansion and the mismatch in materials (silicon's coefficient of thermal expansion versus that of copper).
It is not unreasonable to suspect that Intel's engineers steered clear of rigid solder and opted for the more compliant TIM layer precisely out of concern over the cyclical thermomechanical stresses generated as the CPU die heats up and cools down.
But we cannot be certain either. The mechanical forces are surely present, but they may not be large enough to be a practical concern. And the decision to avoid solder may have been purely one of cost reduction, in which case we enthusiasts aren't compromising anything by delidding, as it were.
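For a rough sense of the scale of that mismatch, here's a back-of-envelope sketch. The CTE values, die span, and temperature swing are all assumed round numbers, not Intel's figures:

```python
# Back-of-envelope look at why the CTE mismatch matters for a rigid
# (soldered) joint. All values are assumed/illustrative.

CTE_SILICON = 2.6e-6    # 1/K, roughly (assumed)
CTE_COPPER  = 17.0e-6   # 1/K, roughly (assumed)

DIE_HALF_SPAN = 10e-3   # m, ~10 mm from die centre to edge (assumed)
DELTA_T = 60.0          # K, idle-to-load temperature swing (assumed)

# Free (unconstrained) expansion of each material over that span.
dx_si = CTE_SILICON * DIE_HALF_SPAN * DELTA_T
dx_cu = CTE_COPPER * DIE_HALF_SPAN * DELTA_T

# Mismatch the interface layer has to absorb every heat/cool cycle.
mismatch = dx_cu - dx_si
print(f"Si expands  {dx_si * 1e6:.2f} um")
print(f"Cu expands  {dx_cu * 1e6:.2f} um")
print(f"Mismatch    {mismatch * 1e6:.2f} um per cycle at the die edge")
# A paste layer can shear to take this up; a rigid solder joint has to
# carry it as stress, which is where the fatigue concern comes from.
```

A few microns of differential movement per cycle sounds small, but repeated over thousands of heat/cool cycles it is exactly the kind of loading that drives fatigue in a rigid joint, and that a compliant paste simply shears through.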