Sandy Bridge, Ivy Bridge and Haswell have all seen huge thermal improvements from delidding. Why is Intel not fixing this at the manufacturing level? By tightening up the tolerance on the IHS and applying a higher-quality thermal compound in an appropriate quantity, they could get 10-20C drops in temps. It seems like it would save money all around: less material in the IHS, less compound used, better thermal performance. So why not do it?
Edit: I'm sure they could figure out an easy way to "spray" on an appropriately thin layer of compound if mass production/automation is the issue.
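For a rough sense of why the interface matters so much, here's a back-of-the-envelope 1-D conduction estimate (delta_T = P * t / (k * A)). All the numbers are illustrative assumptions, not measured values for any specific chip, but with plausible inputs the difference lands right in the 10C ballpark:

```python
# Rough estimate of the temperature drop across the TIM layer between
# die and IHS: delta_T = P * t / (k * A).
# All values below are assumptions for illustration only.

def tim_delta_t(power_w, thickness_m, conductivity_w_mk, area_m2):
    """Temperature drop (K) across a thermal interface layer."""
    return power_w * thickness_m / (conductivity_w_mk * area_m2)

POWER = 77.0        # W, ballpark quad-core load power (assumed)
DIE_AREA = 160e-6   # m^2, ~160 mm^2 die (assumed)

# Stock: thick-ish paste layer with modest conductivity (assumed values)
stock = tim_delta_t(POWER, 100e-6, 5.0, DIE_AREA)

# Delidded: much thinner liquid-metal layer, far higher conductivity (assumed)
delidded = tim_delta_t(POWER, 25e-6, 38.0, DIE_AREA)

print(f"Stock TIM drop:  {stock:.1f} K")               # ~9.6 K
print(f"Delidded drop:   {delidded:.1f} K")             # ~0.3 K
print(f"Improvement:     {stock - delidded:.1f} K")     # ~9.3 K
```

Note that the layer thickness matters just as much as the compound itself, which is exactly why tightening the IHS tolerance would help alongside better paste.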
