I'm now beginning to wonder whether the 'new manufacturing process' issue that was cited as a cause for the IVB delay may be related to these odd temperature results. Namely, not an issue with the 22nm process itself, but something with assembly/packaging.
Why? Take a look at the difference in power/temperature results between the overclocked 3570k and 3770k. Total system power consumption differs by only 1W idle and 4W load... but there's a 10C idle and 21C load temperature difference? It would have been nice if they'd provided power consumption for a CPU-only load, but there definitely seems to be something off with that result. Yes, given the same power draw IVB should run hotter than SNB due to the smaller die size, but the drop in thermal transfer should fall somewhere between the process scaling (47%) and the die size reduction compared to SNB (74%). In the Tweaktown testing, comparing the non-overclocked to overclocked power/temperature delta (assuming the entire power delta is due to the CPU), the 2600k is at 2.8 W/C (2.8 watts causes a 1C temperature rise) while the 3770k is at 1.3 W/C and the 3570k is at 1.7 W/C. So compared to the 2600k, the 3770k at 46% is actually slightly worse than process scaling alone would predict, whereas the 3570k at 60% is pretty much right in the middle of expectations. So maybe their 3770k sample is older and suffered from the 'manufacturing process' issue, while the 3570k is newer and fixed?
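The back-of-the-envelope math above can be sketched as a few lines of Python. The W/C figures are the ones derived in the post (delta watts per delta degree C between stock and overclocked runs); the percentages are each chip's thermal efficiency relative to the 2600k baseline.

```python
# W/C = (overclocked power - stock power) / (overclocked temp - stock temp),
# i.e. how many extra watts of draw produce a 1C temperature rise.
# Higher W/C means better heat transfer out of the die.
watts_per_degree = {
    "2600k (SNB)": 2.8,
    "3770k (IVB)": 1.3,
    "3570k (IVB)": 1.7,
}

baseline = watts_per_degree["2600k (SNB)"]
for cpu, wpc in watts_per_degree.items():
    pct = wpc / baseline * 100
    print(f"{cpu}: {wpc} W/C -> {pct:.0f}% of SNB thermal transfer")
```

The 3770k lands at ~46%, below even the 47% process-scaling floor, while the 3570k lands at ~61%, between the 47% and 74% bounds, which is what makes the 3770k sample look suspect.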
