Will we see CPUs with integrated vapor chamber heat spreaders?

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
This is a question that's been popping into my mind from time to time recently. As process technology improves and hot transistors are concentrated into ever smaller areas, we're seeing clear signs that a simple copper IHS sitting between a desktop CPU die and an HSF/water block is becoming a limitation. As I'm (barely) old enough to remember the days of bare AMD CPUs, third-party CPU shims and cracked dice in the early 2000s, I doubt we'll be returning to non-IHS packaging any time soon for the desktop/DIY crowd. Laptops already go this route, but that's a far more controlled environment.

As such, I'm wondering if we'll see high-end SKUs where the IHS is a tiny vapor chamber. Of course, this would be a significant cost increase over the cast/pressed copper lids used today, but as 120W+ chips shrink towards/below 200mm2, might it become a necessity?

Of course, Intel seems to be going in the opposite direction, with lower-end SKUs cheaping out on cooling in any way they can, and making this work by improving power efficiency rather than performance. Above 60-90W, though, this seems like it would backfire significantly, as shown by the prevalence and results of delidding chips like the 7700K.

On the other hand, higher-end chips have the IHS soldered on. Would this even be possible with a vapor chamber? Not only would soldering require far more energy (again, increasing costs) to heat the package to the solder's melting point (around 170°C for the Ryzen series, according to Der8auer), but could a vapor chamber survive temperatures in that range without taking damage? Would this require manufacturers to adopt "liquid metal" TIM?

Also, might we see third-party vapor chamber IHSes marketed to delidders? Would this at all be possible without breaking cooler compatibility due to the added thickness? And what about mounting pressure? LGA CPUs need relatively high mounting pressure to maintain a secure connection to the motherboard - could an ultra-thin IHS vapor chamber withstand this kind of pressure over time without buckling?

What do you think?
 

NTMBK

Lifer
Nov 14, 2011
10,496
5,947
136
Intel are too cheap to even solder down their IHS. No way they'll go for vapor chamber!
 

frowertr

Golden Member
Apr 17, 2010
1,372
41
91
Never. Heat dissipation is a form factor issue. It's not for Intel to decide how OEMs design the inside of a chassis to accomplish this feat.
 

NTMBK

Lifer
Nov 14, 2011
10,496
5,947
136
Never. Heat dissipation is a form factor issue. It's not for Intel to decide how OEMs design the inside of a chassis to accomplish this feat.

Intel could replace their 20-year-old ATX motherboard layout with something better suited to today's >100W CPUs and >200W GPUs. Something with a fixed CPU socket location and GPU socket location - yes, socket, use MXM or something like it - so that cases can be designed with integrated and efficient cooling.
 

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
Never. Heat dissipation is a form factor issue. It's not for Intel to decide how OEMs design the inside of a chassis to accomplish this feat.
How does this have anything to do with the fundamental problem of getting heat off a densely packed die at a sufficient rate to avoid overheating as dies shrink ever further? At a certain point, the size, shape and functionality of the HSF stop mattering, because the IHS's ability to spread heat out enough to transfer it efficiently to the HSF diminishes with die/process shrinkage. What form factor-specific innovations can ameliorate the problem that the hot part of the CPU is ~10% of your heatsink's base? What does having 10 heatpipes matter if only one of them is being heated by any significant amount? Are you arguing that the die "form factor" is the issue? I guess that can be said, but unless you propose an entirely new way of IC manufacturing, how does this matter? I don't think you quite understand what I'm asking.
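To put a rough number on that "~10% of your heatsink's base" figure, here's a quick back-of-the-envelope sketch; the die size and cooler base size are assumed round numbers for illustration, not measurements of any specific product:

```python
# Rough coverage estimate: how much of a tower cooler's contact plate sits
# directly above the CPU die. All dimensions are illustrative assumptions.
die_area_mm2 = 122.0                 # assumed ~122 mm^2 quad-core die
base_w_mm, base_d_mm = 40.0, 40.0    # assumed 40x40 mm cooler contact plate

base_area_mm2 = base_w_mm * base_d_mm
coverage = die_area_mm2 / base_area_mm2
print(f"Die covers ~{coverage:.0%} of the cooler base")   # ~8% with these numbers
```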
Intel could replace their 20-year-old ATX motherboard layout with something better suited to today's >100W CPUs and >200W GPUs. Something with a fixed CPU socket location and GPU socket location - yes, socket, use MXM or something like it - so that cases can be designed with integrated and efficient cooling.
See above. Modern heatsinks and AIOs are perfectly capable of dissipating >200W of heat. The biggest issue coming up is allowing them to do their job (i.e. getting the heat to the heatsink), not the form factor in which they're housed.
Care to expand on that?
Intel are too cheap to even solder down their IHS. No way they'll go for vapor chamber!
Well, they do so on HEDT and server chips. For mobile, this is a non-issue as the chips have no IHS and thus everything is left to the OEM. The only parts where they cheap out are the regular desktop parts. On $1000+ server chips, what would stop them from doing this?
 

frowertr

Golden Member
Apr 17, 2010
1,372
41
91
Are you arguing that the die "form factor" is the issue? I guess that can be said, but unless you propose an entirely new way of IC manufacturing, how does this matter? I don't think you quite understand what I'm asking.

Have they made statements alluding to heat being the stopping point for future development?

Everything I've read points more towards a silicon issue with such small ICs. I've read nothing on heat other than that everything is getting cooler, NOT hotter.

Intel would much rather let OEMs take on heat dissipation issues. Servers, for example, need much more heatsink and airflow than consumer computers do. It's not a one-size-fits-all solution.
 

scannall

Golden Member
Jan 1, 2012
1,960
1,678
136
Seems like something that IBM would do for their POWER series. Cost isn't an issue on those. But for consumer-level CPUs? Won't happen.
 

RichUK

Lifer
Feb 14, 2005
10,341
678
126
Until this is a problem, there is no need to develop such solutions.
 

Rifter

Lifer
Oct 9, 1999
11,522
751
126
LMAO, Intel has REGRESSED from solder to shitty TIM and you think they'll spring for a vapor chamber? lol

CPU cooling, for Intel at least, is actually moving backwards, not forwards.

I could see this for server chips, but there is a 0% chance this will happen on desktops.
 

Jan Olšan

Senior member
Jan 12, 2017
604
1,199
136
The Wraith Spire cooler that AMD puts in the box with the Ryzen 7 1700 and the Ryzen 5 1500X and 1600 actually uses one: https://www.youtube.com/watch?v=p8qeLXjiVms&feature=youtu.be&t=1m45s

Integrating it into the IHS is another thing, but I'd argue that it probably isn't a good idea, plus it still doesn't solve the TIM problem (of chip-to-IHS transfer). If you are concerned about this, a better idea is either to make the IHS thinner, or to do away with it and use the bare-die-with-metal-bracket/spacer approach that GPU BGA packages have.
 

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
Have they made statements alluding to heat being the stopping point for future development?

Everything I've read points more towards a silicon issue with such small ICs. I've read nothing on heat other than that everything is getting cooler, NOT hotter.

Intel would much rather let OEMs take on heat dissipation issues. Servers, for example, need much more heatsink and airflow than consumer computers do. It's not a one-size-fits-all solution.
A lot of engineering effort in CPU design these days goes into avoiding grouping the hot parts of the chip too close together. We haven't quite reached the point where large amounts of dark silicon are necessary to avoid thermal runaway, but cooling is getting progressively more difficult at any given wattage, yes (and for high-end parts, a node shrink is seen as an opportunity to increase performance, not to lower power usage). Although it's unclear which will come first (this, or the end of silicon process shrinks), there's bound to be a point at which the hot parts are so small and grouped so closely together as to necessitate leaving large parts of the die empty simply to avoid thermal runaway - at which point dies would grow due to the added dark silicon, increasing manufacturing costs.

As for "everything getting cooler NOT hotter", that's debatable. In the low-end/midrange consumer space, sure. But high-end consumer chips and HEDT chips have stayed at the same wattages or risen in the past 10 years. A persistent rumor now is that Intel will be launching its first >100W 4c8t CPU since the first Core i7 series this year. As process improvements slow down, perf/w is definitely not going to keep dropping like it has. I don't think we'll see official 200W HEDT chips simply because of the demands this would place on the entire PC (VRM, ventilation, so on), but I do think we'll be seeing higher-wattage enthusiast/HEDT CPUs in coming years.

The point being: a 95W chip still produces 95W of heat whether it's on 22nm, 14nm or 7nm - the difference is the area that heat is spread over, and the effect this has on further dissipation. The more concentrated the heat, the faster and hotter the chip will heat up, and the higher the demands on the thermal conductivity of the IHS.
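Some very rough arithmetic to illustrate the point (the hot-area figures are assumed round numbers, not measured die data):

```python
# The same 95 W spread over a (hypothetically) shrinking hot area.
power_w = 95.0
assumed_hot_area_mm2 = {"22nm-class": 80, "14nm-class": 40, "10/7nm-class": 20}

for node, area in assumed_hot_area_mm2.items():
    print(f"{node}: {power_w / area:.1f} W/mm^2 over {area} mm^2")
# Flux climbs from ~1.2 to ~4.8 W/mm^2: the same heat, ever more concentrated.
```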

LMAO, Intel has REGRESSED from solder to shitty TIM and you think they'll spring for a vapor chamber? lol

CPU cooling, for Intel at least, is actually moving backwards, not forwards.

I could see this for server chips, but there is a 0% chance this will happen on desktops.
Define "desktops." I mentioned 120W+ CPUs, which would encompass the Intel HEDT line as well as their server line. Do you see it as impossible there too? Not to mention that I directly address your "point" of Intel cheaping out on low-end chips in the OP. Or did you not read that? 'Cause you seem to be arguing against something I never said.
 

Charlie22911

Senior member
Mar 19, 2005
614
231
116
It’s not just about the TIM; I have a Broadwell-E system and those are soldered. At 4.2GHz @ 1.275V, AVX workloads see thermals go crazy, and that’s with a full custom water loop including 6x 35x120mm of radiator area.

The silicon just can’t move heat out of the functional units and into the heat spreader fast enough and I don’t see a vapor chamber IHS changing that moving forward.

IIRC there was talk of using dice with fluid capillaries for cooling many moons ago; I think something along those lines will end up being necessary as we get denser designs.

EDIT - Spelling
 

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
The silicon just can’t move heat out of the functional units and into the heat spreader fast enough and I don’t see a vapor chamber IHS changing that moving forward.

IIRC there was talk of using dice with fluid capillaries for cooling many moons ago; I think something along those lines will end up being necessary as we get denser designs.
I remember reading about that too, but I can't imagine that happening outside of entirely bespoke systems. Intriguing concept, though. But why don't you think a vapor chamber IHS would change the behaviour you describe? After all, the great strength of vapor chambers is distributing heat quickly and evenly (and, from a quick Google search, 5-100x better than copper). As such, (in a perfect setup) it would very noticeably lower the temperatures of any hot spots on the die.
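As a crude illustration of why that matters, here's a 1D lateral-conduction sketch (Fourier's law). The geometry and the vapor chamber's effective conductivity (taken as ~10x copper, the low end of that range) are assumptions, and real spreading is 2D/3D, so only the ratio is meaningful:

```python
# Crude 1D sketch: temperature drop when heat has to travel sideways across a lid.
# Geometry and the vapor chamber's effective conductivity are assumptions.
def lateral_dt(power_w, length_m, k_w_per_mk, cross_section_m2):
    return power_w * length_m / (k_w_per_mk * cross_section_m2)  # Fourier's law, 1D

p = 50.0                 # assume 50 W has to move sideways from the hot spot
length = 0.015           # 15 mm from the hot spot to the edge of the lid
cross = 0.030 * 0.003    # 30 mm wide x 3 mm thick lid cross-section, in m^2

print("copper (k ~400):         ", round(lateral_dt(p, length, 400.0, cross), 1), "K")
print("vapor chamber (k ~4000): ", round(lateral_dt(p, length, 4000.0, cross), 1), "K")
```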
 

Rifter

Lifer
Oct 9, 1999
11,522
751
126
A lot of engineering effort in CPU design these days goes into avoiding grouping the hot parts of the chip too close together. We haven't quite reached the point where large amounts of dark silicon are necessary to avoid thermal runaway, but cooling is getting progressively more difficult at any given wattage, yes (and for high-end parts, a node shrink is seen as an opportunity to increase performance, not to lower power usage). Although it's unclear which will come first (this, or the end of silicon process shrinks), there's bound to be a point at which the hot parts are so small and grouped so closely together as to necessitate leaving large parts of the die empty simply to avoid thermal runaway - at which point dies would grow due to the added dark silicon, increasing manufacturing costs.

As for "everything getting cooler NOT hotter", that's debatable. In the low-end/midrange consumer space, sure. But high-end consumer chips and HEDT chips have stayed at the same wattages or risen over the past 10 years. A persistent rumor now is that Intel will be launching its first >100W 4c8t CPU since the first Core i7 series this year. As process improvements slow down, perf/W is definitely not going to keep improving like it has. I don't think we'll see official 200W HEDT chips simply because of the demands this would place on the entire PC (VRMs, ventilation, and so on), but I do think we'll be seeing higher-wattage enthusiast/HEDT CPUs in the coming years.

The point being: a 95W chip still produces 95W of heat whether it's on 22nm, 14nm or 7nm - the difference is the area that heat is spread over, and the effect this has on further dissipation. The more concentrated the heat, the faster and hotter the chip will heat up, and the higher the demands on the thermal conductivity of the IHS.


Define "desktops." I mentioned 120W+ CPUs, which would encompass the Intel HEDT line as well as their server line. Do you see it as impossible there too? Not to mention that I directly address your "point" of Intel cheaping out on low-end chips in the OP. Or did you not read that? 'Cause you seem to be arguing against something I never said.

HEDT maybe, but I still doubt it. Fact is, desktops are such a small market share nowadays (and HEDT much, much less) that I don't see Intel spending the money on them for this. Unless AMD starts to hurt them and takes a large share of the market, then maybe they'll look for ways to draw customers back, and then I could see something like this being one of them, but even then likely only on the HEDT line.
 

Charlie22911

Senior member
Mar 19, 2005
614
231
116
I remember reading about that too, but I can't imagine that happening outside of entirely bespoke systems. Intriguing concept, though. But why don't you think a vapor chamber IHS would change the behaviour you describe? After all, the great strength of vapor chambers is distributing heat quickly and evenly (and, from a quick Google search, 5-100x better than copper). As such, (in a perfect setup) it would very noticeably lower the temperatures of any hot spots on the die.

Thermal conductivity of copper is ~400 W/m K, while silicon is ~149 W/m K (both values from Wikipedia); that is a hard bottleneck that cannot be removed by putting something on top of the die AFAIK.

But also be aware that I’m an enthusiast, so I’m likely oversimplifying the problem / just plain wrong.
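For a sense of scale, here's a simplistic 1D Fourier's-law sketch (dT = q * t / k). The hot-spot area and layer thicknesses are assumptions, and real heat flow spreads out in 3D, so treat the numbers as an upper-bound illustration rather than anything precise:

```python
# 1D conduction through one layer: dT = q * t / k, with q = power / area.
# Hot-spot area and layer thicknesses below are assumptions for illustration.
def layer_dt(power_w, area_mm2, thickness_mm, k_w_per_mk):
    q = power_w / (area_mm2 * 1e-6)               # heat flux in W/m^2
    return q * (thickness_mm * 1e-3) / k_w_per_mk

power, hot_area = 95.0, 50.0                      # 95 W through ~50 mm^2 of hot silicon
print("0.5 mm of silicon (k~149):", round(layer_dt(power, hot_area, 0.5, 149.0), 1), "K")
print("2.0 mm of copper  (k~400):", round(layer_dt(power, hot_area, 2.0, 400.0), 1), "K")
```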
 

maddie

Diamond Member
Jul 18, 2010
5,185
5,580
136
I remember reading about that too, but I can't imagine that happening outside of entirely bespoke systems. Intriguing concept, though. But why don't you think a vapor chamber IHS would change the behaviour you describe? After all, the great strength of vapor chambers is distributing heat quickly and evenly (and, from a quick Google search, 5-100x better than copper). As such, (in a perfect setup) it would very noticeably lower the temperatures of any hot spots on the die.
The heat energy at the source has only a certain area [the die area] through which to conduct. Even if the surface of the die is very cool, internal conduction becomes the limiting factor.

If silicon remains the main material, then the only way I see long-term improvement is to have higher-heat-conduction pathways within the die to reduce local hotspots. Through-silicon vias, but for heat, not electrons? Dark silicon leads to bigger dies and is probably less efficient, as you're still stuck with silicon's heat conduction rates.
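A toy model of that idea, treating the vias as parallel heat paths through the die; the via material and fill fractions are purely hypothetical:

```python
# Area-weighted (parallel-path) thermal conductivity of silicon with high-k vias.
# Diamond is used only as a hypothetical via material; fill fractions are made up.
def k_effective(k_bulk, k_via, via_fill):
    return (1.0 - via_fill) * k_bulk + via_fill * k_via

K_SILICON, K_DIAMOND = 149.0, 2000.0   # approximate bulk values, W/m-K
for fill in (0.00, 0.05, 0.10):
    print(f"{fill:.0%} via fill -> effective k ~ {k_effective(K_SILICON, K_DIAMOND, fill):.0f} W/m-K")
```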
 

maddie

Diamond Member
Jul 18, 2010
5,185
5,580
136
Thermal conductivity of copper is ~400 W/m K, while silicon is ~149 W/m K (both values from Wikipedia); that is a hard bottleneck that cannot be removed by putting something on top of the die AFAIK.

But also be aware that I’m an enthusiast, so I’m likely oversimplifying the problem / just plain wrong.
You're right. Physical laws rule.
 

Bouowmx

Golden Member
Nov 13, 2016
1,150
553
146
Somebody school me: why not adopt direct-die cooling [as] in GPUs?

Edits added in brackets
 

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
Somebody school me: why not adopt direct-die cooling in GPUs?
GPUs? They already use that. You mean CPUs? Because it's already been done, and that experience showed how easily people cracked their CPUs when mounting coolers on an unprotected die.
 

Excessi0n

Member
Jul 25, 2014
140
36
101
You could use better heatspreader materials. Copper/diamond composites can have substantially higher conductivities than pure copper. A nanotube composite would be even better, but the tubes would need to be aligned; nanotubes have fantastically high conductivity along their long axis, but are otherwise horrible.

I can't imagine Intel doing this, though, if they don't even use solder on heatspreaders.

If silicon remains the main material, then the only way I see long-term improvement is to have higher-heat-conduction pathways within the die to reduce local hotspots. Through-silicon vias, but for heat, not electrons? Dark silicon leads to bigger dies and is probably less efficient, as you're still stuck with silicon's heat conduction rates.

Having thermal pathways throughout the interior of the chip is inevitable. It's something that will be necessary when we start making 3D logic in addition to 3D storage.

It seems to me like the ideal material for a thermal TSV would be diamond, grown directly on/in the die using chemical vapor deposition.
 

eek2121

Diamond Member
Aug 2, 2005
3,461
5,114
136
I don't understand what all this chatter is about cooling. Cooling isn't an issue for desktop CPUs. Case in point: My 32nm Core i7 2600k runs at 5 GHz, but doesn't break 55 degrees C on my desktop under full load with a quality closed-loop liquid cooler. A good air cooler will keep it under 60. Is there some heat issue that you are having with your setup? I expect the heat dissipation issue to get better, not worse, with die shrinks.
 

WhoBeDaPlaya

Diamond Member
Sep 15, 2000
7,415
404
126
I don't understand what all this chatter is about cooling. Cooling isn't an issue for desktop CPUs. Case in point: My 32nm Core i7 2600k runs at 5 GHz, but doesn't break 55 degrees C on my desktop under full load with a quality closed-loop liquid cooler. A good air cooler will keep it under 60. Is there some heat issue that you are having with your setup? I expect the heat dissipation issue to get better, not worse, with die shrinks.
Sandy was the last decent line with a good thermal interface between the die and IHS (not counting the still-soldered HEDT parts).
It all went to crap from Ivy Bridge onwards. Yay, artificial segmentation :mad:

Want a modern soldered chip? Ryzen or Intel HEDT.
 

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
I don't understand what all this chatter is about cooling. Cooling isn't an issue for desktop CPUs. Case in point: My 32nm Core i7 2600k runs at 5 GHz, but doesn't break 55 degrees C on my desktop under full load with a quality closed-loop liquid cooler. A good air cooler will keep it under 60. Is there some heat issue that you are having with your setup? I expect the heat dissipation issue to get better, not worse, with die shrinks.
Sandy was the last decent line with a good thermal interface between the die and IHS (not counting the still-soldered HEDT parts).
It all went to crap from Ivy Bridge onwards. Yay, artificial segmentation :mad:

Want a modern soldered chip? Ryzen or Intel HEDT.
Not to mention that this entirely disregards the very premise of this discussion: that process node shrinks concentrate hot spots into ever smaller areas, compounding issues of sufficient and effective heat dissipation.

@eek2121 This is a fundamental IC design issue: a 95W 14nm chip (given optimal area scaling) concentrates its heat output into 1/4 the area of a 28nm chip of the same wattage. As such, there's less contact area for coolers, and because the entire die shrinks, the hot spots are clustered closer together. Your 2600K is a 216mm2 die, while a 6700K (couldn't find a number for KBL) is a ~122mm2 die with a significantly larger portion of the die used by fixed-function hardware (video encoders and decoders, etc.) and the iGPU (24 EUs vs 12). So, under a heavy pure-CPU load, the same amount of heat is being produced in an area that's probably around 1/3 of the size, if not smaller.
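Putting rough numbers on that (die areas as above; the CPU-core share of each die is an eyeballed assumption from the die shots below, not a measured figure):

```python
# The same ~95 W CPU-only load over the (assumed) core area of each die.
power_w = 95.0
core_area_mm2 = {
    "Sandy Bridge 4C (216 mm^2 die)": 216.0 * 0.45,  # assume cores ~45% of the die
    "Kaby Lake 4C (~122 mm^2 die)":   122.0 * 0.35,  # assume cores ~35% of the die
}
for name, area in core_area_mm2.items():
    print(f"{name}: ~{power_w / area:.1f} W/mm^2 over ~{area:.0f} mm^2 of cores")
```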

For reference, annotated die shots of Sandy Bridge and Kaby Lake quad-core dies respectively:
[Annotated die shot: Sandy Bridge (quad-core)]
[Annotated die shot: Kaby Lake (quad-core)]

While the CPU cores still take up a very significant part of the die, they're far smaller in relation to the iGPU, and when you take into account that the die itself is roughly half the size of the SB die, you're looking at far more concentrated heat output. In practical terms: say you have a Hyper 212 Evo. If the SB die layout meant 2-3 heatpipes in your cooler might sit directly above the CPU cores, for Skylake this might be reduced to only one heatpipe being in the right place. This is where the IHS comes in: its job is to spread this heat outwards as quickly and efficiently as possible so that your entire CPU cooler can do its work. If it doesn't, your CPU cores might reach thermal runaway while your heatsink is barely warm. After all, if only one of four heatpipes gets heated up, you're cutting your heatsink's cooling potential down to 1/4.