
Dual GPU Fermi on the way

I tend to think that the type of people who buy 200-300W TDP GPUs would be plenty willing to buy a 200W TDP CPU. But since they are such a small portion of the market, there's no real need to.




Modern CPUs are restricted to roughly <125W TDP by OEMs, not enthusiasts

It's not only the OEMs that cause it; other things like heat dissipation matter too.
A Core i7 die is 263mm^2 with a TDP of 130W, while the 190W of the HD 5870 (for the whole CARD, mind, not just the GPU) covers a 334mm^2 die. Some of that power goes to the PCB, RAM, etc., whereas on a CPU it's all CPU power.
With a 300W dual-GPU card, that 300W is spread over two dies of 334mm^2 each.
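The comparison above can be sketched as a quick back-of-envelope calculation. This is a rough illustration using only the figures quoted in the post (130W over a 263mm^2 CPU die; 300W over two 334mm^2 GPU dies), and it deliberately ignores the caveat that some of a card's rated power goes to the PCB and RAM rather than the dies:

```python
def power_density(watts: float, area_mm2: float) -> float:
    """Power density in W/mm^2, assuming all power is dissipated in the die."""
    return watts / area_mm2

# Figures from the post above (illustrative only).
cpu = power_density(130, 263)           # Core i7: 130W over one 263mm^2 die
dual_gpu = power_density(300, 2 * 334)  # 300W card split across two 334mm^2 dies

print(f"CPU:      {cpu:.2f} W/mm^2")
print(f"Dual GPU: {dual_gpu:.2f} W/mm^2")
```

On these numbers the CPU actually runs at a slightly higher power density per unit of die area than the dual-GPU card, which is part of why cooling a high-TDP CPU is harder than the raw wattage alone suggests.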

I would think that thermal considerations come into play at the high end.
A 200W TDP CPU, with a smaller die than a GPU, without the ability to run at such high temperatures, and with higher power demands, would be pretty difficult to cool without liquid.
 