03-02-2007, 03:43 PM   #1
phaxmohdem
Golden Member

Join Date: Aug 2004
Location: Missouri
Posts: 1,839
Why can GPUs get so much hotter than CPUs?

Just curious: I noticed that the max temp my 7800GTX can reach before "core slow down" kicks in is 125 degrees C. How come CPUs only do 60-70 before going heat crazy?
__________________
---( Phenom II 940 || 8GB PC6400 || ASUS M2N32-SLI Deluxe || 2x 250GB R-0 + 2x 500GB R-0 || eVGA 8800GTX & eVGA 8600GT (Quad-Monitors) || Seasonic SS-700HM 700W || Cooler Master ATC-111c-SX2 Case )---

http://www.avxmedia.com
03-02-2007, 03:44 PM   #2
LeiZaK
Diamond Member

Join Date: May 2005
Posts: 3,751
Why

03-02-2007, 04:35 PM   #3
aka1nas
Diamond Member

Join Date: Aug 2001
Posts: 4,335
Why

GPUs are usually built on an older process node than the CPUs of the same era.
__________________
Main Rig:
I7 920 D0 @ 4.22Ghz
Asus P6T6 Revolution X58
24GB GSkill DDR3-1333
Radeon 6870
Enermax Galaxy Evo 1250w
GSkill Falcon 128GB SSD x2 (RAID 0)
Intel X25-M 160GB SSD
Seagate 7200.11 1.5TB
Windows 7 Pro 64-bit
03-02-2007, 04:38 PM   #4
Matt2
Diamond Member

Join Date: Jul 2001
Posts: 4,762
Why can GPUs get so much hotter than CPUs?

Without pretending to know more than I do: each has its own set of thermal characteristics.
__________________
Rig

"The object of war is not to die for your country, but to make the other poor bastard die for his." -Gen. George S. Patton
03-02-2007, 05:29 PM   #5
phaxmohdem
Golden Member

Join Date: Aug 2004
Location: Missouri
Posts: 1,839
Why can GPUs get so much hotter than CPUs?

But GPUs and CPUs are all built on the same silicon processes, are they not? Even if GPUs are still built on last-gen 0.13/0.11-micron processes, the CPUs made on those same processes still would not get to 125C.

The only thing I can possibly think of is that typically when CPUs get hot they start getting less accurate in their calculations (e.g. SuperPi miscalculations). Do GPUs just not need to be 100% precise, thus allowing for higher thermal tolerances? Are the "pipelines" of a GPU (I know that is an incorrect term) simply shallow enough not to be conducive to heat-related errors?

I'm just talking out of my arse at this point, but I'm still very curious to get to the bottom of this.
__________________
---( Phenom II 940 || 8GB PC6400 || ASUS M2N32-SLI Deluxe || 2x 250GB R-0 + 2x 500GB R-0 || eVGA 8800GTX & eVGA 8600GT (Quad-Monitors) || Seasonic SS-700HM 700W || Cooler Master ATC-111c-SX2 Case )---

http://www.avxmedia.com
03-02-2007, 05:45 PM   #6
PingSpike
Lifer

Join Date: Feb 2004
Posts: 20,362
Why can GPUs get so much hotter than CPUs?

I have no idea. I have a feeling that errors in a video card aren't catastrophic to the system the way CPU errors are. In short... it is fvcking up, just not enough to reboot your machine. That's just sort of a guess, and I would say it only partially explains it.
__________________
Error reading poptart in Drive A: Delete kids y/n?
03-02-2007, 06:10 PM   #7
mooncancook
Platinum Member

Join Date: May 2003
Posts: 2,602
Why can GPUs get so much hotter than CPUs?

Back in the day we had powerful video cards that didn't need a cooler (3dfx Voodoo 2) while all CPUs required big coolers. I don't know how the performance-to-heat ratio of video cards has changed over the years compared to CPUs. Maybe video card performance has increased at a much faster rate than CPU performance, and therefore the heat has too?
__________________
HTPC: i5-4670 | 16GB | 840Pro 128GB + WD Black 2TB | GTX 760
03-02-2007, 06:23 PM   #8
xtknight
Elite Member

Join Date: Oct 2004
Posts: 12,974
Why can GPUs get so much hotter than CPUs?

Disclaimer: I'm talking out of my arse just as far as the rest of you.

I doubt the GPU operates in any reduced/forced state at high temp. Wouldn't you see lots of artifacts? Just one wrong bit and something could be completely the wrong color. And certainly they have to match the reference tests closely enough to pass as DirectX compatible. Those reference tests don't vary; they are executed on the CPU.

The chips are probably made out of materials with a higher tolerance (don't ask).

As far as I know, max temp for a Core 2 Duo CPU is specified at either 85C or 100C; only then does it shut off. So I don't think there is that much difference in sheer tolerance. A graphics card running at 85C is really the tip of the iceberg too, no? As to why it is hotter: probably just because there are more transistors on a GPU (and it's also made on a bigger manufacturing process).
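
To put rough numbers on the power-density side of this, here is a back-of-envelope comparison. The wattages and die sizes are round assumptions for a G80-class GPU and a Conroe-class Core 2 Duo, not official specs:

Code:
# Back-of-envelope power density (all figures are round assumptions,
# not official specs, for a G80-class GPU vs. a Conroe-class CPU).
def power_density(watts, die_mm2):
    """Watts dissipated per square millimetre of die."""
    return watts / die_mm2

gpu = power_density(watts=130.0, die_mm2=480.0)  # big die, big total power
cpu = power_density(watts=65.0, die_mm2=143.0)   # small die, less total power
print(f"GPU ~{gpu:.2f} W/mm^2, CPU ~{cpu:.2f} W/mm^2")

With those made-up figures the small CPU die actually comes out denser (~0.45 vs ~0.27 W/mm^2), so total wattage alone doesn't settle the temperature question.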
__________________
Main Rig (Ubuntu Linux 9.04)
The LCD Thread : LCD Resource
NEC 20WMGX2/LCD2690WUXi Owner
03-02-2007, 06:52 PM   #9
BladeVenom
Lifer

Join Date: Jun 2005
Posts: 13,542
Why can GPUs get so much hotter than CPUs?

Maybe because graphics card makers are more willing to let you burn out your GPU. Graphics cards seem to go bad at a much higher rate than processors.
03-03-2007, 07:19 AM   #10
nullpointerus
Golden Member

Join Date: Apr 2003
Posts: 1,326
Why can GPUs get so much hotter than CPUs?

Quote:
Originally posted by: phaxmohdem
Just curious: I noticed that the max temp my 7800GTX can reach before "core slow down" kicks in is 125 degrees C. How come CPUs only do 60-70 before going heat crazy?
Maybe it has something to do with clock speed:

7900 GTO: 650 MHz
Core 2 Duo E4300: 1800 MHz
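
If it is clock speed, the textbook dynamic-power relation P ~ C*V^2*f at least says frequency only enters linearly, so a big, slow chip and a small, fast chip can land in the same power ballpark. A toy comparison in arbitrary units; every number here is invented purely to show the shape of the relation:

Code:
# Toy CMOS dynamic-power comparison: P ~ C * V^2 * f.
# C is effective switched capacitance in arbitrary units; all
# values are invented for illustration only.
def dyn_power(cap, volts, ghz):
    return cap * volts**2 * ghz

gpu = dyn_power(cap=3.0, volts=1.3, ghz=0.65)  # many transistors, low clock
cpu = dyn_power(cap=1.0, volts=1.3, ghz=1.8)   # fewer transistors, high clock
print(gpu, cpu)  # ~3.3 vs ~3.0: similar despite the ~3x clock gap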
03-03-2007, 08:29 AM   #11
Maximilian
Lifer

Join Date: Feb 2004
Posts: 11,860
Why can GPUs get so much hotter than CPUs?

That's a really good question... why the hell can they run so much hotter?

I think I asked the same question about the Pentium M a while back (it can apparently go to 100°C) and why it could go so much hotter than the P4 without breaking. Can't remember the answer I got, but it's likely the same reason GPUs can go hotter.
__________________
>2014
>sig line limit of 90 characters
03-03-2007, 09:11 AM   #12
VIAN
Diamond Member

Join Date: Aug 2003
Posts: 6,575
Why can GPUs get so much hotter than CPUs?

nm
03-03-2007, 09:14 AM   #13
Schadenfroh
Elite Member

Join Date: Mar 2003
Location: Boston
Posts: 38,418
Why can GPUs get so much hotter than CPUs?

I always thought it was because they do so many things in parallel while CPUs were just single- or dual-threaded. But I have no facts or proof to back that up.
__________________
"how we live is so far removed from how we ought to live, that he who abandons what is done for what ought to be done, will rather bring about his own ruin than his preservation"
- Niccolò Machiavelli
03-03-2007, 09:29 AM   #14
JBT
Lifer

Join Date: Nov 2001
Location: AZ
Posts: 11,894
Why can GPUs get so much hotter than CPUs?

I'm certain a video card running at 125C is going to have some problems. People here don't really like running over 80C or so, and from what I've seen, much over 90C cards like the X1900XTs will start getting artifacts or locking up. Just because the software says 125C before it starts throttling doesn't mean it's right.

Also, it seems CPU manufacturers are more concerned with not producing so much heat, while GPU manufacturers seem like they couldn't care less: they design for higher tolerances so the parts work correctly with a little more heat in the equation.
__________________
Intel i7 4790K @ 4.4GHz | 256GB Samsung 830 SSD | GA-Z97MX-Gaming 5 | Asus R9 290 DCUII @ stock | 2 x 8GB GSkill RipJaw DDR3 @ 1833| Cool Master 750
Intel i5 2500K @ 4.4GHz | 256GB Sandisk Ultra Plus | AsRock Z68 Extreme 3 Gen 3| Sapphire 7850 | 2x Seagate 1.5 TB HDD's in RAID1 | CoolMaster 550 watt |2 x 4GB GSkill RipSaw DDR3 1600
HEAT
My journey from Beast to BEAST!
03-03-2007, 09:40 AM   #15
Modular
Diamond Member

Join Date: Jul 2005
Location: Intarwebz
Posts: 4,885
Why can GPUs get so much hotter than CPUs?

Quote:
Originally posted by: nullpointerus
Maybe it has something to do with clock speed:

7900 GTO: 650 MHz
Core 2 Duo E4300: 1800 MHz

I believe he's on to something here. The lower the clock speed, the higher the tolerance for error; as heat and clock speed increase, the potential for errors increases as well.

Perhaps the lower clock speed on the GPU allows it to operate at higher temps without errors.

Basically I'm positing that errors come down to some relationship between heat and clock speed: as either one decreases, the potential for errors decreases as well.

The basic physical design of the GPU allows for lower clock speeds with more transistors, hence higher heat. Processors, on the other hand, operate at a higher clock speed but are restricted to lower temperatures before they cross the threshold where errors occur.

I would like to point out that not only did I just come up with this garbage (potentially), it's also early, I don't have enough coffee in me, and I was up pretty late last night... Spring Break 2k7 w00t
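
For what it's worth, there is a real mechanism hiding in this guess: gate delays grow as silicon heats up, so a chip clocked well below its silicon's limit has timing slack to burn before logic starts mis-latching. A toy model of that idea; the delay-per-degree coefficient and the path delays are invented placeholders:

Code:
# Toy timing-slack model: gate delay grows with temperature, and the
# chip fails once the critical path no longer fits the clock period.
# The 0.2%-per-degree coefficient and the path delays are invented.
def max_safe_temp(period_ns, path_delay_ns_25c, pct_per_c=0.002):
    """Temperature at which the critical path just fills the period."""
    headroom = period_ns / path_delay_ns_25c - 1.0
    return 25.0 + headroom / pct_per_c

print(max_safe_temp(period_ns=1.54, path_delay_ns_25c=1.30))  # ~650 MHz part
print(max_safe_temp(period_ns=0.56, path_delay_ns_25c=0.53))  # ~1.8 GHz part

With those made-up numbers the low-clocked part stays correct to roughly 117C while the high-clocked one fails around 53C, which is exactly the shape of the effect being guessed at here.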



__________________
quote:
Originally posted by: waggy
i wanted to make fun of you on this. but being a noob sucks.
03-03-2007, 09:44 AM   #16
xtknight
Elite Member

Join Date: Oct 2004
Posts: 12,974
Why can GPUs get so much hotter than CPUs?

Hrm... Pentium Ms and Core Duos also operate at lower clock speeds. And Prescott operated at insane clock speeds.
__________________
Main Rig (Ubuntu Linux 9.04)
The LCD Thread : LCD Resource
NEC 20WMGX2/LCD2690WUXi Owner
03-03-2007, 10:23 AM   #17
happy medium
Lifer

Join Date: Jun 2003
Location: Philadelphia, PA
Posts: 11,190
Why can GPUs get so much hotter than CPUs?

Quote:
Originally posted by: nullpointerus
Quote:
Originally posted by: phaxmohdem
Just curious: I noticed that the max temp my 7800GTX can reach before "core slow down" kicks in is 125 degrees C. How come CPUs only do 60-70 before going heat crazy?
Maybe it has something to do with clock speed:

7900 GTO: 650 MHz
Core 2 Duo E4300: 1800 MHz
This is my guess!
__________________
Antec 1200 case (7fans) with a Asus P5Q board, q9550 @ 4ghz @ 1.39 with Tuniq cooler (prime stable @68c), 4 gigs ddr2 1000,2 1tb hard drives, 2 6870 2gb cards @ 980 core, 370$AR shipped, Corsair tx 750, @ 1080p. Shes finally maxed out!
03-03-2007, 10:24 AM   #18
Gstanfor
Banned

Join Date: Oct 1999
Posts: 3,307
Why can GPUs get so much hotter than CPUs?

For one thing, GPUs generally use many more transistors than CPUs do. More transistors mean more energy use and more leakage current (which manifests itself partly as heat).

Also, GPUs tend to be fabricated without much in the way of custom design libraries (though this is slowly changing with tech such as Arithmatica's CellMath libraries and other custom logic libraries like those employed in G80's shader ALUs). They tend to be made on relatively low-cost merchant processes, whereas CPUs are predominantly custom logic and use cutting-edge fabrication techniques.
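
As a rough sketch of how transistor count feeds the bill: total chip power is dynamic switching power plus leakage, and both terms scale with transistor count. The per-transistor constants below are invented; only the transistor counts (roughly 681M for G80, 291M for a Conroe Core 2 Duo) are approximately real:

Code:
# Toy total-power model: P_total = P_dynamic + P_leakage.
# Dynamic power ~ C*V^2*f, where C scales with transistor count;
# leakage scales with transistor count directly. The per-megatransistor
# constants are invented for illustration.
def total_power(transistors_m, volts, ghz,
                cap_per_mt=0.05, leak_w_per_mt=0.04):
    dynamic = (cap_per_mt * transistors_m) * volts**2 * ghz
    leakage = leak_w_per_mt * transistors_m
    return dynamic + leakage

print(total_power(681, 1.3, 0.65))  # many-transistor, low-clock GPU: ~65 W
print(total_power(291, 1.3, 1.8))   # fewer-transistor, high-clock CPU: ~56 W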
03-03-2007, 01:12 PM   #19
Gatt
Member

Join Date: Mar 2005
Posts: 81
Why can GPUs get so much hotter than CPUs?

Quote:
Originally posted by: Modular
Quote:
Originally posted by: nullpointerus
Maybe it has something to do with clock speed:

7900 GTO: 650 MHz
Core 2 Duo E4300: 1800 MHz

I believe he's on to something here. The lower the clock speed, the higher the tolerance for error; as heat and clock speed increase, the potential for errors increases as well.

Perhaps the lower clock speed on the GPU allows it to operate at higher temps without errors.

Basically I'm positing that errors come down to some relationship between heat and clock speed: as either one decreases, the potential for errors decreases as well.

The basic physical design of the GPU allows for lower clock speeds with more transistors, hence higher heat. Processors, on the other hand, operate at a higher clock speed but are restricted to lower temperatures before they cross the threshold where errors occur.

I would like to point out that not only did I just come up with this garbage (potentially), it's also early, I don't have enough coffee in me, and I was up pretty late last night... Spring Break 2k7 w00t


Actually, the higher the temperature, the more silicon begins to conduct electricity; silicon's semiconductor properties degrade with heat.

A GPU's heat tolerance is probably due more to its lack of onboard cache. Since it isn't storing large amounts of data, leakage isn't as big an issue: a CPU that starts leaking starts losing the data in its cache, while a GPU that starts leaking doesn't really lose anything that's stored for long periods of time. Cache data loss will cause CPU errors; all the GPU will lose is a frame. The lower clock speeds are likely a function of heat as well: the more leakage you have, the stronger the signal you need to overcome it, and the stronger your signal, the slower your signal (IIRC).
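
A common rule of thumb, and it is only that, says subthreshold leakage roughly doubles for every ~10C rise in junction temperature. A toy illustration; the 2 W baseline and the doubling interval are both assumptions:

Code:
# Rule-of-thumb leakage growth: assume it doubles every ~10 C.
# The 2 W baseline and the 10 C doubling interval are assumptions;
# real chips hit thermal runaway well before the bottom of this table.
def leakage_w(temp_c, base_w=2.0, base_temp_c=25.0, double_every_c=10.0):
    return base_w * 2 ** ((temp_c - base_temp_c) / double_every_c)

for t in (25, 65, 85, 125):
    print(f"{t} C -> {leakage_w(t):.0f} W")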
03-03-2007, 01:38 PM   #20
firewolfsm
Golden Member

Join Date: Oct 2005
Location: Detroit Area
Posts: 1,757
Why can GPUs get so much hotter than CPUs?

I think it's a combination of all the posts so far, which make valid points. The clock speed, the transistor count, the lack of cache: they come together to make a more tolerant chip.
__________________
What I cannot create, I do not understand
03-03-2007, 03:46 PM   #21
Munky
Diamond Member

Join Date: Feb 2005
Posts: 9,377
Why can GPUs get so much hotter than CPUs?

125C is really too high for a GPU to handle. My X1900XT locks up whenever the temps go into the mid-90s, and my older X800GTO locked up in the mid-80s. The card makers might as well set the limit temp to 200C; it wouldn't mean that 199C is a safe operating temp.
__________________
Core i7 @ 3.2-3.8 / AMD 6950 / 12GB DDR3 1600 / Asus Xonar D2 / Samsung 275t / Logitech z5500
03-05-2007, 12:13 AM   #22
Goi
Diamond Member

Join Date: Oct 1999
Posts: 6,538
Why can GPUs get so much hotter than CPUs?

1) GPU product cycles are much shorter than CPU product cycles, so the physical design engineers don't have much time to hand-tune transistor placement, layout, and characteristics. They usually let software do it for them, so physically the designs are less optimal, and this leads to higher power consumption.
2) Because of the shorter product cycle, GPU architects also have less time to build a power/thermal-aware architecture with techniques such as dynamic voltage scaling, clock gating, and the other power/thermal features modern CPUs have (a sketch of that kind of throttling loop follows below).
3) Modern GPUs have more transistors than CPUs.
4) GPUs still lag behind CPUs in process technology by 1-2 generations, since ATI/NVIDIA are fabless and rely on foundries like TSMC/UMC to produce the wafers for them. Intel and AMD, however, have their own fabs, and Intel is especially good at process technology. Now that ATI has been acquired by AMD this may change, at least for DAAMIT cards.
5) GPU coolers are anemic compared to CPU coolers.
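
On point 2: the "core slow down" the OP mentioned is essentially a thermal-throttling control loop. Here is a minimal sketch of the idea; the 125C trip point comes from the OP's 7800GTX, but the clock steps, the resume threshold, and the two sensor/clock helper functions are hypothetical, not NVIDIA's actual firmware:

Code:
# Minimal thermal-throttling sketch. The 125 C trip point matches the
# OP's 7800GTX slowdown threshold; everything else (step sizes, the
# read_die_temp_c / set_core_clock_mhz helpers) is hypothetical.
import time

TRIP_C, RESUME_C = 125.0, 110.0
FULL_MHZ, SLOW_MHZ = 430, 275

def throttle_loop(read_die_temp_c, set_core_clock_mhz):
    slowed = False
    while True:
        t = read_die_temp_c()
        if not slowed and t >= TRIP_C:    # too hot: drop the core clock
            set_core_clock_mhz(SLOW_MHZ)
            slowed = True
        elif slowed and t <= RESUME_C:    # cooled off: restore full speed
            set_core_clock_mhz(FULL_MHZ)
            slowed = False
        time.sleep(0.1)                   # poll the sensor ~10x per second

The hysteresis gap between the trip and resume thresholds is the standard trick to keep the clock from flapping right at the limit.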
__________________
In the beginning there was nothing, which exploded
My 5 Rigs
03-05-2007, 10:32 AM   #23
nZone
Senior Member

Join Date: Jan 2007
Posts: 277
Why can GPUs get so much hotter than CPUs?

Because the CPU has a massive heatsink and a 120mm fan while the GPU has a measly little heatsink with a small fan?

It's not apples to apples. If one were to mount a Thermalright Ultra 120 on an 8800GTX GPU, the temp would probably be less than 20C. It would also probably break that GTX in half with its massive weight.
03-05-2007, 11:07 AM   #24
Modular
Diamond Member

Join Date: Jul 2005
Location: Intarwebz
Posts: 4,885
Why can GPUs get so much hotter than CPUs?

Quote:
Originally posted by: nZone
Because the CPU has a massive heatsink and a 120mm fan while the GPU has a measly little heatsink with a small fan?

It's not apples to apples. If one were to mount a Thermalright Ultra 120 on an 8800GTX GPU, the temp would probably be less than 20C. It would also probably break that GTX in half with its massive weight.
That doesn't explain the physical limitations of the silicon chip itself, though. The question is why a processor will lock up at much lower temps than a GPU, not why GPUs inherently run hotter.

__________________
quote:
Originally posted by: waggy
i wanted to make fun of you on this. but being a noob sucks.
03-05-2007, 01:08 PM   #25
Nanobaud
Member

Join Date: Dec 2004
Posts: 144
Why can GPUs get so much hotter than CPUs?

I also can't speak directly to the GPU/CPU comparison, but drawing on analogies I can offer an additional point to consider. The workload in a GPU is probably spread fairly evenly across the chip, so if the chip is at 100°, the hottest on-chip temperature is probably also near 100°. On a CPU, the hardest-working parts of the chip occupy a substantially smaller fraction of the total chip area. Predicting heat flows from speculation is not much of a road to understanding, but I don't think it unreasonable that a CPU with a chip temperature 60° above ambient while dumping 100 W could conceivably have spot temperatures 10°, 20°, or more above that.
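
To make that concrete: temperature rise is roughly power times thermal resistance (ΔT = P × Rθ), and a small, hard-working block sees a much higher local resistance into the heatsink than the die as a whole. A toy calculation; both resistance values are invented:

Code:
# Toy hotspot estimate: delta_T = power_w * theta_c_per_w.
# Both thermal-resistance values below are invented for illustration.
def temp_rise_c(power_w, theta_c_per_w):
    return power_w * theta_c_per_w

whole_die = temp_rise_c(100.0, 0.60)  # evenly loaded die: ~60 C average rise
hot_block = temp_rise_c(25.0, 3.20)   # small busy block: ~80 C local rise
print(f"average rise {whole_die:.0f} C, hotspot rise {hot_block:.0f} C")

That is the 60-degrees-plus-"10, 20, or more" picture described above.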

Edit:
You could always turn off the thermal safety routines as these folks did:
3.8? GHz Duron

nBd