Oh no, another Skylake or 5820K thread!

abbcccus

Member
Feb 10, 2012
62
1
71
Fry's had the 5820K for $288 the other day, so I bought one. The question is whether or not I should keep it. There's really only one issue that I'd like anyone who is using one to address: how much heat does this thing throw off with a moderate overclock?

My office is small. Really small. 120 sq. feet or so. My first quad core was a Q6600 that I ran at 3.0 for a winter, but when the Texas spring / summer came around I had to take it offline, as firing it up would raise the temperature in here by nearly five degrees. The next quad I bought was an i7 860 and life has been fine ever since. The 5820K, if I decide to keep it, potentially takes me back to where I was. I know that Skylake will put out less heat, but is the 5820K all that bad? If I run it around 4.3 or 4.2, am I going to be right back where I was with the Q6600? TIA!
 

SPBHM

Diamond Member
Sep 12, 2012
5,066
418
126
you are not going to get the same MT performance out of Skylake too easily, I mean vs. the 4.2GHz 6 core Haswell, so the power comparison, I don't know... you can run the 5820K at stock, or undervolt it or something
 

Atreidin

Senior member
Mar 31, 2011
464
27
86
I don't have the numbers, but I would guess idle power has decreased significantly since core2 days.
 

R0H1T

Platinum Member
Jan 12, 2013
2,583
164
106
Fry's had the 5820K for $288 the other day, so I bought one. The question is whether or not I should keep it. There's really only one issue that I'd like anyone who is using one to address: how much heat does this thing throw off with a moderate overclock?

My office is small. Really small. 120 sq. feet or so. My first quad core was a Q6600 that I ran at 3.0 for a winter, but when the Texas spring / summer came around I had to take it offline, as firing it up would raise the temperature in here by nearly five degrees. The next quad I bought was an i7 860 and life has been fine ever since. The 5820K, if I decide to keep it, potentially takes me back to where I was. I know that Skylake will put out less heat, but is the 5820K all that bad? If I run it around 4.3 or 4.2, am I going to be right back where I was with the Q6600? TIA!
How do you know that? TDP has gone up at the same clock speeds (vs Haswell) & if anything I'm expecting it'd be even harder to cool, especially with less die space & a more cramped room for the transistors.
 

SPBHM

Diamond Member
Sep 12, 2012
5,066
418
126
I don't have the numbers, but I would guess idle power has decreased significantly since core2 days.

it depends, 45nm C2D power usage was not too bad, also later motherboards tended to be a lot better...

but compared to Q6600 OC and 965 chipset for example, yes idle power usage is greatly improved, maybe less than half...

my 45nm C2Q (Xeon 771 quad core) 2.8GHz + G41 board is actually not too bad, around 43W idle on Windows and 100% load adds ±50W to that...
 

abbcccus

Member
Feb 10, 2012
62
1
71
How do you know that? TDP has gone up at the same clock speeds (vs Haswell) & if anything I'm expecting it'd be even harder to cool, especially with less die space & a more cramped room for the transistors.

If I wait for Skylake I'm not so concerned about cooling it - I can always just back off the overclock as I'm not a max overclocker anyhow. Well, except for this stupid 4770K which I refused to give up on and still only ended up at 4.3. So I have no doubt that the TDPs we're talking about for Skylake won't be a problem. Now Haswell-E is another story, as its TDP is my cause for concern - especially with a bit of overclocking in play. Since I already have a 4770K at 4.3 I don't really want to run the 5820K much below that.
 

abbcccus

Member
Feb 10, 2012
62
1
71
it depends, 45nm C2D power usage was not too bad, also later motherboards tended to be a lot better...

but compared to Q6600 OC and 965 chipset for example, yes idle power usage is greatly improved, maybe less than half...

my 45nm C2Q (Xeon 771 quad core) 2.8GHz + G41 board is actually not too bad, around 43W idle on Windows and 100% load adds ±50W to that...

I ran the Q6600 in a P45 board for awhile, but when it was really kicking out the heat it was in an nForce 650i board.
 

R0H1T

Platinum Member
Jan 12, 2013
2,583
164
106
If I wait for Skylake I'm not so concerned about cooling it - I can always just back off the overclock as I'm not a max overclocker anyhow. Well, except for this stupid 4770K which I refused to give up on and still only ended up at 4.3. So I have no doubt that the TDPs we're talking about for Skylake won't be a problem. Now Haswell-E is another story, as its TDP is my cause for concern - especially with a bit of overclocking in play. Since I already have a 4770K at 4.3 I don't really want to run the 5820K much below that.
So you're aiming for ~4GHz or above; water cooling or a high end air cooler is the way to go then. I'm pretty sure a hex core Haswell-E, with solder TIM, won't be that much harder to cool than a Skylake at say ~4.3GHz & no solder btw :p
 

abbcccus

Member
Feb 10, 2012
62
1
71
So you're aiming for ~4GHz or above; water cooling or a high end air cooler is the way to go then. I'm pretty sure a hex core Haswell-E, with solder TIM, won't be that much harder to cool than a Skylake at say ~4.3GHz & no solder btw :p

Yeah, that's pretty much what I was thinking.
 

Smoblikat

Diamond Member
Nov 19, 2011
5,184
107
106
How do you know that? TDP has gone up at the same clock speeds (vs Haswell) & if anything I'm expecting it'd be even harder to cool, especially with less die space & a more cramped room for the transistors.

Just because it runs hotter and has a higher TDP, doesn't mean it puts out more heat..........
 

R0H1T

Platinum Member
Jan 12, 2013
2,583
164
106
Just because it runs hotter and has a higher TDP, doesn't mean it puts out more heat..........
Well it does, sort of. Broadwell with its 65W TDP & clocked at ~4.3GHz will (almost) certainly be cooler than Skylake with its 95W TDP & clocked the same. This is definitely true with shrinkage & how a more densely packed chip (Skylake at 14nm) will compare with a less densely packed one (Haswell at 22nm). I haven't mentioned leakage, because you can get a golden chip that'll run cooler & consume less power than say Broadwell, even when overclocked, but that's more an anomaly than the norm.
 
Aug 11, 2008
10,451
642
126
OP, why do you need 6 cores? If you are concerned about heat and don't run any heavily multithreaded software, sell the 5820K for a profit on eBay and go with a 65 watt quad Skylake. Or even an i3 if you don't do any gaming or other CPU intensive tasks on that particular machine. A couple of posters on this and the VC and G forum are starting to make vociferous arguments for hex cores or above for gaming. To me it is still not that clear which is better for gaming, a slightly lower clocked hex or a high clocked quad. But for sure, if heat output is a concern, I think a quad core is still a very viable choice.

And there seems to be some confusion between temperature of the chip and heat output. Because of solder vs TIM, a hex core *might* be easier to keep cool, but will still put out more heat to the room than a lower TDP quad, even though the temperature of the quad may be higher.
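The temperature-vs-heat-output distinction can be put in rough numbers. A minimal sketch (the wattages and temperatures below are made-up illustrations, not measurements of any real chip): every watt a PC draws at the wall eventually ends up as heat in the room, so room heating tracks wall draw, while die temperature only tells you how well the cooler moves heat off the chip.

```python
# Chip temperature reflects cooling quality; heat dumped into the room
# reflects total electrical draw. The two can easily rank opposite ways.

def heat_btu_per_hr(wall_watts):
    """Convert wall-socket draw to heat output (1 W = 3.412 BTU/hr)."""
    return wall_watts * 3.412

# Hypothetical systems: a soldered hex core that stays cool under a big
# cooler but draws more, and a TIM'd quad that runs hotter but draws less.
systems = {
    "hex core": {"die_temp_c": 60, "wall_watts": 250},
    "quad":     {"die_temp_c": 75, "wall_watts": 150},
}

for name, s in systems.items():
    print(f"{name}: {s['die_temp_c']}C die, "
          f"{heat_btu_per_hr(s['wall_watts']):.0f} BTU/hr into the room")
```

Here the quad shows the higher die temperature but the hex core heats the room more, which is exactly the confusion the post describes.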
 

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
I would keep it. Just undervolt the CPU and back off the clock speed 100MHz or so. The extra cores will make up for it. You'll be surprised how low the voltage will go. This can save you in the area of 25-35 watts easily. Just grab a Kill A Watt if you don't already have one for measurements. You can do this with RAM and video cards as well to save even more. A lot of computer components come overvolted by default for higher yields.
 

abbcccus

Member
Feb 10, 2012
62
1
71
OP, why do you need 6 cores? If you are concerned about heat and don't run any heavily multithreaded software, sell the 5820K for a profit on eBay and go with a 65 watt quad Skylake. Or even an i3 if you don't do any gaming or other CPU intensive tasks on that particular machine. A couple of posters on this and the VC and G forum are starting to make vociferous arguments for hex cores or above for gaming. To me it is still not that clear which is better for gaming, a slightly lower clocked hex or a high clocked quad. But for sure, if heat output is a concern, I think a quad core is still a very viable choice.

And there seems to be some confusion between temperature of the chip and heat output. Because of solder vs TIM, a hex core *might* be easier to keep cool, but will still put out more heat to the room than a lower TDP quad, even though the temperature of the quad may be higher.

I don't need it, I want it! Messing with computers is one of my hobbies, but I neither game nor rely on them for my livelihood in any way. I do some audio tinkering and a bit of transcoding, and both of those would benefit from more cores up to a point. The fact is this has nothing to do with need and everything to do with the hobby. In my office, currently surrounding me, are a 3770K Hackintosh, a 4770K machine on which I do most of my work, a 3570K machine that I use for 24/7 concert sharing via bittorrent and as a DVR, and an FX-8320 that I'm using to finally rip my music collection and as a machine to fool around with Linux. I recently gave my 2500K to my wife and the daughter is long overdue for an upgrade - the 3570K will be hers after I build another machine. The 4770K based machine is now over two years old and I want a new toy! I could wait for Skylake, but this deal came up on the 5820K and I've seen several decent X99 boards in the $120 range . . . and there you go. Obviously I'm aware that I'm not going to get any huge performance gains whichever way I go, but the itch is bad at this point. It just seemed to me that getting a 5820K machine for about the same as, or maybe even a little less than, Skylake would be well worth it. My only concern is with how much more heat the 5820K is likely to dump into the room than Skylake likely will. Well, that and if somehow or another Skylake offers a 25% increase in IPC over Broadwell plus 5 GHz+ overclocking . . .
 

maddogmcgee

Senior member
Apr 20, 2015
410
421
136
I don't need it, I want it! Messing with computers is one of my hobbies, but I neither game nor rely on them for my livelihood in any way. I do some audio tinkering and a bit of transcoding, and both of those would benefit from more cores up to a point. The fact is this has nothing to do with need and everything to do with the hobby. In my office, currently surrounding me, are a 3770K Hackintosh, a 4770K machine on which I do most of my work, a 3570K machine that I use for 24/7 concert sharing via bittorrent and as a DVR, and an FX-8320 that I'm using to finally rip my music collection and as a machine to fool around with Linux. I recently gave my 2500K to my wife and the daughter is long overdue for an upgrade - the 3570K will be hers after I build another machine. The 4770K based machine is now over two years old and I want a new toy! I could wait for Skylake, but this deal came up on the 5820K and I've seen several decent X99 boards in the $120 range . . . and there you go. Obviously I'm aware that I'm not going to get any huge performance gains whichever way I go, but the itch is bad at this point. It just seemed to me that getting a 5820K machine for about the same as, or maybe even a little less than, Skylake would be well worth it. My only concern is with how much more heat the 5820K is likely to dump into the room than Skylake likely will. Well, that and if somehow or another Skylake offers a 25% increase in IPC over Broadwell plus 5 GHz+ overclocking . . .

My little heater next to me is 2500 watts and my air con is about 11000 watts. Is an extra 100-150 watts from an overclocked CPU, a device designed to do computer things, not to heat efficiently, going to make any real difference?
 

RaistlinZ

Diamond Member
Oct 15, 2001
7,470
9
91
Fry's had the 5820K for $288 the other day, so I bought one. The question is whether or not I should keep it. There's really only one issue that I'd like anyone who is using one to address: how much heat does this thing throw off with a moderate overclock?

My office is small. Really small. 120 sq. feet or so. My first quad core was a Q6600 that I ran at 3.0 for a winter, but when the Texas spring / summer came around I had to take it offline, as firing it up would raise the temperature in here by nearly five degrees. The next quad I bought was an i7 860 and life has been fine ever since. The 5820K, if I decide to keep it, potentially takes me back to where I was. I know that Skylake will put out less heat, but is the 5820K all that bad? If I run it around 4.3 or 4.2, am I going to be right back where I was with the Q6600? TIA!

What exactly will you be doing with your computer? For general desktop use, there's very little heat at all. Mine runs at 38C during general use, even overclocked. During gaming, temps don't even reach 60C under a high end cooler.

I don't think you have to worry about heat so much unless you're running your CPU at 100% all the time.
 

abbcccus

Member
Feb 10, 2012
62
1
71
What exactly will you be doing with your computer? For general desktop use, there's very little heat at all. Mine runs at 38C during general use, even overclocked. During gaming, temps don't even reach 60C under a high end cooler.

I don't think you have to worry about heat so much unless you're running your CPU at 100% all the time.

Goofing off most of the time, but, like I said above, some audio manipulation and some transcoding, and I run a VM or two for kicks. It'll mostly be idle, so it may not be that big of a deal. My current 4770K doesn't really create any issues for me and I'd like to keep it that way. As noted above, the processor doesn't run very hot itself (my 4770K runs in the low 30s), but that doesn't mean it isn't dumping a ton of energy into the room. Hell, the Q6600 ran at about the same temperature, but it dumped a ton of extra heat into the room as it was using much more electricity once overclocked.
 

abbcccus

Member
Feb 10, 2012
62
1
71
My little heater next to me is 2500 watts and my air con is about 11000 watts. Is an extra 100-150 watts from an overclocked CPU, a device designed to do computer things, not to heat efficiently, going to make any real difference?

Yes. In the first place, I live in Texas. In the second place, the room has one vent for climate control. As I noted above, it's a small room. It can get out of control in here in a hurry. I kid you not, in the winter time that Q6600 kept this room nice and toasty and when spring and summer came around it would go into the mid 80s in here. That's with the rest of the house around 76. It made this room completely unbearable. I have to leave my office door closed because my wife's cats will come in here and they like to chew things, bump into things, walk all over everything, and leave their fur everywhere. I suppose that it's fair to say that the extra heat isn't immediately a problem, but it builds up rather quickly. Ideally I'd like to use the 5820K to replace both my 4770K and 3570K by running Win 7 in a VM for DVR purposes, but I think that that'll have to wait for another generation or two and I'll just use the 5820K when I need it.
 

maddogmcgee

Senior member
Apr 20, 2015
410
421
136
Yes. In the first place, I live in Texas. In the second place, the room has one vent for climate control. As I noted above, it's a small room. It can get out of control in here in a hurry. I kid you not, in the winter time that Q6600 kept this room nice and toasty and when spring and summer came around it would go into the mid 80s in here. That's with the rest of the house around 76. It made this room completely unbearable. I have to leave my office door closed because my wife's cats will come in here and they like to chew things, bump into things, walk all over everything, and leave their fur everywhere. I suppose that it's fair to say that the extra heat isn't immediately a problem, but it builds up rather quickly. Ideally I'd like to use the 5820K to replace both my 4770K and 3570K by running Win 7 in a VM for DVR purposes, but I think that that'll have to wait for another generation or two and I'll just use the 5820K when I need it.

Fair enough, I used to live in the Australian desert where it would be a sunny, cloudless 35-50 degrees Celsius (122 Fahrenheit) for months at a time, so I feel ya. Difference is I could open my study door, so I didn't notice the PC as much. Avoiding a voltage increase with your overclock would help a lot. Also, you could grab a more efficient PSU at the same time. If your current PSU is around 80 percent efficiency (much of the rest going to warming your room), getting a Seasonic in the low 90s would save you a bunch of heat.
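The PSU point can be sketched with rough arithmetic (the 300W DC load below is an assumption for illustration, not a measurement): everything drawn at the wall becomes heat in the room eventually, so a more efficient supply saves exactly the difference in wall draw for the same load.

```python
def wall_draw(dc_load_watts, efficiency):
    """Watts pulled from the wall to deliver a given DC load."""
    return dc_load_watts / efficiency

# Hypothetical 300W DC load (CPU + GPU + the rest of the system).
load = 300.0
old = wall_draw(load, 0.85)  # roughly Bronze-class efficiency
new = wall_draw(load, 0.92)  # roughly Platinum/Titanium-class

# All wall power ends up as room heat, so the heat saved is just the
# difference in wall draw between the two supplies.
print(f"old PSU: {old:.0f} W at the wall")
print(f"new PSU: {new:.0f} W at the wall")
print(f"heat saved: {old - new:.0f} W")
```

With these assumed numbers the better supply shaves a bit under 30W of room heat, which lines up with the "save you a bunch of heat" claim without being dramatic.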
 

Soulkeeper

Diamond Member
Nov 23, 2001
6,735
155
106
I have the same problem with heat in my house in So. California.
The monitor puts off as much heat as the computer itself, I can feel it radiating out.
80F+ is guaranteed for me during daytime in my house unless it's winter, the monitor surface is typically around 120F.
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Yes. In the first place, I live in Texas. In the second place, the room has one vent for climate control. As I noted above, it's a small room. It can get out of control in here in a hurry.
Buy yourself a ~$120, 5-6,000 BTU window unit air conditioner, then. Sure, it will add $10-15 a month to your electric bill, but that's less than all of those computers are adding, and it will very easily cool such a small room.
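The window-unit suggestion is easy to sanity-check: convert the computers' wall draw to BTU/hr and compare it against the unit's rating. A quick sketch, with the wattages assumed for illustration rather than taken from the thread:

```python
WATTS_TO_BTU_HR = 3.412  # 1 watt of continuous draw = 3.412 BTU/hr of heat

# Assumed combined wall draw of the always-on machines plus a monitor.
pc_watts = 450
monitor_watts = 60
heat_load_btu_hr = (pc_watts + monitor_watts) * WATTS_TO_BTU_HR

ac_rating_btu_hr = 5000  # small window unit at the low end of the range

print(f"computer heat load: {heat_load_btu_hr:.0f} BTU/hr")
print(f"AC headroom: {ac_rating_btu_hr - heat_load_btu_hr:.0f} BTU/hr "
      f"left for the room itself")
```

Even with a generous 500W+ of gear running flat out, the heat load is well under half of a 5,000 BTU unit's capacity, which supports the "very easily cool such a small room" claim.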
 

abbcccus

Member
Feb 10, 2012
62
1
71
Buy yourself a ~$120, 5-6,000 BTU window unit air conditioner, then. Sure, it will add $10-15 a month to your electric bill, but that's less than all of those computers are adding, and it will very easily cool such a small room.

Haha!!! I've thought about doing just that, but my wife would KILL me. She would, and rightfully so, point out that the simple solution is to NOT BUILD A MACHINE THAT REQUIRES SUCH MEASURES!! Since I don't really need the power I was just trying to get a sense from people who already have such machines if they're blast furnaces. I was running my FX-8320 machine on a modest overclock for a while but I had to put it back to stock (I should probably undervolt it) because it put out nearly as much heat as the Q6600 once did. All of the computers in my office (I have others not mentioned above) are either off or sleeping for most of the day and night except for the torrenting / DVR machine.
 

abbcccus

Member
Feb 10, 2012
62
1
71
I have the same problem with heat in my house in So. California.
The monitor puts off as much heat as the computer itself, I can feel it radiating out.
80F+ is guaranteed for me during daytime in my house unless it's winter, the monitor surface is typically around 120F.

I recently retired an old monitor because of that! You could feel the waves of heat coming off the back and front of that thing - it was ridiculous.
 

abbcccus

Member
Feb 10, 2012
62
1
71
Fair enough, I used to live in the Australian desert where it would be a sunny, cloudless 35-50 degrees Celsius (122 Fahrenheit) for months at a time, so I feel ya. Difference is I could open my study door, so I didn't notice the PC as much. Avoiding a voltage increase with your overclock would help a lot. Also, you could grab a more efficient PSU at the same time. If your current PSU is around 80 percent efficiency (much of the rest going to warming your room), getting a Seasonic in the low 90s would save you a bunch of heat.

My wife has stories of that kind of heat from her days in Arizona as an undergrad. I hate the heat, but I'd take higher temps and no humidity over what we have here. I figure that probably a slight voltage increase would be ok, but you're right: start pumping it up and efficiency suffers. I have a Seasonic PSU that is rated Bronze but all the test sites showed it around 85 percent efficiency, so that's not terrible at least. Since I don't game, I can get away with a relatively weak video card to help with the cause.