Discussion Intel current and future Lakes & Rapids thread


ehume

Golden Member
Nov 6, 2009
1,511
73
91
There is one thing an 8700k can do: at 5GHz it goes through a lot of watts. When I use it for comparison heatsink testing, a 5GHz 8700k running Linpack with AVX2 gets 150+ watts and allows me to compare top heatsinks. I can compare the bleeding edge.

If there is an 8-core Coffee Lake and it can run at 5GHz, then only an AIO or water rig will keep it cool.

Then again, my use is a really really niche case.
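For a rough sense of why 5 GHz gets so hot, here's a minimal back-of-the-envelope sketch, assuming dynamic power dominates (P ≈ C·V²·f); the baseline clock and both voltages are illustrative guesses on my part, not measured values:

```python
# Rough dynamic-power scaling: P ~ C * V^2 * f, so relative to a
# baseline, P_new = P_base * (f_new / f_base) * (V_new / V_base)^2.
# The 95 W / 4.3 GHz / 1.20 V baseline and the 1.35 V overclock
# voltage are assumed figures for illustration only.

def scaled_power(p_base, f_base, v_base, f_new, v_new):
    """Scale dynamic power linearly with frequency, quadratically with voltage."""
    return p_base * (f_new / f_base) * (v_new / v_base) ** 2

print(round(scaled_power(95, 4.3, 1.20, 5.0, 1.35)))  # ~140 W, i.e. 150+ W territory
```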
 

beginner99

Diamond Member
Jun 2, 2009
5,210
1,580
136
As someone who does pretty much everything still running on an OC'd 2500k, I would ask most people: what exactly are you doing that you need the top of the line every other year? Maybe it's age, but it just doesn't make sense to upgrade all the time. You ask about 'improvements', but short of high-end video editing, CAD and things of that nature, what exactly does a home user do that needs any of it at this point? Gaming isn't raising the bar much at all and hasn't in many years. (Serious question, as I can't justify upgrading when everything I want to do works fine - and I do quite a bit.)

The last one is actually not true, depending on what and how you play. Put another way: multiplayer, especially 64-player Battlefield maps, is notoriously CPU bound. Here more than 4 cores really do help, or a faster quad. But yeah, it's a small niche. You also need to play these games at 1080p/144 Hz, because if you jack up the resolution you become GPU bound very quickly.
Otherwise it's mostly about being an enthusiast: building PCs, testing new CPUs and so on is your hobby, even if you know the new one isn't really worth it.

That said, my CPU is older than yours, so I can also totally see your viewpoint.
 

dullard

Elite Member
May 21, 2001
25,054
3,408
126
There is one thing an 8700k can do: at 5GHz it goes through a lot of watts. When I use it for comparison heatsink testing, a 5GHz 8700k running Linpack with AVX2 gets 150+ watts and allows me to compare top heatsinks. I can compare the bleeding edge.

If there is an 8-core Coffee Lake and it can run at 5GHz, then only an AIO or water rig will keep it cool.

Then again, my use is a really really niche case.
Why do you even need a CPU to compare top heatsinks? Wouldn't a power resistor on a block do exactly the same thing? Then you can crank up the watts to anything you can imagine.
 
  • Like
Reactions: Dave2150

TheGiant

Senior member
Jun 12, 2017
748
353
106
How does the new IGP compare to dGPUs for mining?

Do you think the new intel iGPU can play Witcher 3 1080p low detail?
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
How does the new IGP compare to dGPUs for mining?

Do you think the new intel iGPU can play Witcher 3 1080p low detail?

Intel iGPUs don't really perform well in mining. Even the Vega iGPU performs poorly in mining, because it lacks high-bandwidth memory. The desktop Vega iGPU can play Witcher 3 at 1080p low detail; the Gen 11 GPU in Icelake will need a titanic leap over current graphics to get to that level. It looks like it'll also need driver optimizations.

That is half correct: transistor-level performance no longer translates into frequency at the highest GHz levels.

Transistor performance correlated quite well with frequency at the 2-3 GHz level. At 4+ GHz, all CPUs run into serious thermal issues that prevent them from going much beyond that.
 

Dave2150

Senior member
Jan 20, 2015
639
178
116
As someone who does pretty much everything still running on an OC'd 2500k, I would ask most people: what exactly are you doing that you need the top of the line every other year? Maybe it's age, but it just doesn't make sense to upgrade all the time. You ask about 'improvements', but short of high-end video editing, CAD and things of that nature, what exactly does a home user do that needs any of it at this point? Gaming isn't raising the bar much at all and hasn't in many years. (Serious question, as I can't justify upgrading when everything I want to do works fine - and I do quite a bit.)

It's definitely an age thing. "Works fine" = works, as in functions, though not as well as it could. You're not getting the best experience, but you don't care, so it's no problem.

Go back another generation and you have those > 90 years old who still watch TV in black and white, as it "works fine". Or those watching TV in 480p as they think it looks as good as HD. Colour is an upgrade, don't you think? It's all relative, I guess.

If you started playing the latest games on a high refresh rate monitor, or rendering video and cared about the time it took to complete, you'd want something faster than a 2500k.
 

ImpulsE69

Lifer
Jan 8, 2010
14,946
1,077
126
It's definitely an age thing. "Works fine" = works, as in functions, though not as well as it could. You're not getting the best experience, but you don't care, so it's no problem.

Go back another generation and you have those > 90 years old who still watch TV in black and white, as it "works fine". Or those watching TV in 480p as they think it looks as good as HD. Colour is an upgrade, don't you think? It's all relative, I guess.

If you started playing the latest games on a high refresh rate monitor, or rendering video and cared about the time it took to complete, you'd want something faster than a 2500k.

Makes sense. I've never been one to chase the latest. I just buy what is best when I want it, and use it until it no longer functions as I expect. What is different now vs. then, though, is that back then you got exponential increases, whereas now the gains are really minimal and it's a money sink. I'm not saying keeping the same hardware for 10 years is necessary, but neither is upgrading every other year. You aren't gaining much at all.

One thing you mention though is gaming... which I do. I'm not competitive, but I do have a 144hz monitor and run everything at 1440p, so it's still a matter of good enough, but your return on investment sinks if you upgrade so often. Difference in mindset, I guess.
 

Dave2150

Senior member
Jan 20, 2015
639
178
116
Makes sense. I've never been one to chase the latest. I just buy what is best when I want it, and use it until it no longer functions as I expect. What is different now vs. then, though, is that back then you got exponential increases, whereas now the gains are really minimal and it's a money sink. I'm not saying keeping the same hardware for 10 years is necessary, but neither is upgrading every other year. You aren't gaining much at all.

One thing you mention though is gaming... which I do. I'm not competitive, but I do have a 144hz monitor and run everything at 1440p, so it's still a matter of good enough, but your return on investment sinks if you upgrade so often. Difference in mindset, I guess.

I bought an i7 920 on release day and upgraded after many years to a 6700k - I like to get the most out of my hardware before upgrading, too. I did see a large jump in performance, which was totally worth it for me.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Go back another generation and you have those > 90 years old who still watch TV in black and white, as it "works fine". Or those watching TV in 480p as they think it looks as good as HD. Colour is an upgrade, don't you think? It's all relative, I guess.

It's good to be able to be satisfied with what you have. It reduces stress levels. All the gadgets hit the wallet hard.

Although personally I need at least color. :)
 

ehume

Golden Member
Nov 6, 2009
1,511
73
91
Why do you even need a CPU to compare top heatsinks? Wouldn't a power resistor on a block do exactly the same thing? Then you can crank up the watts to anything you can imagine.
You raise an interesting point. A lot of heatsink makers advertise their wares as cooling 200 W, 300 W. Maybe. But if I can get only 125 watts (4.8 GHz, Vcore = 1.32V) before going over 100C, then something is . . . different. I hear from a heatsink manufacturer that they test their products on simulators, essentially what you proposed. I have a suspicion those produce different results from using CPUs.

In any case, Intel sells heaters that fit inside sockets, like chips. The product costs $1500. It's cheaper to run chips. My 4790k lasted until the end of 2016, before a heatsink from [expletive deleted] ruined it for further comparisons. But I put a Scythe Fuma on it and it happily runs as my daughter's gaming machine. I do not think a $1500 platen could be repurposed like that.
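As a quick sanity check on those numbers, the implied core-to-ambient thermal resistance works out like this (the 25C ambient is my assumption, for illustration only):

```python
# Effective thermal resistance, die to ambient: theta = (T_core - T_ambient) / P.
# The ambient temperature is an assumed figure.

t_core = 100.0    # deg C, the temperature ceiling
t_ambient = 25.0  # deg C, assumed room temperature
power = 125.0     # W, the load reached at the ceiling (4.8 GHz, 1.32 V)

theta = (t_core - t_ambient) / power
print(f"{theta:.2f} C/W")  # 0.60 C/W

# A "300 W" rating only holds if the heat source lets the sink stay
# below its rated theta; a small, hot CPU die often doesn't.
```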
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,541
14,495
136
Why do you even need a CPU to compare top heatsinks? Wouldn't a power resistor on a block do exactly the same thing? Then you can crank up the watts to anything you can imagine.
I don't think that CPUs act like resistors do, so testing would be very limited. ehume's points are also well taken, but I think the real thing (a CPU) is required to get accurate, real-world results.
 
  • Like
Reactions: Drazick and ehume

TheGiant

Senior member
Jun 12, 2017
748
353
106
Well, the wattage is not as important as the heat flux density - wattage per area.

It is much harder to cool 200W from 80mm² than from 300mm², and even then the area doesn't generate heat uniformly; the AVX2 units generate more heat than other areas, AFAIK.
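A quick illustration of the difference, using those two areas:

```python
# Heat flux density: watts per square millimetre of die area.
def flux_density(watts, area_mm2):
    return watts / area_mm2

print(flux_density(200, 80))   # 2.5 W/mm^2   -- small die, very hard to cool
print(flux_density(200, 300))  # ~0.67 W/mm^2 -- same watts, much easier
```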
 
  • Like
Reactions: pcp7 and lightmanek

dullard

Elite Member
May 21, 2001
25,054
3,408
126
Hot spots come to mind. They don't exist in uniform heaters.
You certainly don't get a uniform heater with what I mentioned (one power resistor would be very non-uniform). You of course would likely have many resistors, placed as needed.
 

maddie

Diamond Member
Jul 18, 2010
4,738
4,667
136
You certainly don't get a uniform heater with what I mentioned (one power resistor would be very non-uniform). You of course would likely have many resistors, placed as needed.
What?

It's a heating element. Electric iron comes to mind.
 

dullard

Elite Member
May 21, 2001
25,054
3,408
126
What?

It's a heating element. Electric iron comes to mind.
Irons have hot and cold spots. Iron is a pretty terrible conductor of heat. Cast iron pans, for example, are much worse than copper pans for temperature uniformity.

I was thinking something along the lines of soldering 6 of these:
https://www.mouser.com/datasheet/2/21/C100N50Z4A_Datasheet_RevC-4117.pdf
onto a PCB covered with metal-filled vias. Then attach a poorly conducting metal plate (to create hot zones) on the other side. A plate such as this one: https://www.mcmaster.com/#8786k6/=1dcq1f6 You'd have to cut it to about 19 mm x 13 mm and bevel the edges down to a 9.19 mm x 16.28 mm shape to mimic a 6-core Coffee Lake CPU.

Then connect them to variable power supplies up to 50 V (and up to 50 W each). You'd have 6 temperature zones on a chip the exact size of any CPU you wanted, with the ability to control the power from 0 W to 300 W.

Not exactly a CPU, but way more flexibility, and good enough to test heat sinks. All for less than $100 in parts, assuming you already have power supplies.

You could do far better, but that is what I came up with in a few minutes of work.
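If those parts are 50 ohm (my reading of the C100N50Z4A part number - verify against the datasheet before building anything), the supply voltage for a target per-resistor power is just V = sqrt(P·R). A minimal sketch:

```python
# Power in a resistor: P = V^2 / R, so V = sqrt(P * R).
# R = 50 ohms is assumed from the part number; check the datasheet.
import math

R = 50.0  # ohms (assumed)

def drive_voltage(p_watts):
    """Supply voltage needed for a target dissipation in one resistor."""
    return math.sqrt(p_watts * R)

for p in (10, 25, 50):
    print(p, "W ->", round(drive_voltage(p), 1), "V")
# 10 W -> 22.4 V, 25 W -> 35.4 V, 50 W -> 50.0 V (the 50 V / 50 W ceiling)
```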
 

ehume

Golden Member
Nov 6, 2009
1,511
73
91
Irons have hot and cold spots. Iron is a pretty terrible conductor of heat. Cast iron pans, for example, are much worse than copper pans for temperature uniformity.

I was thinking something along the lines of soldering 6 of these:
https://www.mouser.com/datasheet/2/21/C100N50Z4A_Datasheet_RevC-4117.pdf
onto a PCB covered with metal-filled vias. Then attach a poorly conducting metal plate (to create hot zones) on the other side. A plate such as this one: https://www.mcmaster.com/#8786k6/=1dcq1f6 You'd have to cut it to about 19 mm x 13 mm and bevel the edges down to a 9.19 mm x 16.28 mm shape to mimic a 6-core Coffee Lake CPU.

Then connect them to variable power supplies up to 50 V (and up to 50 W each). You'd have 6 temperature zones on a chip the exact size of any CPU you wanted, with the ability to control the power from 0 W to 300 W.

Not exactly a CPU, but way more flexibility, and good enough to test heat sinks. All for less than $100 in parts, assuming you already have power supplies.

You could do far better, but that is what I came up with in a few minutes of work.
The bottom line is that the reports coming from OEMs about their heatsinks do not correspond to real-world results on CPUs. So I continue to test on CPU chips.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
AnandTech's simulator consists of an aluminum plate milled to the exact shape of an Athlon CPU. On the lower side of the simulator, a foil heater is installed (Minco heater model HK5578R4.6, installed using ArcticSilver epoxy), which provides the heat load. In the center of the simulated "die", a small channel is milled. There, a thermocouple (T-type) is embedded, also using ArcticSilver thermal epoxy. After installation, the simulated die was lapped.

https://www.anandtech.com/show/729/2
 

maddie

Diamond Member
Jul 18, 2010
4,738
4,667
136
One thing to ALWAYS remember.

Empirical data beats theory 100% of the time. If the real world says something different from your predictions, then your theory/model is wrong.

We can all speculate ad infinitum on designing a simulator that's close to reality, but nothing is as good as the real thing. That's why we still get surprises when building stuff, even after millions of hours of simulation.
 
  • Like
Reactions: ehume