
Intel Skylake / Kaby Lake


tamz_msc

Diamond Member
Jan 5, 2017
Posted this in another forum, may as well post it here too:

People tend to upgrade GPUs faster than they upgrade CPUs, and GPUs become obsolete much faster.
I'd rather spend a bit more on a better gaming CPU which I won't upgrade again for many years than spend more on a GPU that I'll upgrade in just a few years.
This is some seriously flawed logic, which is true only if you're going to be playing the same game year after year. Even a game like CS:GO or Dota 2 isn't the same game as it was a few years ago. How games utilize resources changes over time. Just like it's silly to say 'wait for games to become multithreaded, then Ryzen will kick a$$', it's also quite naive to assume that a CPU that is slightly slower today will end up bottlenecking a 3080 Ti tomorrow so badly that it'd make someone wish he'd got the faster CPU back then.
This was and should be the norm, but the people on the slower gaming CPU found a way to feel good about their setups with talk of "now" and "real world." And yet, they're the first to talk about platform longevity. It's always performance deferred with these guys: "too expensive," "only two fps," "no noticeable difference," etc.
2600Ks and 4790Ks are still going strong you know? The 2600K maybe not so much but 4790K holdouts aren't complaining about their gaming performance when compared against the 'Lakes.

You're basically arguing that those with a slightly slower CPU should feel bad for themselves for their choice of when and what CPU to get.
 
Last edited:

AtenRa

Lifer
Feb 2, 2009
Older result: ST score = 191, MT score = 1420
Latest result: ST score = 196, MT score = 1230


Why do you believe the latest result is not representative, yet believe the old one is, with such a low ST score?

From my view, the old result could have been at a fixed 4.3GHz, hence the lower-than-7700K ST score, while the latest result could actually have been at default clocks, hence the lower MT score.
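One quick way to weigh those two hypotheses is to see what clock each ST score would imply under naive linear frequency scaling. A minimal sketch; the 7700K reference score and clock below are assumptions chosen for illustration, not verified figures:

```python
# Sanity-check leaked CPU-Z ST scores via naive linear frequency scaling.
# REF_* values are illustrative assumptions, not verified 7700K numbers.
REF_ST_SCORE = 197.0   # assumed 7700K single-thread CPU-Z score
REF_ST_CLOCK = 4.5     # GHz, 7700K single-core turbo

def implied_clock(st_score, ref_score=REF_ST_SCORE, ref_clock=REF_ST_CLOCK):
    """Clock a score would imply, assuming identical IPC and linear scaling."""
    return ref_clock * st_score / ref_score

for label, score in [("older ST (191)", 191), ("latest ST (196)", 196)]:
    print(f"{label}: ~{implied_clock(score):.2f} GHz implied")
```

With these assumed reference numbers, 191 lands near a fixed ~4.3GHz and 196 near a ~4.5GHz turbo, which is consistent with the fixed-clock-vs-default-clock reading above.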
 

tamz_msc

Diamond Member
Jan 5, 2017
Shouldn't you rather be criticizing those, like VirtualLarry, who have misconstrued the 'actual result' of this benchmark and are openly celebrating it, instead of the guy trying to correct the mistake, even if he fails to notice that the benchmark was run at a pegged base clock?
Which is exactly how I responded to Larry in a follow-up post.

Personally, I find this hair-splitting over leaked scores extremely silly; considered objectively, they are perfectly within the realm of plausibility.
 

JoeRambo

Golden Member
Jun 13, 2013
it's also quite naive to assume that a CPU that is slightly slower today will end up bottlenecking a 3080 Ti tomorrow so badly that it'd make someone wish he'd got the faster CPU back then.
2600Ks and 4790Ks are still going strong you know? The 2600K maybe not so much but 4790K holdouts aren't complaining about their gaming performance when compared against the 'Lakes.
The irony is that the difference between an OC'd "Lake" and Ryzen is larger than the gain from going full-OC 2700K to full-OC "Lake".
The irony is there are plenty of people who went SB->HSW->SKL/KBL hunting for what was a 5-10% ST perf gain, yet the ADF guys have no problem recommending them something that barely beats full-OC 2600K performance in gaming?
(source) http://www.gamersnexus.net/guides/2867-intel-i7-2600k-2017-benchmark-vs-7700k-1700-more/page-3
 

tamz_msc

Diamond Member
Jan 5, 2017
The irony is that the difference between an OC'd "Lake" and Ryzen is larger than the gain from going full-OC 2700K to full-OC "Lake".
The irony is there are plenty of people who went SB->HSW->SKL/KBL hunting for what was a 5-10% ST perf gain, yet the ADF guys have no problem recommending them something that barely beats full-OC 2600K performance in gaming?
(source) http://www.gamersnexus.net/guides/2867-intel-i7-2600k-2017-benchmark-vs-7700k-1700-more/page-3
Ah yes, that "i7 in production, i5 in gaming" disease; seems like a 7900X is also suffering from this ailment, and it's an i9, no less!


Seems like they got a raw deal. OC performance isn't even that important here; look at the Gamers Nexus charts for the 7700K stock vs 5.1GHz OC.
 

raghu78

Diamond Member
Aug 23, 2012
  • Like
Reactions: crashtech

ezodagrom

Junior Member
Mar 4, 2009
This is a tough one because it's all about balance. The general consensus for gaming builds is to allocate more of the budget towards the GPU rather than the CPU. So for argument's sake you would say, get an i7 + GTX 1070 rather than an i5 + GTX 1080, correct? Even though the i5/GTX 1080 would provide the better gaming experience until the next GPU upgrade.

I do see your POV though, as the i7 would be better equipped to handle the next generation of GPUs compared to the i5, ie. less likely to bottleneck.
Something like that, yeah. I intend to go with the i7 8700K as soon as I can afford it and the other components I'll need; afterwards I don't intend to upgrade the CPU again until a few years after DDR5 reaches the mainstream or so.
Until then I'll just be going with mainstream graphics cards every other release (currently I have a 1060, and will probably upgrade to a 1260, 1460 and such, unless AMD releases something more power-efficient than Nvidia in the meantime) and maybe get a 2TB SSD if they ever get as low as the current prices for 1TB SSDs.
 

Zucker2k

Golden Member
Feb 15, 2006
This is some seriously flawed logic, which is true only if you're going to be playing the same game year after year. Even a game like CS:GO or Dota 2 isn't the same game as it was a few years ago. How games utilize resources changes over time. Just like it's silly to say 'wait for games to become multithreaded, then Ryzen will kick a$$', it's also quite naive to assume that a CPU that is slightly slower today will end up bottlenecking a 3080 Ti tomorrow so badly that it'd make someone wish he'd got the faster CPU back then.

2600Ks and 4790Ks are still going strong you know? The 2600K maybe not so much but 4790K holdouts aren't complaining about their gaming performance when compared against the 'Lakes.

You're basically arguing that those with a slightly slower CPU should feel bad for themselves for their choice of when and what CPU to get.
Precisely my point! Those were the cream-of-the-crop gaming CPUs. Now, how many passed up the opportunity to get these using the "oh, it's only 2 fps faster" argument? And guess what, these CPUs were also the fastest at synthetic, or rather, 'non real world' 480p gaming. If you picked up one of these, you've been in gaming heaven from day one.
 

beginner99

Diamond Member
Jun 2, 2009
Tbh, I think performance differences due to the CPU in gaming are generally negligible. Performance gains in the CPU department (not synthetic benches) were disappointingly small over the past years. Saving some bucks for a bigger GPU is the way to go when concerned about gaming performance. But I guess some people just like to boast about 2 more average FPS in game X by spending a holiday's money on a high-performance rig before even considering the GPU.
Because CPU advancement is slow, future-proofing to a certain degree actually works nowadays. GPU performance/$ has in fact been stagnating a lot: the 290(X) is still only marginally beaten in that metric, 3 years after the fact. Performance went up, yes, but now you pay almost double for the gaming flagship compared to 5 years ago.

Of course the whole rig matters: buying an 8700K and gaming on the iGPU is pointless, and buying an 8700K and gaming at 4K with an RX 580 is pointless as well. However, the same combo with a 1080p 144 Hz monitor makes a lot more sense. It all depends on what you want. Still, the R7s are kind of a waste in all scenarios, as the 1600(X) will do the same thing cheaper. An R7 only makes sense if you do a lot of MT stuff besides gaming, or you are streaming, but let's be honest, that is a small niche.

Posted this in another forum, may as well post it here too:

People tend to upgrade GPUs faster than they upgrade CPUs, and GPUs become obsolete much faster.
I'd rather spend a bit more on a better gaming CPU which I won't upgrade again for many years than spend more on a GPU that I'll upgrade in just a few years.
Exactly.
 

maddie

Diamond Member
Jul 18, 2010
This is some seriously flawed logic, which is true only if you're going to be playing the same game year after year. Even a game like CS:GO or Dota 2 isn't the same game as it was a few years ago. How games utilize resources changes over time. Just like it's silly to say 'wait for games to become multithreaded, then Ryzen will kick a$$', it's also quite naive to assume that a CPU that is slightly slower today will end up bottlenecking a 3080 Ti tomorrow so badly that it'd make someone wish he'd got the faster CPU back then.

2600Ks and 4790Ks are still going strong you know? The 2600K maybe not so much but 4790K holdouts aren't complaining about their gaming performance when compared against the 'Lakes.

You're basically arguing that those with a slightly slower CPU should feel bad for themselves for their choice of when and what CPU to get.
Don't you realize by now that we're living in the age of hyperbole? Everyone has to out-exclaim the other by using ever more outrageous descriptors. This way a couple of percentage points becomes "demolish", "110% effort", and other idiotic terms.
 

Dufus

Senior member
Sep 20, 2010
Here's what can happen when using an old version of CPU-Z that does not fully support the processor it's being run on.

Whether the CPU is running at 800MHz or 5GHz, it will only show the base frequency, in this case 2.9GHz.

So take that 3.7GHz with a grain of salt.

MT score is low, maybe some throttling. ST doesn't seem too bad. (IMO)
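When a monitoring tool only reports the base frequency, its reading can be cross-checked against the OS's own view of the clocks. A minimal Linux-only sketch; the parsing helper is hypothetical and the sample text used to exercise it is fabricated for illustration:

```python
import re

def mhz_from_cpuinfo(text):
    """Extract the per-core 'cpu MHz' readings from /proc/cpuinfo content."""
    return [float(m) for m in re.findall(r"cpu MHz\s*:\s*([\d.]+)", text)]

def read_current_mhz(path="/proc/cpuinfo"):
    """Read live per-core clocks on Linux; returns [] where unavailable."""
    try:
        with open(path) as f:
            return mhz_from_cpuinfo(f.read())
    except OSError:
        return []
```

If the kernel reports cores sitting near base clock under load, throttling (or a locked multiplier) is a far more likely explanation than a misreporting benchmark tool.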
 

Justinus

Platinum Member
Oct 10, 2005
I assumed the poor MT showing was due to either a pre-release BIOS having issues with turbo, conservative turbo boost settings in the BIOS, a poor cooling solution, bloatware running in the background, or some combination of all these things. 1230 looks like a 3.7/3.8 GHz score when you calculate it based on the 7700K.

We know Intel's chips almost never run at base clocks, even under all core load and especially in the latest generations. I have no doubt 4.3 GHz is an accurate all core boost with adequate cooling, and I have no doubt ~1400 is the correct score at that frequency.

It will be interesting to see if 4.3 GHz all-core hits up against the power limit. It won't be a problem for Z370 boards with a decent heatsink, but that might have played a role in the HP Omen case as well.
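The "1230 looks like a 3.7/3.8 GHz score" calculation can be sketched with simple per-core scaling. All constants here (the 7700K-style ST turbo clock and the SMT uplift factor) are illustrative ballpark assumptions, not measured data:

```python
# Predict a CPU-Z MT score from the ST score, core count, and all-core clock.
# The SMT uplift factor is an assumed ballpark, not a measured value.
def predict_mt(st_score, cores, allcore_ghz, st_ghz, smt_uplift=0.25):
    per_core = st_score * (allcore_ghz / st_ghz)   # scale ST score to all-core clock
    return cores * per_core * (1 + smt_uplift)     # sum over cores, add SMT headroom

# 6 cores at an assumed 4.3 GHz all-core boost vs. a 4.7 GHz ST turbo:
print(predict_mt(196, 6, 4.3, 4.7))   # lands between the leaked 1230 and ~1400 results
# The same formula at 3.7-3.8 GHz all-core drops the prediction toward 1230:
print(predict_mt(196, 6, 3.8, 4.7))
```

Even with such crude assumptions, the gap between the two leaked MT scores is roughly the gap between base-clock-ish and full all-core boost operation.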
 

tamz_msc

Diamond Member
Jan 5, 2017
Precisely my point! Those were the cream-of-the-crop gaming CPUs. Now, how many passed up the opportunity to get these using the "oh, it's only 2 fps faster" argument? And guess what, these CPUs were also the fastest at synthetic, or rather, 'non real world' 480p gaming. If you picked up one of these, you've been in gaming heaven from day one.
Those who passed up the chance to get the cream-of-the-crop CPUs back then are, for the most part, not complaining much at all. Once they found that they could get better performance, they saved up money to buy the better CPU. Only a few people who like to make all these assumptions are convinced that those unfortunate enough to be unable to enter the so-called 'gaming heaven' were frustrated with their systems back then. Just like PC culture assumes everyone who isn't white or straight must be oppressed in some way.
 

tamz_msc

Diamond Member
Jan 5, 2017
I have a simpler explanation for the low score:
Windows 10 update
Edit: I meant this as a joke, but finally looked at the screenshot properly. It's Windows 8. Of course Turbo isn't working properly.
 
Last edited:

LTC8K6

Lifer
Mar 10, 2004
I have a simpler explanation for the low score:
Windows 10 update
Edit: I meant this as a joke, but finally looked at the screenshot properly. It's Windows 8. Of course Turbo isn't working properly.
Turbo Boost 2.0 doesn't work in W8?
 

crashtech

Diamond Member
Jan 4, 2013
Can we PLEASE stop talking about (defending) Ryzen in this thread!
Do you have a choice of euphemism when discussing relative CPU performance, like "the competition" or "other leading CPUs"? Intel has competition now, so a certain amount of comparing and contrasting is inevitable, and part of on-topic discussion, subject to mod intervention.
 
  • Like
Reactions: IEC and Markfw

Phynaz

Lifer
Mar 13, 2006
Do you have a choice of euphemism when discussing relative CPU performance, like "the competition" or "other leading CPUs"? Intel has competition now, so a certain amount of comparing and contrasting is inevitable, and part of on-topic discussion, subject to mod intervention.
Sure, but there's no need for posts like this one:
You know what I was trying to say; your roundabout manner of saying "the FX series was a disaster, so your argument's wrong" doesn't get you any brownie points.

The arguments back then were twofold ~
1) In case of APU, the AMD models had lower price & better IGP going in their favor.
2) For FX it was lower price & OC ability.

I'm sure you witnessed those debates & I'm sure you remember which side I was on, need I remind you what others said in that regard or you for that matter, as opposed to the arguments now rearing up against Zen? Oh & stop deflecting.
That post contains nothing but talk of defending FX. It doesn't belong here.
 

LTC8K6

Lifer
Mar 10, 2004
Kaby Lake onward, CPUs are only supported on Windows 10, right? TB Max 3.0 needs a driver if I'm not mistaken.
There is no TBM 3.0 on desktop chips, though. It's only on the newer HEDT chips with more than 4 cores: Broadwell-E and Skylake-X.
The 8700K only has TB 2.0, just like Sandy Bridge through Kaby Lake.
 

R0H1T

Platinum Member
Jan 12, 2013
Sure, but there's no need for posts like this one:


That post contains nothing but talk of defending FX. It doesn't belong here.
While we are at it let's see the context, then let's also see the IDF in action o_O
20% is quite accurate because it is definitely the case in games, and even for applications it is not far off if we discount the better SMT scaling from Ryzen. Keep in mind Coffee Lake will further extend its IPC lead due to the much bigger L3 cache and slightly higher memory support on the 6C parts. Also, there is no performant AVX/AVX2 support on Ryzen, therefore Intel's IPC lead for x265 is much higher than 20%. There is nothing outrageous in this 20% claim.
I don't see this 20% advantage you speak of, in games, even without Zen's SMT ~

Though to get a clearer picture we need tests at fixed clocks; without that we're basically guesstimating, since turbo speeds will skew the results on either side. You might have a point if we're going strictly stock vs stock.
The Ryzen are doing fine with a 10% clock speed deficit, the SMT does work wonders though, not always but in many cases.

Also no one's offended, but this "my hyperbole is less hyperbolic than your hyperbole" doesn't work for me!
One does not use GPU limited results to determine CPU capability, unless one is profoundly ignorant and/or has a clear agenda.
Right, so why would one apply the logic of IPC (20% above Zen, according to him) to games? Since you're clearly not ignorant and/or agenda-driven, you would know that the CPU is generally a secondary concern in games. Also, I asked him how you would determine gaming IPC (by forcing games to run on a single core?); maybe you can answer that on his behalf.

P.S. I don't pretend to be unbiased, never have, but it's always fun exposing those who like to wear that cloak.
Did you miss that the 7700K is not running at fixed clocks? How can you claim this when you aren't even paying attention to your own statements?


Seriously, if any of you are lecturing me on IPC, or clock-for-clock, then go read a book or find reviews that are actually doing tests at fixed clocks, not stock vs stock or whatever!
Do I name other names or shall we play by your rules?
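The "clock for clock" argument above is easier to settle once the normalization is made explicit. A minimal sketch of the score-per-GHz comparison being debated; the input numbers below are placeholders, not results from any review:

```python
def perf_per_ghz(score, clock_ghz):
    """Normalize a benchmark score by clock, the usual rough proxy for 'IPC'."""
    return score / clock_ghz

def ipc_lead_pct(score_a, clock_a, score_b, clock_b):
    """Percent lead of chip A over chip B once clocks are factored out."""
    return (perf_per_ghz(score_a, clock_a) / perf_per_ghz(score_b, clock_b) - 1) * 100

# Placeholder numbers: a ~20% stock-vs-stock score gap shrinks considerably
# once the ~12.5% clock difference is removed from the comparison.
print(f"{ipc_lead_pct(196, 4.5, 163, 4.0):.1f}%")
```

This is exactly why stock-vs-stock results and fixed-clock results support different conclusions: the turbo delta is folded into one and removed from the other.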
 
Last edited:

wahdangun

Golden Member
Feb 3, 2011
... and we've seen the opposite a lot. Compare 1500X to 7500, 7600 and even 7600K:


I still have no idea why people think Ryzens inherently always consume less than Lake CPUs. Even 7900X consumes less than 1920X in the majority of reviews.

P.S. SMT's power hit is rather small, a couple of watts, and is balanced out by the i5's higher turbo clocks.
But the 1920X has higher performance while consuming less power.
 
Last edited:

dwade

Junior Member
Jun 25, 2017
Ah. The Ryzen "good enuff" salesmen team at it again. The prophecy is complete.

Trolling is not allowed
Markfw
Anandtech Moderator
 
Last edited by a moderator:
  • Like
Reactions: pcp7

Dufus

Senior member
Sep 20, 2010
We know Intel's chips almost never run at base clocks, even under all core load and especially in the latest generations. I have no doubt 4.3 GHz is an accurate all core boost with adequate cooling, and I have no doubt ~1400 is the correct score at that frequency.
Or perhaps a little more. The CPU-Z bench seemed a little better: 13608/2246, close to a 5930K at 4.7GHz (http://valid.x86.fr/1mmm78). Some idea of clocks can perhaps be had from https://www.pcper.com/reviews/Processors/Intel-Core-i7-7700K-Review-Kaby-Lake-and-14nm/Clock-Clock-Kaby-Lake-Skylake-Broad, although CFL is omitted of course.
 
