The 8700k

urvile

Golden Member
Aug 3, 2017
1,575
474
96
I know that the release of the more affordable i9 looms large, but a while ago I decided to switch to Intel for gaming and dedicate my 1920X to server duties.

The difference between the TR and the Intel for gaming is ridiculous. Because I am lazy I used the factory overclock (MCE), which gives me 4.7GHz on all cores. My RAM is running at 3000MHz. The 8700K coupled with a GTX 1080 Ti destroys Far Cry 5.

When I was playing with the TR I would get constant jagging even on High. I am currently playing on Ultra @ 3440x1440 with the resolution scale set to 1.2. So good.

For gaming, Intel is doing everything right. I suspect the consumer-level i9 is going to be a gaming monster.

 

Ottonomous

Senior member
May 15, 2014
559
292
136
So from what I gather, the 1920X, even if aggressively repriced, still wouldn't be proper competition for the i9-9900K?

Is an OC'd 2700X really going to be that helpless in front of it too? I thought the difference was only around 3-5% at 1440p.
 

urvile

Golden Member
Aug 3, 2017
1,575
474
96
So from what I gather, the 1920X, even if aggressively repriced, still wouldn't be proper competition for the i9-9900K?

Is an OC'd 2700X really going to be that helpless in front of it too? I thought the difference was only around 3-5% at 1440p.

From the benchmarks I have seen, the 2700X goes toe to toe with the 8700K, but I think the i9 is going to blow it out of the water. The difference between the 1920X and the 8700K at 1440p, however, is significant, which is the point of the post.
 

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
What's the actual difference in fps at 1440p?

I can understand the 8700K being a bit faster, but by the way you're describing it, the 1920X is constantly below 60fps or something?

I don't think the 9900K would be much faster than an 8700K for gaming, the same way a 2700X is barely faster than a 2600X for games - most games simply don't scale that well beyond 6 cores.
 

Hail The Brain Slug

Diamond Member
Oct 10, 2005
3,560
2,551
136
What's the actual difference in fps at 1440p?

I can understand the 8700K being a bit faster, but by the way you're describing it, the 1920X is constantly below 60fps or something?

I don't think the 9900K would be much faster than an 8700K for gaming, the same way a 2700X is barely faster than a 2600X for games - most games simply don't scale that well beyond 6 cores.

One thing the 9900K will have over the 8700K that the 2700X doesn't have over the 2600X is a larger L3. The bump from 12MB to 16MB may help quite a lot. My 5960X quite often beats scores I see from people with 8700Ks, and it ain't the IPC or quad-channel memory. It's the giant, fast 20MB L3 cache (overclocked like mad, of course).
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
Some of the comparisons between the 8000- and 9000-series chips will be interesting, with HT gone from most of them and some cache differences.

8700K 6C/12T/12MB 3.7/4.7
9700K 8C/8T/12MB 3.6/4.9
 

rchunter

Senior member
Feb 26, 2015
933
72
91
Yeah, I'm anxious to see 9900K reviews. That could quite possibly be my next CPU.
 

DeadlyTitan

Member
Oct 20, 2017
144
11
41
I know that the release of the more affordable i9 looms large, but a while ago I decided to switch to Intel for gaming and dedicate my 1920X to server duties.

The difference between the TR and the Intel for gaming is ridiculous. Because I am lazy I used the factory overclock (MCE), which gives me 4.7GHz on all cores. My RAM is running at 3000MHz. The 8700K coupled with a GTX 1080 Ti destroys Far Cry 5.

When I was playing with the TR I would get constant jagging even on High. I am currently playing on Ultra @ 3440x1440 with the resolution scale set to 1.2. So good.

For gaming, Intel is doing everything right. I suspect the consumer-level i9 is going to be a gaming monster.


At 3440x1440 with a 1.2 resolution scale, I don't think it's the CPU's problem any more, if it ever was.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,378
126
I don't think it's really controversial that the 8700k is a top gaming CPU. Or that the high core count Ryzens are just awesome all around and can be a better option depending on use case.

I had a similar experience, 1800x to 8700k.

One of the other things I think most people forget is that the gap that looks narrow NOW at 1440p/4K actually gets bigger with time IF the 1080p benches show it. The reason is that resolution is almost entirely a GPU load, with very little impact on the CPU.

Now if you constantly upgrade CPUs, this doesn't matter as much. But for someone that holds onto a CPU for a while and upgrades GPUs a few times, it could make a difference. For example, think back to the Radeon 6970/GTX 680 days. A 3770K and an FX 8350 looked pretty similar at higher resolutions, because the GPU was simply maxed out there at Ultra. At 1080p and below the Intel was faster for gaming. Fast forward to the GTX 980/RX 480 era: now of course the FX 8350 falls behind the 3770K in gaming with any GPU fast enough to show it, to the point where a 3770K will often bottleneck a 1080 Ti.

By the time the GTX 12xx series and its peers are out, I expect the gap between the 8700K and 2700X in gaming at 1440p/4K to grow larger, not smaller, in most cases. Exceptions could apply if there are really good advances in high-core-count optimizations, but it still seems, all these years later, that max IPC + clocks on 4-8 cores beats more cores with lower IPC and clocks. The final point is that stock 2700X vs stock 8700K is a smaller gap than max air OC vs max air OC, which just pushes more of the gaming advantage to the 8700K both short and long term.

For a mid-range gamer at 1440p or 4K, who might only have a GTX 1260 in three years' time, this would probably be far less noticeable.

And for a general user where gaming is not a serious primary use case, I think Ryzen's socket consistency and much broader upgrade paths, combined with outstanding overall performance, make it a better buy.

These examples of the strengths of each will probably remain similar until the next massive architecture shift occurs.
 

StinkyPinky

Diamond Member
Jul 6, 2002
6,901
1,134
126
I don't think the 9900K will blow the 8700K out of the water in gaming. Hardly any games make use of 8 cores, and the 8700K may only have 6 cores, but they are very strong cores. The 9900K will crush the 8700K in overall performance, but in gaming I'm expecting maybe 1 to 2%.

If gaming is your main thing, then the 8700K and 9700K are better choices than the 9900K, which will command a premium price since it is an "i9".
 

urvile

Golden Member
Aug 3, 2017
1,575
474
96
What's the actual difference in fps at 1440p?

I can understand the 8700K being a bit faster, but by the way you're describing it, the 1920X is constantly below 60fps or something?

I don't think the 9900K would be much faster than an 8700K for gaming, the same way a 2700X is barely faster than a 2600X for games - most games simply don't scale that well beyond 6 cores.

With the same settings, the 1920X doesn't go above 60. I ended up having to play on High with a standard ratio. My experience with the 1920X as a gaming CPU has been an odd one. I overclocked it to 4.1GHz (for which I needed a custom loop due to the heat) and it didn't make a whole lot of difference. When I play I get jagging even when the fps doesn't drop below 60, and it's not just in Far Cry. Basically I got sick of it and built an Intel rig for gaming.

I am not saying AMD's consumer Ryzen is bad. I have seen the 2700X benchmarks and it is sweet. However, based on my experience, first-gen TR is crap for gaming. YMMV.
 

MrSquished

Lifer
Jan 14, 2013
23,645
21,856
136
How hot does yours run at 4.7?

Mine is running quite toasty - during intense gaming sessions temps go into the 80s. I went back and re-seated the heatsink with a tried-and-true method and still get the same temps. I can't even run Prime95 without it getting too hot for my comfort.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
How hot does yours run at 4.7?

Mine is running quite toasty - during intense gaming sessions temps go into the 80s. I went back and re-seated the heatsink with a tried-and-true method and still get the same temps. I can't even run Prime95 without it getting too hot for my comfort.

What cooler?
 

urvile

Golden Member
Aug 3, 2017
1,575
474
96
How hot does yours run at 4.7?

Mine is running quite toasty - during intense gaming sessions temps go into the 80s. I went back and re-seated the heatsink with a tried-and-true method and still get the same temps. I can't even run Prime95 without it getting too hot for my comfort.

Around the same temps, but only under intense loads. The hottest I have seen it go is 83C, when I was rendering video using Handbrake, which I don't see as an issue. It's nowhere near hot enough for throttling, and games just spike the temperature; it's not continuous.
 

slashy16

Member
Mar 24, 2017
151
59
71
I upgraded from an Intel 6600K to a Ryzen 2700X because I wanted to do other tasks with my gaming rig. I have noticed some games run very poorly on Ryzen, especially if they are single-threaded.
Diablo 3, for instance, runs poorly when you run higher rifts with lots of spam on screen. I have found turning SMT off helps in a couple of titles like Diablo.
 

Dasa2

Senior member
Nov 22, 2014
245
29
91
I suspect the consumer-level i9 is going to be a gaming monster.
My RAM is running at 3000MHz.
Honestly, upgrading your RAM to a higher speed will likely make a bigger difference to gaming than the i9, not that either would be worthwhile.
The small amount of extra L3 cache on the i9 would be insignificant; it would really need 128MB+ to see decent gains.
Extra cores won't help many games; the only thing that will matter to most games is increased clock speed, which will only bring a few percent and probably be no better than an overclocked 8700K.

What RAM were you running with the 1920X? For it to be performing that badly, it's almost like you only had two sticks or something; it should only be about 25% slower in CPU-bound sections of gameplay.
 

coercitiv

Diamond Member
Jan 24, 2014
6,778
14,755
136
Diablo 3, for instance, runs poorly when you run higher rifts with lots of spam on screen. I have found turning SMT off helps in a couple of titles like Diablo.
Diablo 3's higher-rift issues are mostly server-side; players have been complaining about it for ages.

The game runs fine even on lower-end systems as long as your network connection is good/excellent. I had no issue running D3 on a sub-3GHz Haswell chip, with high-rift delays/freezes being the only problem, but those were always experienced by all members of the party; the AoE effects slow server response down to a crawl. Even the hour of the day makes a difference sometimes.
 

urvile

Golden Member
Aug 3, 2017
1,575
474
96
How hot does yours run at 4.7?

Mine is running quite toasty - during intense gaming sessions temps go into the 80's. I went back and re-seated the heatsink with a tried and true method and still the same temps. I can't even run prime95 without getting too hot for my comfort.

After playing Far Cry 5 for about an hour these are the temps I get. I am using a be quiet! silent loop with a 280mm radiator.

[screenshot of temps: Z2T92LT.jpg]
 

urvile

Golden Member
Aug 3, 2017
1,575
474
96
Honestly, upgrading your RAM to a higher speed will likely make a bigger difference to gaming than the i9, not that either would be worthwhile.
The small amount of extra L3 cache on the i9 would be insignificant; it would really need 128MB+ to see decent gains.
Extra cores won't help many games; the only thing that will matter to most games is increased clock speed, which will only bring a few percent and probably be no better than an overclocked 8700K.

What RAM were you running with the 1920X? For it to be performing that badly, it's almost like you only had two sticks or something; it should only be about 25% slower in CPU-bound sections of gameplay.

The exact same RAM as I am using now, because I cannibalised two sticks of it. The TR was using four sticks (32GB) in quad channel @ 3000MHz. I know what I saw. The performance was so bad I spent the money to build another system. It's a crap-hot server, though.

It was the constant jagging that did it, even when the fps wasn't going under 60. I have seen the CPU load go as high as 80% and stay there on the TR while playing Far Cry 5. With games running Vulkan, like the new Wolfenstein series, it was rare to see the fps go below 80 with everything maxed out.

I would be interested in hearing from anyone who has first-hand experience gaming on a 1920X @ 3440x1440 with an overclocked GTX 1080 Ti.
 

Dasa2

Senior member
Nov 22, 2014
245
29
91
I would be interested in hearing from anyone who has first-hand experience gaming on a 1920X @ 3440x1440 with an overclocked GTX 1080 Ti.
Shouldn't matter what res they're running at; provided the section of gameplay is CPU-limited, the FPS will be the same whether it's running at 720p or 4K.
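To illustrate the point with a toy model (my own sketch with made-up frame times, not measured data): if each frame roughly takes max(CPU time, GPU time), then raising resolution only grows the GPU term, so a CPU-limited scene caps at the same FPS regardless of res.

```python
# Toy frame-time bottleneck model: a frame is ready only when both the
# CPU and GPU have finished their work, so frame time ~ max(cpu, gpu).
def fps(cpu_ms, gpu_ms):
    """FPS under a simple max(CPU, GPU) per-frame cost model."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical numbers: CPU needs 16 ms/frame at any resolution,
# while the GPU needs 8 ms at 720p and 14 ms at 4K.
print(fps(16, 8))   # 720p: CPU-limited -> 62.5 FPS
print(fps(16, 14))  # 4K: still CPU-limited -> same 62.5 FPS
print(fps(16, 25))  # push the GPU past the CPU and it becomes the cap -> 40.0 FPS
```

The FPS only changes once the GPU term exceeds the CPU term, which is why low-res benches are used to expose CPU differences.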
 

urvile

Golden Member
Aug 3, 2017
1,575
474
96
Shouldn't matter what res they're running at; provided the section of gameplay is CPU-limited, the FPS will be the same whether it's running at 720p or 4K.

You should tell me how my PC is performing.
 

Dasa2

Senior member
Nov 22, 2014
245
29
91
You should tell me how my PC is performing.
Well, if I had Far Cry 5 I could probably show my 6700K outperforming your 8700K at a lower res, but my 1070 would get pwned by your 1080 Ti at higher res.
Unfortunately I don't have any AMD CPU to play with. But I did find this video of a 1950X at stock with 2133 RAM vs 4GHz with 3200 RAM: https://www.youtube.com/watch?v=8yqIvn6YyCQ
There seem to be a lot of complaints about CPU bottlenecks in Far Cry 5 from both Intel X299 and AMD users, so I don't doubt that there are more demanding parts of the game than what's shown in that video.