Speculation: i9-9900K is Intel's last hurrah in gaming


Will Intel lose its gaming CPU lead in 2019?



scannall

Golden Member
Jan 1, 2012
1,386
229
136
I'll throw my 2 cents in. The 9900K is a beast, and will continue to be MORE than enough for gaming for several years to come. Regardless of whatever AMD comes out with. I do think AMD will edge out Intel here in the short term. Intel has a LOT of structural problems right now. That should be apparent to pretty much everyone.

But to suggest that Intel will have these issues forever is short sighted. Do they have a lot of work to do? Yep, sure do. Just digging out of the Meltdown hole is going to take a lot of resources. My gut feeling tells me they need a new from scratch architecture to really get out of that pit. But they do have the talent, if they can get their management in order. Something shareholders will start demanding at some point.

Long term? Yeah, they will get it figured out. These are very long cycle times, and AMD coming back hard was a slap in the face and hopefully a wake up call. It's just good to see actual competition in that space. AMD's construction cores let Intel coast for a long time, and it's taking them a while to get back into gear.
 
Apr 27, 2000
11,960
1,097
126
the virtualized server, I cannot emphasize enough how big a
Okay, this is what multiquote gave me. Why, multiquote? Why?

Anyway, all I was going to say is: a lot of license-based software for big iron is moving away from per-socket licensing and towards per-core. Not that that concerns the 9900k or gaming. So not sure how we got there.

Remember, consoles are getting 8 Zen 2 cores, probably high-2 GHz clocks, maybe even low 3 GHz. 8 strong cores is the new peasant level; to be real PC master race it's 12 or up. I expect Ryzen 1 release pricing, but being 16/12/8 instead of 8/6/4.
Yeah, I don't know about that. First off, the PS5 is supposed to turbo up to 3.2 GHz (1.6 GHz base clock). Even with decent cooling that thing will all-core turbo in the 3 GHz range and stay within a very low TDP. BUT 8 cores @ 5 GHz won't be "peasant level", sorry to say. The real problem isn't the 9900k's performance, it's (a) the price and (b) the power consumption. Over time, it's gonna be like owning an FX-9590: hot and not necessarily that much faster than other newer stuff on the market. 10c Comet Lake won't improve the situation much.

If Intel are now recommending people turn off hyperthreading, then will the 9900K be tested using Intel's recommended settings going forward?
That's only for people who are seriously concerned about certain security vulnerabilities. Most people with 9900ks are not in that crowd. If you want to see what a 9900k w/out HT will be like, check out 9700k benches.

If it's about gaming alone, then no, they don't need that, neither AMD nor 10th gen. Right now anything from a 7700k upwards has roughly the same results in gaming, because we just lack the GPU that would show any difference, and no benchmark shows the actual usage of all threads to see how much performance is left on the table.
Zen 2 or 10th gen can match clocks and "IPC" (Cinebench results) all they want; they won't be able to make GPUs any faster than they are right now.
The 2080Ti has enough grunt at 1440p to make CPUs matter again. Currently, the 9900k shines in those situations. It will lose some of its lustre soon. Comet Lake absolutely won't help there.

There are also some oddball games like, I don't know, Kerbal Space Program or what have you that soak up CPU time like a beast. If I were a real fanboy of one of those games, or maybe something Paradox had put out (for the inevitable slowdown that comes on later turns; see Stellaris), then I'd be looking at a 9900k were I shopping today with no eye to the future.

I'll throw my 2 cents in. The 9900K is a beast, and will continue to be MORE than enough for gaming for several years to come. Regardless of whatever AMD comes out with. I do think AMD will edge out Intel here in the short term. Intel has a LOT of structural problems right now. That should be apparent to pretty much everyone.
The 9900k will be in the middle of the pack pretty soon. Once Zen3 comes out, it (and Comet Lake) will take a back seat. Sadly, I do not think us desktop dwellers will get anything better from Intel on the desktop for awhile. They may bag on the desktop after 10c Comet Lake until 7nm is ready.
 

TheELF

Platinum Member
Dec 22, 2012
2,852
122
126
The 2080Ti has enough grunt at 1440p to make CPUs matter again. Currently, the 9900k shines in those situations. It will lose some of its lustre soon. Comet Lake absolutely won't help there.
Does it though?
In practice you won't notice any difference between the 7700k and the 9900k and even in bench results it's too close to really matter.

There are also some oddball games like, I don't know, Kerbal Space Program or what have you that soak up CPU time like a beast. If I were a real fanboy of one of those games, or maybe something Paradox had put out (for the inevitable slowdown that comes on later turns; see Stellaris), then I'd be looking at a 9900k were I shopping today with no eye to the future.
Yeah I'm sure that's what everybody is talking about when they say that ryzen is close to intel in gaming.....
 
Apr 27, 2000
11,960
1,097
126
Does it though?
In practice you won't notice any difference between the 7700k and the 9900k and even in bench results it's too close to really matter.
Those minimums though. That's where you notice it. Hell even the 2700x would be a better choice than the 7700k due to the 1% values.

Yeah I'm sure that's what everybody is talking about when they say that ryzen is close to intel in gaming.....
It's the opposite. If you're playing Kerbal or . . . I don't know, one of the Total War games, you really want that Intel chip in your system, overclocked to hell and back. That Battlefield V bench you pasted above is where people are looking more seriously at the 2700x as a contender. 82 minfps vs 94? The 9900k isn't that much better.
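To put some toy numbers behind the minfps argument, here's a quick Python sketch (the frame times are made up, not from any actual bench) showing how a trace with occasional hitches can post a *higher* average fps than a smooth one while its 1% lows crater:

```python
def fps_stats(frame_times_ms):
    """Return (average fps, 1% low fps) from a list of per-frame times in ms."""
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    # "1% lows": average fps over the slowest 1% of frames
    slowest = sorted(frame_times_ms, reverse=True)[: max(1, n // 100)]
    low_1pct_fps = 1000.0 * len(slowest) / sum(slowest)
    return avg_fps, low_1pct_fps

smooth = [10.0] * 1000              # steady 10 ms frames
spiky = [9.0] * 990 + [40.0] * 10   # mostly faster, but with 40 ms hitches

print(fps_stats(smooth))  # (100.0, 100.0)
print(fps_stats(spiky))   # (~107.4, 25.0): higher average, far worse lows
```

That gap between the two numbers is exactly what the 2700x-vs-7700k comparison above turns on: the average alone hides the stutter.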
 
Feb 6, 2011
1,827
184
136
Yeah, I don't know about that. First off, the PS5 is supposed to turbo up to 3.2 GHz (1.6 GHz base clock). Even with decent cooling that thing will all-core turbo in the 3 GHz range and stay within a very low TDP. BUT 8 cores @ 5 GHz won't be "peasant level", sorry to say. The real problem isn't the 9900k's performance, it's (a) the price and (b) the power consumption. Over time, it's gonna be like owning an FX-9590: hot and not necessarily that much faster than other newer stuff on the market. 10c Comet Lake won't improve the situation much.
Nothing has been given on PS5 clock speed, only that it's 8 Zen 2 cores*. Mark Cerny has said that the PS5 will be at a higher price point than the PS4 because they are pushing power/performance harder (relatively speaking). There is no way they are going to clock the cores that low, and consoles never turbo; they need deterministic performance. Jaguar is clocked low because it can't clock high; it never got past 2.5 GHz no matter the platform, and even Stoney Ridge gets about the same clock per core per watt on approximately the same process.

The PS4 Pro and X1X have 2.1-2.3 GHz clocks. Zen, let alone Zen 2, is more efficient, and then we have the power drop from 7nm, so clocks are going to be mid to high 2s.

Also, only double the CPU performance of a console is still pretty peasant. Just look at history: Xenon, Cell, and Jaguar have all been a long way behind the equivalent of the 9900K of their time.

*I wouldn't take an ES of Gonzalo as anything definitive, only what Mark has said.
 

TheELF

Platinum Member
Dec 22, 2012
2,852
122
126
Those minimums though. That's where you notice it. Hell even the 2700x would be a better choice than the 7700k due to the 1% values.
82 vs 78? Seriously now?
 

TheELF

Platinum Member
Dec 22, 2012
2,852
122
126
82 minfps vs 94? The 9900k isn't that much better.
Hello? ... That's my whole point: anything from the 7700k upwards is pretty much on the same level, even with a 2080 Ti.
 
Apr 27, 2000
11,960
1,097
126
82 vs 78 ???Seriously now?
Thanks for twisting the point. Look at average FPS and then look at minfps. 2700x is lower in average but higher in minfps. So 7700k vs 2700x, which do you pick? The correct answer is, of course, 9900k. The 2700x is only in the mix because of its ability to keep gameplay smooth. And it does this better than the 7700k. Frankly I do not think quadcore CPUs belong in the conversation anymore when it comes to high fps gaming. The 2700x shouldn't be there either, and yet, there it is.

If all you want is 60 fps then, whatever.

Back to the point of a 2080Ti being able to make a fast CPU (like the 9900k) shine:

https://www.pcgamesn.com/intel-i9-9900k-review-benchmarks

Check out the 2080Ti section. Far Cry 5 is a big win for the 9900k @ 1080p, bringing the chip dangerously close to 144 fps minimums (which is the holy grail for those of us who actually want more than 60 fps). In Warhammer 2, the 9900k can keep the game above 60 fps while the 2700x can't (which is embarrassing for the 2700x. Oh well). And in Civ6, the 9900k has a pretty healthy lead of 23 fps in the minimum department. So yes, CPUs can still make a difference, just not at 4K.
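For a sense of why a 144 fps floor is such a high bar: at any target frame rate, the CPU and GPU together have to finish every frame inside a fixed time budget. A trivial sketch of the arithmetic (the targets are just common monitor refresh rates):

```python
def frame_budget_ms(target_fps):
    """Time budget per frame, in milliseconds, to sustain target_fps."""
    return 1000.0 / target_fps

for target in (60, 144, 165):
    print(f"{target} fps -> {frame_budget_ms(target):.2f} ms per frame")
# 60 fps -> 16.67 ms; 144 fps -> 6.94 ms; 165 fps -> 6.06 ms
```

So a single 20 ms CPU spike that a 60 Hz player never notices blows through almost three frames' worth of budget at 144 Hz.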
 
Apr 27, 2000
11,960
1,097
126
Oct 27, 2006
19,723
201
106
If it's about gaming alone, then no, they don't need that, neither AMD nor 10th gen. Right now anything from a 7700k upwards has roughly the same results in gaming, because we just lack the GPU that would show any difference, and no benchmark shows the actual usage of all threads to see how much performance is left on the table.
Zen 2 or 10th gen can match clocks and "IPC" (Cinebench results) all they want; they won't be able to make GPUs any faster than they are right now.
I largely agree, but there is a caveat of sorts. Not everyone plays games at total max AA/Ultra like typical benchmarks. And chasing 144/165 and especially 240Hz is tough. It's one area where Intel's current CL lineup up to the 9900k excels, but there's still more on the table :)
 

B-Riz

Golden Member
Feb 15, 2011
1,034
146
106
Thanks for twisting the point. Look at average FPS and then look at minfps. 2700x is lower in average but higher in minfps. So 7700k vs 2700x, which do you pick? The correct answer is, of course, 9900k. The 2700x is only in the mix because of its ability to keep gameplay smooth. And it does this better than the 7700k. Frankly I do not think quadcore CPUs belong in the conversation anymore when it comes to high fps gaming. The 2700x shouldn't be there either, and yet, there it is.

If all you want is 60 fps then, whatever.

Back to the point of a 2080Ti being able to make a fast CPU (like the 9900k) shine:

https://www.pcgamesn.com/intel-i9-9900k-review-benchmarks

Check out the 2080Ti section. Far Cry 5 is a big win for the 9900k @ 1080p, bringing the chip dangerously close to 144 fps minimums (which is the holy grail for those of us who actually want more than 60 fps). In Warhammer 2, the 9900k can keep the game above 60 fps while the 2700x can't (which is embarrassing for the 2700x. Oh well). And in Civ6, the 9900k has a pretty healthy lead of 23 fps in the minimum department. So yes, CPUs can still make a difference, just not at 4K.
Really, the only competition for the 9900K *right now* is a delidded and overclocked 8700K.

 

TheELF

Platinum Member
Dec 22, 2012
2,852
122
126
I largely agree, but there is a caveat of sorts. Not everyone plays games at total max AA/Ultra like typical benchmarks. And chasing 144/165 and especially 240Hz is tough. It's one area where Intel's current CL lineup up to the 9900k excels, but there's still more on the table :)
Nor do that many people even have a 2080ti or even a 1080.

The thing is that most people don't accept anything below ultra settings, because supposedly ultra settings affect CPU performance, and most people don't even accept 1080p any more because it's "not relevant" anymore.
Yes, the 9900k is much faster; my point is exactly that. But you can't see it, because there are no benchmarks that can show it: the 2080 Ti at ultra is not strong enough even at 1080p, and even that is considered not relevant.
 
Apr 27, 2000
11,960
1,097
126
Really, the only competition for the 9900K *right now* is a delidded and overclocked 8700K.
That highlights my point even further. The 7700k isn't close to the 9900k. It's a shame they didn't rerun that @ 1440p, but I have a feeling the 9900k would have won by at least 10-15% on minfps. And once again the "good enough" 7700k loses the minfps contest to Pinnacle Ridge. This time it's the standard 2700.
 

TheELF

Platinum Member
Dec 22, 2012
2,852
122
126
That highlights my point even further. The 7700k isn't close to the 9900k. It's a shame they didn't rerun that @ 1440p, but I have a feeling the 9900k would have won by at least 10-15% on minfps. And once again the "good enough" 7700k loses the minfps contest to Pinnacle Ridge. This time it's the standard 2700.
Did you see the O/C results of the 7700k?
Also, yes, it's the standard 2700, but also O/C'd to its max of 4.2 GHz.
They have the same min while the i7 kicks Ryzen's butt at avg.
Another also: it's 1080p at normal settings... far, far away from your original statement.
The 2080Ti has enough grunt at 1440p to make CPUs matter again. Currently, the 9900k shines in those situations. It will lose some of its lustre soon. Comet Lake absolutely won't help there.
 

JDG1980

Golden Member
Jul 18, 2013
1,647
147
136
That's only for people who are seriously concerned about certain security vulnerabilities. Most people with 9900ks are not in that crowd. If you want to see what a 9900k w/out HT will be like, check out 9700k benches.
They'll start caring pretty quickly when/if JavaScript-based vulnerabilities show up. If we're talking about just running malicious native code, sure, for desktop enthusiasts, "don't do that" is a perfectly reasonable solution. But if JavaScript can break out of its sandbox or read your private data, all bets are off - every webpage you visit becomes a potential exploit against your system.

The 9900k will be in the middle of the pack pretty soon. Once Zen3 comes out, it (and Comet Lake) will take a back seat. Sadly, I do not think us desktop dwellers will get anything better from Intel on the desktop for awhile. They may bag on the desktop after 10c Comet Lake until 7nm is ready.
I'm honestly not sure that anything in the x86 world, from either vendor, will substantially exceed the 9900K in terms of single-threaded performance in the foreseeable future. The 9900K has both excellent IPC and an incredibly high clock rate. Icelake and its successors will obviously have some IPC improvements over the Skylake family, and there's a good chance Zen 2 will match or beat Skylake IPC as well, but it's questionable whether TSMC 7nm or Intel 10nm will be able to reach the same 5GHz clock rate as Intel 14nm+++. The only reason Intel got that high is because they were stuck on the same process for so long that they got it about as mature and optimized as any process possibly could be. It certainly wasn't possible on first-gen Skylake. If 10nm can only get to 4.5GHz, they need a 10% IPC improvement just to keep single-thread performance constant. If it can't get that high (and it might not be able to) then they will almost certainly fall short of 9900K performance in single-threaded benchmarks.
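A quick back-of-the-envelope check on that trade-off, modeling single-thread performance simply as IPC × clock (the 5.0 and 4.5 GHz figures come from the post above; the model itself is a deliberate simplification):

```python
# If clocks fall from 5.0 GHz (mature 14nm) to a hypothetical 4.5 GHz on a new
# process, the IPC uplift needed just to break even on single-thread perf is:
old_clock_ghz, new_clock_ghz = 5.0, 4.5
ipc_uplift_needed = old_clock_ghz / new_clock_ghz - 1.0
print(f"{ipc_uplift_needed:.1%}")  # 11.1%: slightly more than the ~10% quoted
```

In other words, a 10% IPC gain at 4.5 GHz would actually land a hair *short* of 5 GHz Skylake, which only reinforces the point.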

There is only so much single-threaded performance that can be squeezed out of the x86 architecture, and I think the 9900K is pretty close to the limit.
 

Thunder 57

Senior member
Aug 19, 2007
679
182
136
They'll start caring pretty quickly when/if JavaScript-based vulnerabilities show up. If we're talking about just running malicious native code, sure, for desktop enthusiasts, "don't do that" is a perfectly reasonable solution. But if JavaScript can break out of its sandbox or read your private data, all bets are off - every webpage you visit becomes a potential exploit against your system.



I'm honestly not sure that anything in the x86 world, from either vendor, will substantially exceed the 9900K in terms of single-threaded performance in the foreseeable future. The 9900K has both excellent IPC and an incredibly high clock rate. Icelake and its successors will obviously have some IPC improvements over the Skylake family, and there's a good chance Zen 2 will match or beat Skylake IPC as well, but it's questionable whether TSMC 7nm or Intel 10nm will be able to reach the same 5GHz clock rate as Intel 14nm+++. The only reason Intel got that high is because they were stuck on the same process for so long that they got it about as mature and optimized as any process possibly could be. It certainly wasn't possible on first-gen Skylake. If 10nm can only get to 4.5GHz, they need a 10% IPC improvement just to keep single-thread performance constant. If it can't get that high (and it might not be able to) then they will almost certainly fall short of 9900K performance in single-threaded benchmarks.

There is only so much single-threaded performance that can be squeezed out of the x86 architecture, and I think the 9900K is pretty close to the limit.
I agree with everything up until your last sentence. There have always been new technologies, materials, and ideas to advance performance. It may take some time, but eventually we will have more x86 performance. At least we have honest competition again to push each other to do better. A few more years of that and who knows what we might see?
 
Apr 27, 2000
11,960
1,097
126
Did you see the O/C results of the 7700k?
I ignored them, and looked at the stock performance for the 7700k and 2700 instead.

7700k = 78.7 minfps
2700 = 80.9 minfps

If I have to pick one or the other, and if the 9900k (or 2700x, or 8600k, or whatever) is not available, 2700 all the way.

Also, yes, it's the standard 2700, but also O/C'd to its max of 4.2 GHz.
It says "stock". That's not 4.2 GHz.

Another also: it's 1080p at normal settings... far, far away from your original statement.
You really think the minfps situation will be any better at 1440p for the 7700k? And I did say I wish they'd run those benches at some other resolution. Regardless, I would expect the 9900k to continue to dominate at 1440p medium. Maybe less so at 1440p ultra/max settings.

They'll start caring pretty quickly when/if JavaScript-based vulnerabilities show up.
Apparently, the latest MDS flaw is noticeably more difficult to exploit than Meltdown or Spectre. I didn't see anything that explicitly said that the attacker would require local access to the machine, buuuuut it would take a lot of work to weaponize the payload. Ideal perhaps for limited security breaches against specific targets. If some yutz who likes to click on spearphishing emails at work has a 9900k for an office PC, then I might be worried.

There is only so much single-threaded performance that can be squeezed out of the x86 architecture, and I think the 9900K is pretty close to the limit.
Not necessarily true. Intel already indicates that they can do better per clock with not one but two cores (Sunny Cove and Willow Cove). If they had made better progress with their fabs, then we'd see those cores in action sooner rather than later, at the same clocks that we can get from 14nm++++++ today. That's gonna take time.
 

beginner99

Diamond Member
Jun 2, 2009
4,129
197
126
They'll start caring pretty quickly when/if JavaScript-based vulnerabilities show up. If we're talking about just running malicious native code, sure, for desktop enthusiasts, "don't do that" is a perfectly reasonable solution. But if JavaScript can break out of its sandbox or read your private data, all bets are off - every webpage you visit becomes a potential exploit against your system.
AFAIK these vulnerabilities all rely on timing, and the solution was that browsers simply reduced the timing granularity available in JS, hence all these attacks are not possible anymore regardless of hardware fixes. That's why the issue is overblown anyway. It matters if you are a cloud provider, but not for your average Joe desktop.
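A toy Python model of that mitigation (a deliberate simplification, not a real exploit: the "hit"/"miss" timings and granularity values are made up): quantizing timestamps the way browsers coarsened `performance.now()` makes microsecond-scale differences indistinguishable.

```python
def coarse_timer(t_us, granularity_us):
    """Round a timestamp (in microseconds) down to the timer's granularity."""
    return (t_us // granularity_us) * granularity_us

# Hypothetical timings: a cache hit vs. a cache miss, a few microseconds apart
hit_us, miss_us = 100, 103

fine = coarse_timer(miss_us, 1) - coarse_timer(hit_us, 1)         # 1 us timer
blunt = coarse_timer(miss_us, 1000) - coarse_timer(hit_us, 1000)  # 1 ms timer
print(fine, blunt)  # 3 0 -- the signal vanishes at coarse granularity
```

Real browser mitigations also added jitter on top of the coarsening, precisely because repeated sampling could otherwise claw some of the signal back.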
 

moinmoin

Senior member
Jun 1, 2017
798
305
106
It matters if you are a cloud provider, but not for your average Joe desktop.
I'd say the vulnerabilities do make sandboxing on the average Joe desktop pointless. And nowadays everybody seems to be eager to do sandboxing, even on the average Joe's desktop.
 

sxr7171

Diamond Member
Jun 21, 2002
5,066
5
91
You have an 8700k, the second fastest CPU on the planet; I don't see a reason why you need to upgrade anyhow.
Everybody with any sense knows that a 6-core Intel CPU at high clocks is fast enough for 99% of people out there.
No need for 10, 12, or 16 cores; it's a total waste of money/performance for 98% of us.
High frequency/IPC is where it's at. Might need 8 cores in 2021.

You’re right. But why is Shadow of the Tomb Raider CPU render limiting for me? I’ve run it many times and it always has parts where the frame time is limited by the CPU.
 

JDG1980

Golden Member
Jul 18, 2013
1,647
147
136
AFAIK these vulnerabilities all rely on timing, and the solution was that browsers simply reduced the timing granularity available in JS, hence all these attacks are not possible anymore regardless of hardware fixes. That's why the issue is overblown anyway. It matters if you are a cloud provider, but not for your average Joe desktop.
It could still be a problem if you use a VM to isolate sketchy software, which some enthusiasts and researchers do.
 
Aug 25, 2001
43,833
609
126
That gives me an idea... CPUs, engraved on the heatspreader with professional sports teams logos. Sell them for an extra $50. Easy money!
 
