i9 9900K vs AMD Threadripper 2920X


Heclone

Junior Member
Dec 7, 2018

DrMrLordX said:
Zen2 isn't out yet. The OP mentioned two specific chips, so we've had to recommend one or the other. I try to stay on-topic.

If he's willing to wait, I would recommend he NOT invest in X470 but instead get an entirely new motherboard and chip when Zen2 launches. X570 should offer a much better user experience for Zen2-based chips. The one weakness of AMD's AM4 strategy appears to be that new AGESA versions are hard(er) to roll out for old AM4 boards. The OEMs have less incentive to do it, so they drag their feet. I figure that a proper UEFI for Zen2 will be out by May/June, but probably not on day one. There will be some half-arsed compatibility updates. Also rolling out new AGESA versions for X370/X470 boards may make older chips work less . . . well on those boards. My 1800x has restricted memory speed thanks to newer AGESA versions for my X370 Taichi. My best UEFI rev was 3.30 but that didn't have all the Spectre fixes sooooo meh.

Zen2 may shake up the entire desktop PC market, making both the 9900k and some TR parts irrelevant.

If someone is looking for help to make a decision, knowing other possibilities can be interesting, I think (plus he said that all perspectives are welcome). If I had stuck with my original rig plan from a few years ago, I would have ended up with some crappy, unbalanced rig to this day. (Not saying the base options here are nonsense or crap; it's just to make my point.)

The suggestion was about building a new rig right now while keeping the option to upgrade quickly. I agree with you that if he simply wants to wait, picking a brand new X570 MB would make more sense; there might be some X570-only features.

Anyway, it's up to ChrispyjBMW!
 

epsilon84

Golden Member
Aug 29, 2010
I beg to differ with you, I don't think it will be noticeable outside of benchmarks. Got a link to share?

What is the point of 144Hz gaming monitors then? At what point do we declare a difference 'noticeable'? Of course that is subjective. Maybe the difference between 120 and 150fps isn't noticeable, but what about 70fps vs 100fps min fps, is that noticeable? Just using BF:V as an example:
https://www.techspot.com/review/1754-battlefield-5-cpu-multiplayer-bench/

9900K 1080P: 151 avg / 100 min
9900K 1440P: 142 avg / 98 min
2920X 1080P: 118 avg / 74 min

No 1440P results for the 2920X (the article is focused on 2700X vs 9900K), but based on the 2700X 1080P vs 1440P results there is roughly a 5% drop in min fps, so we're talking about ~70fps for the 2920X.

98fps vs 70fps? That's a 40% difference, and I'd argue that is probably noticeable for a seasoned gamer, especially in a twitch fps game like Battlefield. If I told the OP he can pay $530 for the 9900K and get 100fps mins, or $650 for 70fps mins, what is the smart choice here?

I should mention that these are results with a 2080 Ti, so a lower end GPU will indeed be more GPU limited, especially at 1440P ultra settings. But a simple adjustment to 'high' instead of 'ultra' and we are back to this type of difference between the different CPUs.
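
For anyone who wants to check the arithmetic, here's a rough sketch in Python (the measured numbers are from the TechSpot bench above; the 5% drop applied to the 2920X is my assumption, carried over from the 2700X results):

```python
# Back-of-envelope check of the min fps estimate above.
min_2920x_1080p = 74                             # measured, TechSpot BF:V bench
est_2920x_1440p = min_2920x_1080p * (1 - 0.05)   # assume the same ~5% drop as the 2700X

min_9900k_1440p = 98                             # measured
advantage_pct = (min_9900k_1440p / est_2920x_1440p - 1) * 100

print(f"Estimated 2920X 1440P min: {est_2920x_1440p:.0f} fps")  # ~70 fps
print(f"9900K advantage at 1440P:  {advantage_pct:.0f}%")       # ~39%
```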
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
epsilon84 said:
What is the point of 144Hz gaming monitors then? At what point do we declare a difference 'noticeable'? Of course that is subjective. Maybe the difference between 120 and 150fps isn't noticeable, but what about 70fps vs 100fps min fps, is that noticeable? Just using BF:V as an example:
https://www.techspot.com/review/1754-battlefield-5-cpu-multiplayer-bench/

9900K 1080P: 151 avg / 100 min
9900K 1440P: 142 avg / 98 min
2920X 1080P: 118 avg / 74 min

No 1440P results for the 2920X (the article is focused on 2700X vs 9900K), but based on the 2700X 1080P vs 1440P results there is roughly a 5% drop in min fps, so we're talking about ~70fps for the 2920X.

98fps vs 70fps? That's a 40% difference, and I'd argue that is probably noticeable for a seasoned gamer, especially in a twitch fps game like Battlefield. If I told the OP he can pay $530 for the 9900K and get 100fps mins, or $650 for 70fps mins, what is the smart choice here?
Here is the sentence that best describes my feeling on this subject (from your own link): "As we become GPU limited, at 1440p we see very little difference between the 2700X and 9900K using the GTX 1070, RTX 2070 and even the RTX 2080."
 

epsilon84

Golden Member
Aug 29, 2010
Markfw said:
Here is the sentence that best describes my feeling on this subject (from your own link): "As we become GPU limited, at 1440p we see very little difference between the 2700X and 9900K using the GTX 1070, RTX 2070 and even the RTX 2080."

That only applies if you strictly run ultra details at 1440P. An RTX 2080 owner can simply run at 'high' details instead of 'ultra' and we are back to the same level of CPU bottleneck as a 2080 Ti at 'ultra'.

Also, a 2700X isn't a 2920X. The 2920X performs 10-15% worse than the 2700X in BFV:

2700X: 130 avg / 85 min
2920X: 118 avg / 74 min

With the 2920X, you are limited to 70-75fps minimums REGARDLESS of the resolution and graphics settings. With the 9900K, whether you game at 1080P or 1440P, the ceiling is ~100fps. THAT sums up my feelings on this subject.
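
To illustrate the ceiling point, here's a toy model (my own simplification, not from the article): treat each frame as costing max(CPU time, GPU time), with the per-frame CPU costs back-solved from the BF:V minimums. Lowering settings or buying a faster GPU only shrinks the GPU term:

```python
# Toy model: the slower pipeline stage caps the frame rate.
def fps(cpu_ms, gpu_ms):
    return 1000 / max(cpu_ms, gpu_ms)

cpu_ms_9900k = 1000 / 100   # ~10.0 ms/frame, implied by the ~100 fps min
cpu_ms_2920x = 1000 / 74    # ~13.5 ms/frame, implied by the ~74 fps min

for gpu_ms in (14.0, 10.0, 7.0):   # faster GPU / lower settings shrink this only
    print(f"gpu {gpu_ms} ms -> 9900K {fps(cpu_ms_9900k, gpu_ms):.0f} fps, "
          f"2920X {fps(cpu_ms_2920x, gpu_ms):.0f} fps")
# At 14 ms both are GPU-bound (~71 fps); once the GPU drops below ~13.5 ms
# the 2920X stops scaling while the 9900K keeps going up to ~100 fps.
```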
 

DrMrLordX

Lifer
Apr 27, 2000
Heclone said:
If someone is looking for help to make a decision, knowing other possibilities can be interesting, I think (plus he said that all perspectives are welcome).

I am pretty biased in favor of AMD, so if I offered other possibilities, I would probably spend every thread trying to convince people to buy AMD hardware. We've had people do that in the past, which has led to some ugly flamewars and repetitive off-topic posting. It got to be a bit of a bad meme for a while, with some AMD fanatics trying to convince people that various Piledriver CPUs were actually not that bad. Page after page of hypothetical use cases, etc. I don't miss that.

So if the OP wants to know 9900k vs 2920x, I'll give my opinion on those two and be done with it, even though we are only about 5-6 months away from a major upheaval in the market.

epsilon84 said:
98fps vs 70fps? That's a 40% difference, and I'd argue that is probably noticeable for a seasoned gamer, especially in a twitch fps game like Battlefield. If I told the OP he can pay $530 for the 9900K and get 100fps mins, or $650 for 70fps mins, what is the smart choice here?

I will notice that difference the most when I have my fps counter up. That's why I keep my fps counter up. If I buy a 144Hz monitor, I want to get as close to 144 fps as I can, in everything. Haven't gotten there yet. My eyes might not tell the difference that much, but the counter reminds me of what's really going on. That's just me.

For a point of reference, I play DQXI a fair bit (on my second playthrough). My 1800x @ 4.0 GHz + VegaFE (overclocked, somewhat) gets me a minfps of about 70. It is not a twitch fps title, but I'm still watching the fps counter anyway. Someday I'll play an fps that isn't TF2 or whatever (lulz), so fps may matter more then. But I'm still watching those framerates on everything I play. If I had a 9900k under the hood, those minfps would probably be higher. If AMD would sell me a faster video card, then the Zen2 I'm going to get next year plus that card would definitely put my minfps higher. I can't logically say, "well, it's okay right now, my 1800x is enough CPU for this game" when in truth I know it isn't. I set a goal, and I'm not yet there. Even if it doesn't make a difference today in that one game I am playing, it will eventually.
 

epsilon84

Golden Member
Aug 29, 2010
DrMrLordX said:
I am pretty biased in favor of AMD, so if I offered other possibilities, I would probably spend every thread trying to convince people to buy AMD hardware. We've had people do that in the past, which has led to some ugly flamewars and repetitive off-topic posting. It got to be a bit of a bad meme for a while, with some AMD fanatics trying to convince people that various Piledriver CPUs were actually not that bad. Page after page of hypothetical use cases, etc. I don't miss that.

So if the OP wants to know 9900k vs 2920x, I'll give my opinion on those two and be done with it, even though we are only about 5-6 months away from a major upheaval in the market.

I will notice that difference the most when I have my fps counter up. That's why I keep my fps counter up. If I buy a 144Hz monitor, I want to get as close to 144 fps as I can, in everything. Haven't gotten there yet. My eyes might not tell the difference that much, but the counter reminds me of what's really going on. That's just me.

For a point of reference, I play DQXI a fair bit (on my second playthrough). My 1800x @ 4.0 GHz + VegaFE (overclocked, somewhat) gets me a minfps of about 70. It is not a twitch fps title, but I'm still watching the fps counter anyway. Someday I'll play an fps that isn't TF2 or whatever (lulz), so fps may matter more then. But I'm still watching those framerates on everything I play. If I had a 9900k under the hood, those minfps would probably be higher. If AMD would sell me a faster video card, then the Zen2 I'm going to get next year plus that card would definitely put my minfps higher. I can't logically say, "well, it's okay right now, my 1800x is enough CPU for this game" when in truth I know it isn't. I set a goal, and I'm not yet there. Even if it doesn't make a difference today in that one game I am playing, it will eventually.

Thanks for your perspective based on your own gaming experience. That is valuable insight, because as I said, whether the difference between something like 70fps and 100fps mins is noticeable is entirely subjective, but experienced gamers are more likely to notice, especially on a 144Hz monitor.

I went from a 3770K @ 4.7GHz to an 8700K @ 5.0GHz and I'll readily admit that I can't notice a difference in a lot of the games that I play, but then again I am still using a Fury X (I skipped Vega and am waiting on a decent AMD GPU for my Freesync monitor, bring on Navi already!). It's probably no surprise that the game which shows the biggest difference between the two is BF1. It's not that the 3770K ran the game poorly, but the 8700K brings a level of smoothness to the game that wasn't possible on the 3770K. I'd say that level of difference would probably be similar to what we see between the 2920X and 9900K, at least with CPU intensive titles like BFV.
 

mattiasnyc

Senior member
Mar 30, 2017
DrMrLordX said:
My eyes might not tell the difference that much, but the counter reminds me of what's really going on.

If your eyes can't tell the difference, then what difference can it possibly make?

It's exactly the same thing as having a meter show you frequencies (sound) so high that you can't hear them.... but hey, you know they're there....
 

ChrispyjBMW

Member
Dec 6, 2018
I would like to thank each and every one of you for your perspectives and suggestions. After heavy consideration, number crunching and praying, I've decided to go i9 9900K, mostly because of gaming, and gaming at high res. Also, I thought I would share the following link. I just ordered the i9 for $508.24, yes, below $510.00, from Amazon. The chip is $534 but you check a box to take off an instant 5%, which drops it down to $508 before tax, with free shipping. The link is below, as well as a link to the MoBo of choice to pair with the CPU. Once again, thank you all for your help and suggestions!


https://www.pcgamer.com/get-an-inte...rcSp0uCfuxCanHmM6iDh9LT6dQndvlfqgvgRMFv0lPeqI


https://www.newegg.com/Product/Product.aspx?Item=N82E16813145089
 

ozzy702

Golden Member
Nov 1, 2011
Wrong! You'll see a difference in benchmarks, but running both systems side by side you won't notice a difference between the two. A friend of mine and I did put both our systems side by side a few years back; he was running an Intel i5 3570/Nvidia 980 Ti system, I was running an FX 8350 with an R9 290, and we could not see a difference in gameplay.

I beg to differ with you, I don't think it will be noticeable outside of benchmarks. Got a link to share?

How many of you guys play the Battlefield series, let alone BFV? I do, and it's a humongous CPU hog. I've used a 1700X, 7700k, 8700k and 9900k, and yes, you can feel the difference in smoothness and frame rates, especially lows, between the CPUs. Don't feed me the single player benchmarks (which still show a sizable deficit for AMD). Most of us aren't running Ultra settings either, so we can pull frame rates as high as possible without sacrificing too much fidelity, which makes the gap between Intel and AMD even larger.

The fact of the matter is that AMD is behind quite a bit in BFV for high Hz gaming.

The response from the AMD crowd of course is "you can't tell the difference between 60hz and 120hz", which is laughable.

https://www.youtube.com/watch?v=AsB8MPVnBoY

9900k pulls 100/151 whereas the 2920X only pulls 74/118 and heavily bottlenecks the GPU. Down the road we'll have much faster GPUs on 7nm so that gap will only widen.

If you only run 60Hz you're golden with the 2600X or better, but for high Hz it's Intel all the way at this point. Hopefully that changes with Zen 2; I'd love to go full blown MOAR CORES without leaving gaming performance on the table.

epsilon84 said:
I went from a 3770K @ 4.7GHz to an 8700K @ 5.0GHz and I'll readily admit that I can't notice a difference in a lot of the games that I play, but then again I am still using a Fury X (I skipped Vega and am waiting on a decent AMD GPU for my Freesync monitor, bring on Navi already!). It's probably no surprise that the game which shows the biggest difference between the two is BF1. It's not that the 3770K ran the game poorly, but the 8700K brings a level of smoothness to the game that wasn't possible on the 3770K. I'd say that level of difference would probably be similar to what we see between the 2920X and 9900K, at least with CPU intensive titles like BFV.

I noticed a big difference in overall smoothness in BF1 moving from a 7700k (faster than any AMD CPU in BF games) to an 8700k.
 

PotatoWithEarsOnSide

Senior member
Feb 23, 2017
Shouldn't we just enjoy playing games instead of looking at counters?
I respect the choice of the OP in going for the 9900k; the best current performer and the cheaper of the two listed options. Whilst Zen 2 probably does change the whole dynamic, he wanted to buy now, and we can't really argue that the 9900k won't serve him well for some considerable time; it clearly will, if the last 5 years of CPU development are anything to go by. He certainly won't be disappointed by the 9900k, I think we'd all agree.
For anyone not looking to buy immediately, waiting for Zen 2 seems to be the better option. Then again, waiting has always been the best option.
The next few years will be AMD's to own, but that only starts when Zen 2 is widely available.
 

DrMrLordX

Lifer
Apr 27, 2000
mattiasnyc said:
If your eyes can't tell the difference, then what difference can it possibly make?

It's exactly the same thing as having a meter show you frequencies (sound) so high that you can't hear them.... but hey, you know they're there....

I will notice in fps titles. I did notice back when I played Dirty Bomb (f2p fps) on my old A10-7870k and then moved to my current system. Hell even with my old 390 instead of the Vega, going from 30-50 fps to well over 100 fps was a big step in the right direction! And TF2 which I can run at 300 fps in 1440p is damn near perfect. In a game like DQXI, it's not so noticeable. The point is, if I'm going to get 144 fps in one title, I want it in pretty much all of them. I set a goal, and I'm going to reach it (or not, thanks to AMD not selling me anything better than this VegaFE, bleh).

I'm not going to selectively say, okay, I'm going to go for 144 fps in shooters but relax my requirements in some other game. 144 fps in everything. Or as close as I can get. If I didn't want that, I wouldn't have purchased a 144Hz monitor in the first place. It was that or 4k @ 60 Hz. I chose refresh over res.

PotatoWithEarsOnSide said:
Shouldn't we just enjoy playing games instead of looking at counters?

No. If my volume knob goes to 11, I'm putting mine at 11. That counter helps me make sure I'm staying in there.
 

epsilon84

Golden Member
Aug 29, 2010
mattiasnyc said:
If your eyes can't tell the difference, then what difference can it possibly make?

It's exactly the same thing as having a meter show you frequencies (sound) so high that you can't hear them.... but hey, you know they're there....

Higher fps also = lower input lag. More important for FPS type games where split second movements are key.

Ultimately @DrMrLordX is correct that if you have a 144Hz monitor then you try to hit that target, or get as close to it as possible. Otherwise you can pay half the price for a 60Hz monitor with comparable image quality, not to mention use a much cheaper GPU, since 60fps is achievable on all mid range GPUs.

144Hz gaming requires both a fast CPU and GPU in order to be properly utilised. In fact, in some ways the CPU is more important, because the majority of in-game graphics settings reduce the GPU load. You can get away with a slower GPU to achieve higher fps (up to a point) by reducing graphical settings, but you can't really do much about a CPU bottleneck except perhaps reducing draw distance, which puts you at a disadvantage.
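
Since "lower input lag" is easy to put numbers on, here's the frame-time arithmetic (one frame at a given fps occupies 1000/fps milliseconds; it's only one slice of the total input-to-photon chain, but it's the slice the CPU and GPU control):

```python
# Per-frame time at a given frame rate.
for rate in (60, 74, 100, 144):
    print(f"{rate:>3} fps -> {1000 / rate:5.1f} ms per frame")
# 60 fps -> 16.7 ms, 100 fps -> 10.0 ms, 144 fps -> 6.9 ms
```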
 

ozzy702

Golden Member
Nov 1, 2011
epsilon84 said:
Higher fps also = lower input lag. More important for FPS type games where split second movements are key.

Ultimately @DrMrLordX is correct that if you have a 144Hz monitor then you try to hit that target, or get as close to it as possible. Otherwise you can pay half the price for a 60Hz monitor with comparable image quality, not to mention use a much cheaper GPU, since 60fps is achievable on all mid range GPUs.

144Hz gaming requires both a fast CPU and GPU in order to be properly utilised. In fact, in some ways the CPU is more important, because the majority of in-game graphics settings reduce the GPU load. You can get away with a slower GPU to achieve higher fps (up to a point) by reducing graphical settings, but you can't really do much about a CPU bottleneck except perhaps reducing draw distance, which puts you at a disadvantage.

Agreed 100%. The higher the frame rate the better, and in a first person shooter I'll adjust graphics settings to keep frame rates high. On top of all that, 0.1% lows are a huge factor in first person shooters, and in BFV Ryzen systems have more spikes and bottleneck high end GPUs.

Is it a deal breaker for many people? Nope, but I want to minimize those situations as much as possible and personally will pay good $$$ to run hardware that lets me do just that. Smoothness in a Battlefield title, especially BFV, requires way more CPU than people want to admit.
 

mattiasnyc

Senior member
Mar 30, 2017
I understand what you guys are saying, but just because a person can notice a difference going from 40 to 100 fps doesn't mean a person can notice a difference from 100 to 140, or even 60 to 100.

I work with audio and I can guarantee you that placebo is a thing. People "hear what they want to hear", and see what they want to see. The only way we in pro audio can actually make statements about equipment and how it's perceived is by having people subjected to rigorous blind tests. A person sitting at a computer and toggling frame rates - i.e. knowing what they are - will almost always prefer what is considered to be better before testing it. So if a person wants 144fps and thinks higher is better, then testing 80 vs 100 will pretty much guarantee that person will think they notice a difference.

So with that said: are there any blind tests out there where people have tested just how well the average gamer can actually notice a difference in frame rate without knowing what rate they're gaming at?

(for example: gamers move from one computer to the next, all set to different frame rates or sometimes the same, and the gamers then get to explain what they thought about them and guess which was which)
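
A trial like that would be simple to script; here's a minimal sketch (the cap values are arbitrary, and set_fps_cap is a hypothetical hook — in practice the tester would change the cap in the driver or game config between rounds):

```python
# Minimal blind trial: randomize the frame-rate cap per round, record guesses,
# and compare the hit rate against pure chance.
import random

caps = [60, 100, 144]                      # hypothetical caps under test
rounds = [random.choice(caps) for _ in range(12)]

hits = 0
for cap in rounds:
    # set_fps_cap(cap)                     # hypothetical driver/config hook
    guess = int(input("Play a round, then guess the cap (60/100/144): "))
    hits += (guess == cap)

print(f"{hits}/{len(rounds)} correct; pure guessing lands near "
      f"{len(rounds) / len(caps):.0f}/{len(rounds)}")
```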

As for "input lag", I don't really understand how that is related to frame creation/delivery.

And lastly: I'm not saying someone should do one thing or another, I'm just throwing this out there because I know how things go when it comes to preferences (preconceived notions) vs price/performance vs stated perception... from video to audio to wine to whiskey to watches to clothes... Sometimes it's very easy to get carried away by a number and then not see the forest for the trees.

DrMrLordX said:
I will notice in fps titles. I did notice back when I played Dirty Bomb (f2p fps) on my old A10-7870k and then moved to my current system. Hell even with my old 390 instead of the Vega, going from 30-50 fps to well over 100 fps was a big step in the right direction! And TF2 which I can run at 300 fps in 1440p is damn near perfect. In a game like DQXI, it's not so noticeable. The point is, if I'm going to get 144 fps in one title, I want it in pretty much all of them. I set a goal, and I'm going to reach it (or not, thanks to AMD not selling me anything better than this VegaFE, bleh).

I'm not going to selectively say, okay, I'm going to go for 144 fps in shooters but relax my requirements in some other game. 144 fps in everything. Or as close as I can get. If I didn't want that, I wouldn't have purchased a 144Hz monitor in the first place. It was that or 4k @ 60 Hz. I chose refresh over res.

epsilon84 said:
Higher fps also = lower input lag. More important for FPS type games where split second movements are key.

Ultimately @DrMrLordX is correct that if you have a 144Hz monitor then you try to hit that target, or get as close to it as possible. Otherwise you can pay half the price for a 60Hz monitor with comparable image quality, not to mention use a much cheaper GPU, since 60fps is achievable on all mid range GPUs.

144Hz gaming requires both a fast CPU and GPU in order to be properly utilised. In fact, in some ways the CPU is more important, because the majority of in-game graphics settings reduce the GPU load. You can get away with a slower GPU to achieve higher fps (up to a point) by reducing graphical settings, but you can't really do much about a CPU bottleneck except perhaps reducing draw distance, which puts you at a disadvantage.
 

epsilon84

Golden Member
Aug 29, 2010
ozzy702 said:
Agreed 100%. The higher the frame rate the better, and in a first person shooter I'll adjust graphics settings to keep frame rates high. On top of all that, 0.1% lows are a huge factor in first person shooters, and in BFV Ryzen systems have more spikes and bottleneck high end GPUs.

Is it a deal breaker for many people? Nope, but I want to minimize those situations as much as possible and personally will pay good $$$ to run hardware that lets me do just that. Smoothness in a Battlefield title, especially BFV, requires way more CPU than people want to admit.

Yeah, pretty much. No disrespect to those that claim you can't notice any difference, but they most likely aren't gamers, or at least not gamers with 144Hz monitors, because that is the first thing you notice when upgrading from 60Hz: the fluidity in movement and aiming.
 

mattiasnyc

Senior member
Mar 30, 2017
epsilon84 said:
Yeah, pretty much. No disrespect to those that claim you can't notice any difference, but they most likely aren't gamers, or at least not gamers with 144Hz monitors, because that is the first thing you notice when upgrading from 60Hz: the fluidity in movement and aiming.

Well, at least I didn't say that it's impossible to notice a difference; I'm merely asking if this has been tested anywhere.

I mean, what do you honestly expect a gamer to say when shelling out a bunch of money on a shiny new 144Hz gaming monitor and the hardware to drive it as close to max spec as possible? "I don't see a difference" or "This is so much better!".

It's almost a given they will say they notice a difference because of that bias. And pretty much everyone hates acknowledging that bias, and even more so acknowledging not being able to discern a difference (should that be the case).
 

ZGR

Platinum Member
Oct 26, 2012
Moving to 144hz+ makes 60hz torturous to use in a game that supports high refresh rate.

My favorite game that supports high refresh rate is Rainbow 6 Siege.

Against my i7, the 2700X from AMD can easily do 144Hz. At 240Hz, the i7 9700k pulls ahead.

I would argue 144hz is pretty easy to run for most CPUs. The 2920x and 9900k are too expensive imo.

OP hasn't replaced their PC since 2011(?), so I don't think the 2920x or the 9900k will be a wrong choice; just overkill, perhaps.

High refresh rate monitors are going down in price. $1 a Hz is now a thing!
 

Headfoot

Diamond Member
Feb 28, 2008
I've been running BFV on an OC 1080 Ti and a 5820k at 4.4GHz with DDR4 @ 2800. I run a 144Hz Gsync monitor. The Gsync absolutely makes a huge difference here. I can vaguely tell I am not running capped at 144Hz, but the Gsync covers it so well that it always feels fluid to me even when lots is going on.

So my response to this conversation would be to get an Adaptive Sync monitor; it will make a much bigger difference than the last 10-20% of CPU performance will, by a country mile.
 

Topweasel

Diamond Member
Oct 19, 2000
ChrispyjBMW said:
I would like to thank each and every one of you for your perspectives and suggestions. After heavy consideration, number crunching and praying, I've decided to go i9 9900K, mostly because of gaming, and gaming at high res. Also, I thought I would share the following link. I just ordered the i9 for $508.24, yes, below $510.00, from Amazon. The chip is $534 but you check a box to take off an instant 5%, which drops it down to $508 before tax, with free shipping. The link is below, as well as a link to the MoBo of choice to pair with the CPU. Once again, thank you all for your help and suggestions!

https://www.pcgamer.com/get-an-inte...rcSp0uCfuxCanHmM6iDh9LT6dQndvlfqgvgRMFv0lPeqI

https://www.newegg.com/Product/Product.aspx?Item=N82E16813145089
Good choice. Honestly, unless you get into a bunch of semi pro stuff, it's good enough to last you till the system dies. I do want to comment that we were under the impression you were talking high refresh rate gaming and not high resolution gaming. At that level it's more about GPU choice than CPU. That doesn't mean run an i3, but it does mean the focus is taken off the CPU enough that you have a lot more flexibility in your choices. Either way, neither CPU was ever going to be a bad choice; it's part of the reason I think things can tense up a bit. Enjoy your new CPU. Make sure you get a really good cooler for it (something rated for 180W TDP or better) and you will love it.
 

ChrispyjBMW

Member
Dec 6, 2018
PotatoWithEarsOnSide said:
Shouldn't we just enjoy playing games instead of looking at counters?
I respect the choice of the OP in going for the 9900k; the best current performer and the cheaper of the two listed options. Whilst Zen 2 probably does change the whole dynamic, he wanted to buy now, and we can't really argue that the 9900k won't serve him well for some considerable time; it clearly will, if the last 5 years of CPU development are anything to go by. He certainly won't be disappointed by the 9900k, I think we'd all agree.
For anyone not looking to buy immediately, waiting for Zen 2 seems to be the better option. Then again, waiting has always been the best option.
The next few years will be AMD's to own, but that only starts when Zen 2 is widely available.


Thank you. I wish I could wait for Zen2 but I just cannot any longer. My last gaming rig was an X58 build with an i7 920; its 9th birthday was last week, but I sold it to someone over 3 years ago and haven't had a dedicated gaming rig since. I love the fact that Zen2 is, or will possibly be, a 7nm processor, but yes, I needed something now, so I bit the bullet on a processor, and getting the i9 9900K for $508 before tax was too good to pass up. That chip was cheaper than the 2920X.
 

ChrispyjBMW

Member
Dec 6, 2018
Topweasel said:
Good choice. Honestly, unless you get into a bunch of semi pro stuff, it's good enough to last you till the system dies. I do want to comment that we were under the impression you were talking high refresh rate gaming and not high resolution gaming. At that level it's more about GPU choice than CPU. That doesn't mean run an i3, but it does mean the focus is taken off the CPU enough that you have a lot more flexibility in your choices. Either way, neither CPU was ever going to be a bad choice; it's part of the reason I think things can tense up a bit. Enjoy your new CPU. Make sure you get a really good cooler for it (something rated for 180W TDP or better) and you will love it.


Thx man! Well, my apologies for not being clear. I want to game in high res since I'm currently doing it now with a Super UHD 55 4K TV and an Xbox One X. I also want to do over 100Hz, preferably 144Hz. The link below is the current cooler, still in the box ready to be opened.

https://www.thermaltake.com/products-model.aspx?id=c_00002775
 

Topweasel

Diamond Member
Oct 19, 2000
ChrispyjBMW said:
Thx man! Well, my apologies for not being clear. I want to game in high res since I'm currently doing it now with a Super UHD 55 4K TV and an Xbox One X. I also want to do over 100Hz, preferably 144Hz. The link below is the current cooler, still in the box ready to be opened.

https://www.thermaltake.com/products-model.aspx?id=c_00002775

It's kind of a choice of one or the other. I am sure you can pump up the refresh rate by killing some graphics options here and there, but at 4K, for example, it's basically impossible in most games to push even 100Hz. It might be more reasonable at 1440p, but outside of games going on 10 years old (CS:GO) or similar titles made for a competitive setting, it's going to be hard to get that high with even the best video cards. Even then, each newly released game is going to offer new challenges to maintaining that performance. Just keep that in mind. Playing in anything but the 1080p market while trying to keep the refresh rate high will result in you having to upgrade your video card as faster ones become available.
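
A crude way to see why, treating raw pixels per second as a stand-in for GPU load (my own simplification; it ignores settings, engine and CPU limits, so take it as a rough sketch only):

```python
# Pixel throughput for each resolution/refresh target, relative to 1080p @ 60Hz.
targets = {
    "1080p @ 144Hz": (1920, 1080, 144),
    "1440p @ 144Hz": (2560, 1440, 144),
    "4K    @ 100Hz": (3840, 2160, 100),
}
base = 1920 * 1080 * 60
for name, (w, h, hz) in targets.items():
    print(f"{name} -> {w * h * hz / base:.1f}x the pixel rate of 1080p60")
# 4K @ 100Hz needs ~6.7x the pixel rate of 1080p60, roughly 2.8x 1080p144.
```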

That cooler is great though. Just in case you didn't know, the reason you need a really good cooler is that the CPU is rated for 95W, but that only applies when the CPU is running at base clocks (never) or with a single core running in turbo (almost never, but this is where the 5GHz comes in). At any other load, cores will want to run at varying turbo levels as long as you have the cooling for it; Anandtech tested the 9900k at nearly 160W power usage when all cores were active.
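
The arithmetic behind that cooler advice, using the figures above:

```python
# TDP describes base-clock behavior; sustained all-core turbo draws far more.
tdp_w = 95          # Intel's rated TDP for the 9900K
all_core_w = 160    # roughly what AnandTech measured with all cores loaded
print(f"All-core draw is ~{all_core_w / tdp_w:.1f}x the rated TDP; "
      f"size the cooler for {all_core_w}W+, not {tdp_w}W.")
```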
 

ChrispyjBMW

Member
Dec 6, 2018
Topweasel said:
It's kind of a choice of one or the other. I am sure you can pump up the refresh rate by killing some graphics options here and there, but at 4K, for example, it's basically impossible in most games to push even 100Hz. It might be more reasonable at 1440p, but outside of games going on 10 years old (CS:GO) or similar titles made for a competitive setting, it's going to be hard to get that high with even the best video cards. Even then, each newly released game is going to offer new challenges to maintaining that performance. Just keep that in mind. Playing in anything but the 1080p market while trying to keep the refresh rate high will result in you having to upgrade your video card as faster ones become available.

That cooler is great though. Just in case you didn't know, the reason you need a really good cooler is that the CPU is rated for 95W, but that only applies when the CPU is running at base clocks (never) or with a single core running in turbo (almost never, but this is where the 5GHz comes in). At any other load, cores will want to run at varying turbo levels as long as you have the cooling for it; Anandtech tested the 9900k at nearly 160W power usage when all cores were active.

That's good to know. I do want to achieve 5GHz because that is what turned me toward this chip, so we shall see.