i9 9900X vs AMD Threadripper 2920X

Page 3

PotatoWithEarsOnSide

Senior member
Feb 23, 2017
664
701
106
I'd be hoping that future GPU upgrades were centred around improved power efficiency. We're pretty much CPU limited with the 2080 Ti, and there's little expectation of huge leaps forward in CPU performance any time soon.
 

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
I'd be hoping that future GPU upgrades were centred around improved power efficiency. We're pretty much CPU limited with the 2080 Ti, and there's little expectation of huge leaps forward in CPU performance any time soon.

When has a new gen of GPUs ever centred on power efficiency as its main selling point? You need performance to sell, to convince people to upgrade. Sure, better power efficiency helps, but that efficiency is often eaten up at the first opportunity by AMD and Nvidia to up the clocks and eke out more performance. The end result will inevitably be 150 - 250W GPUs, as has basically been the norm for the past decade.

I don't think we are yet at the point where the CPU is the bottleneck across the board. There seems to be enough headroom with the higher end CFL chips for at least another gen of GPU upgrades. Even at 1080p, an 8600K is generally enough to drive a 2080 Ti, though the 8700K/9700K/9900K is more ideal for 'future proofing' since games will inevitably become more multi-threaded.

I also expect Zen 2 to be competitive with current CFL chips in gaming, so that bodes well for future GPU upgrades too.

I do agree with you that CPU gaming performance is unlikely to far exceed 9900K levels in the short to medium term, so CFL/Zen 2 level will be it at least until 2020 it seems.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
27,388
16,231
136
So... what's Rosetta, the graphics card configuration? Or rather, if the 9900K (i.e. tipping a grand for CPUs) were in the running at all, surely dual EPYC options are on the table? Unless they are on allocation (backordered) for the foreseeable quantum-computing-constrained future?
Rosetta (in my signature) is a distributed computing (DC) application researching cancer.
 

Topweasel

Diamond Member
Oct 19, 2000
5,437
1,659
136
4K 144Hz refresh will be more available soon and it will be the reason I upgrade from my 2011 platform. I do a fair amount of encoding but I don't mind waiting a little longer for it to finish; nothing I am doing is mission critical or work related. If it finished an hour later I would still rather have faster gaming. I wonder what you do that is so important.
Someone else mentioned letting it run overnight. "I don't know" really is the answer. For example, I decided to re-encode my entire video collection shortly after I got my 1700 last year; lots of stuff was using up a lot more room than it needed to. That was multi-month work. If I had got a 7700K instead it would have taken twice as much time and used a bunch more power to do it. That doesn't map exactly onto this comparison, where it would be more like 33% faster at similar power, but the point still stands: even if it's hands off, depending on what the work is you can get days of work done quicker.
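To put rough numbers on why throughput matters even for hands-off batch work, here's a back-of-the-envelope sketch (the frame count and encode speeds are purely illustrative, not benchmarks):

```python
# Wall-clock time for a long batch re-encode, assuming throughput roughly
# scales with core count and per-core speed. All figures are made up for
# illustration; none are measured results.
def batch_hours(total_frames, encode_fps):
    """Hours to chew through a queue at a given average encode throughput."""
    return total_frames / encode_fps / 3600

frames = 50_000_000    # a large video collection, a multi-month queue
fps_fast = 60.0        # hypothetical higher-core-count chip
fps_slow = 45.0        # hypothetical chip ~33% slower overall

print(f"fast chip: {batch_hours(frames, fps_fast):.0f} h")
print(f"slow chip: {batch_hours(frames, fps_slow):.0f} h")
```

Even a modest throughput gap stretches into multiple days of extra wall-clock time once the queue is measured in months.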

1600p 144Hz is great also.

You keep stressing video work and upgrading to a 64C CPU next year. 95% of us just upgrade our graphics and keep the CPU/mainboard for a long time. Hard for me to recommend either of those chips; I'm hoping AMD matches Intel's performance in games soon.
If you always run every game at ultra quality at 1440P or above then sure, you'll be mostly GPU bound. The 2080 Ti is somewhat of an exception in that it is powerful enough that even the top end Ryzen chips (and TR, by extension) are holding back the GPU. This doesn't bode well for future GPU upgrades if current gen is already starting to get bottlenecked.
At 1440p there is a really small difference between the CPUs; at 1600p and 4K there is no difference, even on a 2080 Ti.

https://www.youtube.com/watch?v=q7t0kA5VJ7o

It's bothersome because, again, it's one of those things that perpetuates itself across the internet. He looks at the numbers there: ~10% in all but one game he tested at 1080p; only one game at 1440p shows a measurable difference, and even then he notes it is acting weird and should be considered an outlier. Yet the chip is still being written off solely for that 10% it loses at 1080p. It is what it has always been: not a great CPU for people who do 1080p gaming, and a wash at anything else.
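That pattern (a visible gap at 1080p that evaporates at higher resolutions) falls out of a simple bottleneck model where delivered fps is the minimum of what the CPU and the GPU can each sustain. The caps below are made-up numbers purely for illustration, not measurements of any of these chips:

```python
# Toy bottleneck model: delivered fps is capped by whichever of the CPU
# or GPU is slower for a given game. CPU throughput is roughly resolution
# independent, while GPU throughput drops as resolution rises.
def delivered_fps(cpu_cap, gpu_cap):
    return min(cpu_cap, gpu_cap)

gpu_caps = {"1080p": 200, "1440p": 130, "4K": 65}   # hypothetical GPU
cpu_fast, cpu_slow = 170, 140                       # hypothetical CPUs

for res, gpu in gpu_caps.items():
    fast = delivered_fps(cpu_fast, gpu)
    slow = delivered_fps(cpu_slow, gpu)
    print(f"{res}: {fast} vs {slow} fps ({(fast - slow) / slow:+.0%})")
```

With these toy numbers the "fast" CPU leads by ~21% at 1080p and by nothing at 1440p and 4K, which is the shape of the review results being argued about here.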

WRT 144Hz and visual fidelity, it is quite common for gamers to adjust in-game settings to get the ideal balance between frame rate and IQ. I do this all the time because I own a mid-range GPU and ultra details will generally mean sub-100fps and min fps below 60, so I run a mix of medium/high depending on the game to achieve smoother gameplay. It's a tradeoff I'm willing to accept, as often the IQ difference between ultra and high isn't that great, but the difference in performance is definitely noticeable.
I know you are right to some degree, but stepping down this setting for 3 FPS here and that one for 2 FPS there adds up, even though each one seems like a bad tradeoff on its own. Sure, the overall effect fidelity-wise doesn't seem like much of a sacrifice in the end. This is probably just my gaming preference poking in, but to me it doesn't seem worth spending nearly a grand on a video card and then killing fidelity to push the workload onto the CPU.

Even if you do own a high end GPU, it doesn't mean you are then obligated to run every game at max details. Hypothetically, if 'ultra' only nets you 100fps avg and 'high' allows 150fps, that might be a worthwhile tradeoff for 144Hz gamers, especially in competitive gaming.
Same as above.

I don't think we are yet at the point where the CPU is the bottleneck across the board. There seems to be enough headroom with the higher end CFL chips for at least another gen of GPU upgrades. Even at 1080p, an 8600K is generally enough to drive a 2080 Ti, though the 8700K/9700K/9900K is more ideal for 'future proofing' since games will inevitably become more multi-threaded.
This is another thing I can never wrap my head around. Again, there are exceptions to the rule, and when you are talking about the internet you can find 10k of those exceptions. But no one buys a video card in 2018 for games they are playing, then gets a new card in 2020, just so they can get better performance in the games they were playing in 2018. People upgrade video cards because there is a new game out that their current card will struggle to run at the performance their setup requires. Which means a whole new measurement of where the CPU bottleneck is. As time goes on, it's probably going to swing back to compute power instead of ST clockspeed. I could be wrong on that. But either way it's going to look more like exactly what we have today rather than somehow looking worse.

I also expect Zen 2 to be competitive with current CFL chips in gaming, so that bodes well for future GPU upgrades too.

I do agree with you that CPU gaming performance is unlikely to far exceed 9900K levels in the short to medium term, so CFL/Zen 2 level will be it at least until 2020 it seems.
If this is a measurement of 1080p gaming, I don't see how it ever really goes higher. There are some things AMD can do about latency, cache organization, IF speeds and so on to pump up performance. But really, current 1080p gaming has more to do with clock speed than IPC or anything else; complex CPU work died with the PS3 and Xbox 360. If the PS5/Xbox gen 4 are using Zen cores, this probably changes, but who knows if the last 15 years has killed the art of developing AIs and some of the other CPU-heavy work. Intel will continue to push up single core turbos for a little while, but I have to think the 9900K is about the max of what Intel can do on all-core turbo, as the CPU starts nearing 200W to accomplish it. That house of cards is crashing down, and Intel's inability to get the clock rates and yields they want on 10nm proves it. They can go wider cores, but that's not going to help high refresh gaming; they can join AMD in the core mania, and they are already trying, but again that's not going to help high refresh gaming.
You keep stressing video work and upgrading to a 64c cpu next year. 95% of us just upgrade our graphics and keep the cpu/mainboard for a long time. Hard for me to recommend either of those chips, im hoping amd matches intels performance in games soon,
I stress video work not because it's the most important factor to this or many users, but because I think people disregard it as an afterthought just because he said gaming is his priority. I noted why it could still matter above. I mention the CPU because, whenever he is looking to upgrade, whether it's in 2020 or 2025, he has something to look for. Maybe it's something as simple as: he is getting a new system in 2025, but he wants to use this one for something else - host VMs, run large queues in HandBrake, run a bunch of server tools like Exchange or SQL, or maybe turn it into a dedicated folding machine.

If he gets the 9900K, he will look for a CPU update. If he is really lucky he will find some 10900K or whatever Intel calls the Comet Lake top i9. My personal guess is that the 9900K is the top dog for this socket, the end. Won't be terrible, but that's what it is.

If he gets the 2920X he will have an insane number of options. Just within the current TR lineup it goes up to 32C; next year, 64C. Who knows with Zen 3 and whether it's supported (I think it will be). But even then the options are endless. Just look at the 1950X, as Mark brings up: a 16C CPU that last year was $1000, then $750, and is now just over $450, and that's a new retail CPU. You can't ever really find that happening with Intel CPUs; the price basically stays the same until Intel stops producing it, then several years down the road maybe, just maybe, you can find one new at a ~$100 saving. So let's say you are the OP, looking for a job in retirement for the system, dedicating the machine to folding. Who knows, you might be able to get a 64C CPU new in box for $500. How great would that be? Or next year with Zen 2: he gets the 2920X, and in two years the next big video card comes out and does what no other GPU has ever done, a complete generational leap over new games that makes everything CPU bound at all levels. Well, who knows what Zen 2 clocks at; my guess is top turbos hang around 4.3-4.5GHz, really closing the gap. 'IPC' wise, if AMD makes even a 7% jump, all of a sudden it's nipping at the heels of the 9900K. He waits a year and gets a good discount on a 3920 (I know they are changing the names, but basically the equivalent next-gen CPU) or 3950. Except not only are they clocking higher, they are also 24C and 32C CPUs. Doesn't that seem like an option worth keeping open?

Some of that is wishful thinking. Some of that might not apply to the OP. People haven't been upgrading CPUs in 10 years because the upgrades have been small (8 years of 4 cores does that to a generation) or non-existent because of no change in configuration and at most a single gen of upgradability. The 9900K is stuck in the old way: likely the last and highest supported Z390 CPU. If Intel changes their mind for the first time in over a decade, there might be one, just one, CPU that the OP could get years later to keep the PC puttering around; not even worth putting the effort into upgrading it. The 2920X? Maybe he doesn't ever upgrade it. But isn't it great to look at the future, see all of these new CPUs coming out in the next couple of years, and instead of saying "well, I would need to take out the board and the CPU to do that" (and be like me and say might as well just build a new box), you say "oh wow, that new CPU looks great; maybe if I can get a really good deal on it, I just upgrade my BIOS and I am ready to go"?

I'll give you an example I have given before. My current machine is a 1700; the system before it was a 3930K. I was really happy with the 3930K. It was a great CPU, probably my 3rd favorite purchase next to my 4400+ (favorite) and my 1700. Anyway, long story short, I needed more cores to work on a work project at home. I could probably have made do with the 3930K, but some stuff was going on with it, so I was looking to upgrade my CPU. I was going to go with a 6900K. That upgrade was going to cost me around $1500-1700: new CPU $1000, new board $300-$350, new memory ~$250. I was all prepped and had almost put in the order when HardOCP had a teaser post in their forums basically calling Ryzen a monster. I waited for the benchmarks and ended up building a whole new computer for the cost of that upgrade. That said, if Intel had made the 6900K able to use the same boards as the 3930K, even at its $1000 vs $330 price, I would have gone with the 6900K. How many people are still using their Sandy Bridge or Ivy Bridge CPUs not because they are still fast enough (in a lot of cases they are) but because they're just fast enough to make the idea of swapping out those major components not worth it? CPU upgrading is a lost art. I don't want to discount it based on 10 years of upgradability self-sabotage.
 

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
@Topweasel, wow that is a massive post and I don't have the time to reply to every point that you made. However, you seem to be making up scenarios that the OP *may* do in order to justify a 2920X, such as upgrade paths to 32C TRs and such... that is frankly a ridiculous suggestion for a gaming PC. If you need to make up scenarios where the OP (primarily a gamer) needs a 32C or higher TR then we are getting WAY off topic and beyond the scope of the advice the OP was looking for.

So let's focus on his needs first and foremost.

1) Gaming. This is the priority, high refresh rate gaming to be exact. This is exactly what the 9900K is good at. The 2920X doesn't come close in this regard. Using potential GPU bottlenecks to justify the 2920X as 'almost equal' is absurd, because there are always games that are CPU bound even at 1440p. The less CPU intensive games will run almost as well on the 2920X, that is true, or the differences are meaningless, i.e. where min fps exceeds 100fps on both platforms. However, there are games that are historically CPU bound (I used BF:V as an example earlier) that will show differences of up to 35% between a 9900K and 2920X in terms of min fps. This is where a faster CPU is needed to properly 'feed' a fast GPU.

In terms of 'overall' gaming performance, a 9900K is 20% ahead at 1080P and 12% ahead at 1440P with a 1080 Ti, according to TPU: https://www.techpowerup.com/reviews/AMD/Ryzen_Threadripper_2920X/13.html

With a 2080 Ti, those margins will naturally be larger, especially at 1440p. Had TPU redone their testing with a 2080 Ti, I would expect the 1440p margin to be well in excess of 15%, perhaps even close to 20%. That right there should be reason enough to suggest a 9900K over a 2920X for the OP. The 2920X is not an ideal pairing for a fast GPU. Have a closer look at the TPU results: it actually performs worse than a *Core i3 8300* for gaming. Let that sink in for a second... an entry level 4C/4T CPU outperforming a 12C/24T CPU for gaming. By your logic an i3 is all that is needed for a high end GPU, then? Maybe we should all match a 2080 Ti with Core i3s. Or maybe not...

I think I made my point clear about gaming. But lets talk about his secondary uses as well...

2) Video editing. This generally isn't as highly multi-threaded as video encoding, and a mix of high IPC/clocks and cores will generally win out here. See the 9900K beating the higher core count 7900X / 1920X in Premiere Pro here: https://www.pugetsystems.com/labs/a...2018-Core-i7-9700K-i9-9900K-Performance-1254/

3) Video encoding. This is the only domain where the 2920X shows any potential advantage over the 9900K, and it is still a mixed bag depending on which codec you use: https://www.anandtech.com/show/13516/the-amd-threadripper-2-cpu-review-pt2-2970wx-2920x/8

According to Anandtech, the 2920X has a 5 - 6% advantage in x264 encoding. However, the 9900K has a 10% advantage in HEVC encoding.

So the 9900K has a clear advantage in gaming (most importantly) and video editing, and trades blows with the 2920X in video encoding. Plus, it has a significant price advantage: a 9900K + Z390 motherboard should cost between $700 - $800, depending on the choice of motherboard, while a 2920X + X399 motherboard will set you back $950 - $1050.

So you're suggesting paying an extra $250 to achieve noticeably lower performance in the OP's primary use of gaming, while his secondary uses perform just as well, if not better, on the 9900K as well. All just for more PCI-E lanes that aren't really an issue for a desktop gaming PC, and the option to upgrade to potentially a 32C TR, which is totally pointless for gaming.

We'll probably have to agree to disagree on this, but IMO the advantages of the 9900K far outweigh the negatives against the 2920X for the OP's intended usage.
 
Last edited:

Zucker2k

Golden Member
Feb 15, 2006
1,810
1,159
136
yes, you are correct 9900K, my apologies as I had X on the brain!

Predominately gaming, will do video editing and encoding, regular surfing and Office task but gaming all the way.
This is what the TS said.

I stress video work not because it's the most important factor to this or many users that still worry about it. Because I think people disregard it as an afterthought just because he said gaming is his priority. I notated why it could still matter above. I mention the CPU because whenever he is looking to upgrade. Whether it's in 2020 or 2025. He has something to look for. Maybe it's something as simple a he is getting a new system in 2025. But he wants to use this one for something else, host VM's, do large queues in handbreak, run a bunch of server tools like Exchange, SQL. Or maybe turn it into a dedicated folding machine.
You've latched on to video encoding as if the 9900k is a slouch in that department. The rest of your post is about scenarios where in your imagination, the TR 1920x may have an edge, but I won't be surprised if the 9900k holds its own in those NON-DESKTOP ENVIRONMENTS too!

Let me just leave these here, from the Anandtech Review: https://www.anandtech.com/show/13400/intel-9th-gen-core-i9-9900k-i7-9700k-i5-9600k-review/8

[Handbrake encoding charts from the AnandTech review]
 

Topweasel

Diamond Member
Oct 19, 2000
5,437
1,659
136
@Topweasel, wow that is a massive post and I don't have the time to reply to every point that you made. However, you seem to be making up scenarios that the OP *may* do in order to justify a 2920X, such as upgrade paths to 32C TRs and such... that is frankly a ridiculous suggestion for a gaming PC. If you need to make up scenarios where the OP (primarily a gamer) needs a 32C or higher TR then we are getting WAY off topic and beyond the scope of the advice the OP was looking for.

I am giving examples, not trying to shoehorn the OP into something; just pointing out things to keep in mind. I understand I am being a contrarian here, but I really hate echo chambers. I want the OP, or anyone who asks these types of questions, to know both sides so they can make the choice for themselves. They can't do that if 90% of the people out there immediately answer 9900K because someone said gaming was a priority. Even if it is high refresh gaming, because again I think people don't realize the sacrifices they have to make to pull that off. I admit my bias here, but personally I think they are closer than people give the 2920X, or Ryzen in general, credit for. There are tons of ways reviews shape themselves around pre-established biases to present the information the reviewer, blogger, or YouTuber wants to show. That's why I like to look at alternatives.

Take high refresh gaming. Trying to maintain a 144Hz display, you pretty much can't run at higher than 1080p. Personally, I think that as CPU usage increases it will become harder and harder to maintain even at that resolution, because we aren't going to see faster CPUs; we are probably nearing the end of the primary thread that accounts for most of the super high frame rates. But even if right now 144Hz on a 2080 Ti really was a thing at 1440p, that dies with the next game, or the game after that, needing a new 3080 Ti or Titan XXX or something to keep up. Trade off to something like 90-100 FPS and most other CPUs can accomplish that, and his GPU lasts him more than half a game generation. The OP doesn't know that if we just drone on about the 9900K being the only CPU for high refresh gaming.

I was thinking about replying to most of the examples, maybe giving some others that point in the other direction, but then I am probably feeding off my own bias there. In the end the 9900K and the 2920X are both great CPUs. The 9900K is no slouch in multitasking; in the end it's the safest pick for today, even if I think it's a little overpriced. I will always have an issue with forced obsolescence: Intel forcing the hand on one end and high refresh gaming on the other. I can't make sense of making a cheap Xbox One X look better than a PC game for a couple of extra frames per second on an LCD, of all things. But that's me. I am not saying the OP will ever want to upgrade to a 32 or 64C CPU. But I also don't like making a selection that says you are stuck right where you are when there is an alternative. That should be old news, but it isn't, and won't be till Intel picks a platform and stays with it for more than a second gen. But without someone answering the question differently, the OP wouldn't know there are options, options that open other options.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
27,388
16,231
136
This is what the TS said.


You've latched on to video encoding as if the 9900k is a slouch in that department. The rest of your post is about scenarios where in your imagination, the TR 1920x may have an edge, but I won't be surprised if the 9900k holds its own in those NON-DESKTOP ENVIRONMENTS too!

Let me just leave these here, from the Anandtech Review: https://www.anandtech.com/show/13400/intel-9th-gen-core-i9-9900k-i7-9700k-i5-9600k-review/8

[attached Handbrake encoding charts]
The problem with those is that it wins one handpicked bench. Also the price: it's $420, not $800, for the 1920X, and the 9900K is $569, not $488.
 

DrMrLordX

Lifer
Apr 27, 2000
23,178
13,265
136
Take high refresh gaming. Trying to maintain 144hz display you pretty much can't run at higher then 1080p.

I don't think that's true. I think the 9900K can do it at 1440p for some titles, and I'm pretty sure Zen 2 will get there too. Comet Lake won't change that picture, but Zen 3 sure will...

The problem with those is that it wins one handpicked bench. Also the price: it's $420, not $800, for the 1920X, and the 9900K is $569, not $488.

Yeah, those prices don't reflect reality. Though with the OP's apparent budget, he'd be better off with a 2920X or 2950X than a 1920X. And for his use case (144Hz gaming beyond 1080p), I think he should either get a 9900K or wait for Zen 2. Not much else on the market will do what he wants.
 

killster1

Banned
Mar 15, 2007
6,205
475
126
I am giving examples, not trying to shoehorn the OP into something; just pointing out things to keep in mind. I understand I am being a contrarian here, but I really hate echo chambers. I want the OP, or anyone who asks these types of questions, to know both sides so they can make the choice for themselves. They can't do that if 90% of the people out there immediately answer 9900K because someone said gaming was a priority. Even if it is high refresh gaming, because again I think people don't realize the sacrifices they have to make to pull that off. I admit my bias here, but personally I think they are closer than people give the 2920X, or Ryzen in general, credit for. There are tons of ways reviews shape themselves around pre-established biases to present the information the reviewer, blogger, or YouTuber wants to show. That's why I like to look at alternatives.

Take high refresh gaming. Trying to maintain a 144Hz display, you pretty much can't run at higher than 1080p. Personally, I think that as CPU usage increases it will become harder and harder to maintain even at that resolution, because we aren't going to see faster CPUs; we are probably nearing the end of the primary thread that accounts for most of the super high frame rates. But even if right now 144Hz on a 2080 Ti really was a thing at 1440p, that dies with the next game, or the game after that, needing a new 3080 Ti or Titan XXX or something to keep up. Trade off to something like 90-100 FPS and most other CPUs can accomplish that, and his GPU lasts him more than half a game generation. The OP doesn't know that if we just drone on about the 9900K being the only CPU for high refresh gaming.

I was thinking about replying to most of the examples, maybe giving some others that point in the other direction, but then I am probably feeding off my own bias there. In the end the 9900K and the 2920X are both great CPUs. The 9900K is no slouch in multitasking; in the end it's the safest pick for today, even if I think it's a little overpriced. I will always have an issue with forced obsolescence: Intel forcing the hand on one end and high refresh gaming on the other. I can't make sense of making a cheap Xbox One X look better than a PC game for a couple of extra frames per second on an LCD, of all things. But that's me. I am not saying the OP will ever want to upgrade to a 32 or 64C CPU. But I also don't like making a selection that says you are stuck right where you are when there is an alternative. That should be old news, but it isn't, and won't be till Intel picks a platform and stays with it for more than a second gen. But without someone answering the question differently, the OP wouldn't know there are options, options that open other options.

You don't have to maintain 144fps to enjoy more than 60. Also, what did you encode your movies to, and what format did you have them in to begin with? I never considered the power usage of a 3930K compared to a 1700X system for my encodes, and I usually worry about power usage etc. in the summer, so I will either stop encoding much ;P or get something more efficient. Right now I don't mind the extra heat, and the power usage turns into a nice heater for me at night. I am trying to encode mostly to x265 but have yet to dial in a setting I like the most.
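For what it's worth, a common starting point when dialing in x265 is constant-quality (CRF) encoding via ffmpeg. The sketch below just builds such a command line; the CRF and preset values are illustrative defaults to tune from, not a recommendation:

```python
import shlex

# Hypothetical starting point for an x265 (HEVC) encode with ffmpeg.
# CRF ~22-24 and preset "slow" are common quality/size tradeoffs, but
# the right values depend on the source material and your eyes.
def x265_cmd(src, dst, crf=23, preset="slow"):
    """Build an ffmpeg argument list for a constant-quality HEVC encode."""
    return [
        "ffmpeg", "-i", src,
        "-c:v", "libx265",   # software HEVC encoder
        "-crf", str(crf),    # constant quality: lower = bigger file, better IQ
        "-preset", preset,   # slower presets compress better per bit
        "-c:a", "copy",      # pass the audio through untouched
        dst,
    ]

print(shlex.join(x265_cmd("input.mkv", "output.mkv")))
```

Encoding a short clip at a couple of CRF values and comparing them side by side is usually faster than re-running a whole movie per experiment.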
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
27,388
16,231
136
You know this thread has gone all over the place. I want to try and summarize it for the op, and see who else agrees in both camps.

First, in gaming, we all know the 9900K is king. But at a higher resolution, you will never see the difference between the two CPUs we are talking about. And I am pretty sure he said he was at a high res.

For encoding, yes the 9900k is not a slouch, but overall the 1920x is faster on average for most encoding apps.

Price, the 1920x is cheaper for the chip, but the platform is a little more, so its a wash.

For upgradability and platform "futureproofing", I think the 1920x wins.

Hence all the arguing; it's really close, and depends on what the OP values most.
 

killster1

Banned
Mar 15, 2007
6,205
475
126
You know this thread has gone all over the place. I want to try and summarize it for the op, and see who else agrees in both camps.

First, in gaming, we all know the 9900K is king. But at a higher resolution, you will never see the difference between the two CPUs we are talking about. And I am pretty sure he said he was at a high res.

For encoding, yes the 9900k is not a slouch, but overall the 1920x is faster on average for most encoding apps.

Price, the 1920x is cheaper for the chip, but the platform is a little more, so its a wash.

For upgradability and platform "futureproofing", I think the 1920x wins.

Hence all the arguing; it's really close, and depends on what the OP values most.


I haven't seen any site showing 1920X vs 9900K 4K 2080 Ti benches. Is there a link I should see? Of course we know you are biased to the TR, you own 4 of them, right? Futureproofing is not having to upgrade your CPU to catch up with last year's Intel ;P

Member callouts and accusations are not allowed in the tech forum.

Daveybrat

AT Moderator
 
Last edited by a moderator:

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
27,388
16,231
136
I haven't seen any site showing 1920X vs 9900K 4K 2080 Ti benches. Is there a link I should see? Of course we know you are biased to the TR, you own 4 of them, right? Futureproofing is not having to upgrade your CPU to catch up with last year's Intel ;P
Did you even read half the posts here? Virtually everyone agreed that there was no difference at high res.
Oh, and I have 4 Xeons and 4 TR, so I am biased to AMD? Oh, and you think the next Intel chips will work on the current platform?

Who is biased here?
 

killster1

Banned
Mar 15, 2007
6,205
475
126
Did you even read half the posts here? Virtually everyone agreed that there was no difference at high res.
Oh, and I have 4 Xeons and 4 TR, so I am biased to AMD? Oh, and you think the next Intel chips will work on the current platform?

Who is biased here?

So is there or is there not a site with benchmarks of 4K games on a 2080 Ti with both CPUs? Virtually everyone agreed the 9900K is faster in games! No one said TR matched it at any res. The OP said #1 it's for games, so... If I can be pointed to the benches (GTA V, BF:V, a few others) that would be nice.
Once again the TR group points to upgrading the CPU so soon... who cares about next gen this or that? They want it now, right? We can always wait for something better in the future, but is he planning on constant CPU upgrades? A GFX upgrade is more likely.
 

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
You know this thread has gone all over the place. I want to try and summarize it for the op, and see who else agrees in both camps.

First, in gaming, we all know the 9900k is king, But with a higher resolution, you will never see the difference in the 2 CPU's we are talking about. And I am pretty sure he said he was in a high res.

For encoding, yes the 9900k is not a slouch, but overall the 1920x is faster on average for most encoding apps.

Price, the 1920x is cheaper for the chip, but the platform is a little more, so its a wash.

For upgradability and platform "futureproofing", I think the 1920x wins.

Hence all the arguing, its really close, and depends on that the OP values most.
Actually, I don't believe the OP specified the gaming resolution, but he said he wanted to move to a 144Hz monitor. The only options for 144Hz are 1080p and 1440p. I guess you can call 1440p 'high res', but with a fast enough GPU (1080 Ti / 2080 / 2080 Ti) you'll see a noticeable difference between a 9900K and 2920X. Not in all games, but in CPU bound titles like BF:V, Far Cry 5 etc. there is a difference.

Also, the thread title says 2920X, not 1920X, so I'm not sure where you're going with those comparisons.

At current prices the 9900K is between $530 - $560, the 2920X $650. That's before taking into account platform costs, where X399 is approximately $150 more than Z390. Then you have quad channel memory, which I believe costs a bit more than dual channel (haven't checked on this one).

I believe Zucker2K just linked to the Anandtech 9900K review showing the 9900K beating the 1920X at Handbrake x264 and HEVC encoding. The 2920X on the other hand is slightly faster at x264 but slower at HEVC, so I'll call that a wash.

I've also showed links showing the 9900K being the fastest platform for video editing, at least when using Premiere Pro as the software.

So let's break it down:
9900K + Z390 = $750 approx
2920X + X399 = $1000 approx

The 9900K wins in gaming and video editing, and trades blows in encoding depending on the codec, while costing $250 less.

That's a clear win for the 9900K in my books.
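The totals above can be sanity-checked in a couple of lines. Note the CPU/board split below is illustrative; the post only gives the approximate ~$750 and ~$1000 platform totals:

```python
# Approximate platform costs in USD, using the rough figures from the post.
# The CPU/board split is an assumption; only the totals are quoted above.
intel = {"9900K": 550, "Z390 board": 200}
amd = {"2920X": 650, "X399 board": 350}

intel_total = sum(intel.values())   # ~$750
amd_total = sum(amd.values())       # ~$1000
delta = amd_total - intel_total     # ~$250 in Intel's favor
print(f"Intel ${intel_total} vs AMD ${amd_total} (delta ${delta})")
```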

Upgradeability? Sure, I'll give TR a tick on that one. The point is that a 9900K won't need to be upgraded for a long time, at least for gaming purposes. A 2920X on the other hand? It already bottlenecks a 2080 Ti. Good luck running an RTX 3080 on that, so it's probably a good thing that you can upgrade to a Zen 2 based TR down the track. But why go through all that extra expense in the first place, especially to get a slower system today?!

I really don't get that logic. Are people really advocating the OP pay $250 more today, to get a slower gaming system, then spend another $650 for an upgrade to a next gen TR so that he can finally 'catch up' to 9900K levels of gaming performance?!
 
Last edited:

Heclone

Junior Member
Dec 7, 2018
19
11
81
Hi everyone,

There are already many interesting options here, but I want to suggest another path.

What about grabbing a current Zen+ CPU and an AM4 motherboard now, then picking up a top-end Zen 2 AM4 CPU in less than a year?

Think about grabbing a ~$160-180 R5 2600 and a ~$150 X470 motherboard now, then in less than a year an R7 3700X for ~$350.
It would cost slightly more than $700 (even less, because the 2600 can be sold), and not all at once.

The 2600 is certainly the slower option in every respect. However, coming from a 2011 platform, it won't seem bad at all (in either gaming or productivity), it will be cheaper, and it will allow upgrading to a CPU that should at least match the 9900K in gaming (or trail it by only 2-3% even at 1080p) and outperform both it and the 2920X in MT tasks (assuming the 3700X is a 12C/24T CPU, which seems very likely).

Ultimately, it would be cheaper, and within a year it grants you more overall performance than either a 2920X or a 9900K could deliver. But you'll have to wait a bit, with a smaller jump (the 2600) in the meantime.
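The staged-upgrade arithmetic works out roughly like this; the resale value below is a placeholder guess, the other figures come from the post:

```python
# Staged AM4 path: buy a Zen+ R5 2600 now, upgrade to a Zen 2 R7 3700X later.
r5_2600 = 180       # upper end of the quoted ~$160-180
x470_board = 150
r7_3700x = 350      # the post's assumed Zen 2 price
resale_2600 = 100   # hypothetical resale value, not from the post

upfront = r5_2600 + x470_board            # what you pay today
gross = upfront + r7_3700x                # total spend over the year
net = gross - resale_2600                 # net cost if the 2600 is sold on
print(f"Upfront ${upfront}, gross ${gross}, net ${net}")
```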

As for my two cents on the 9900K and 2920X:

The 9900K is clearly too expensive in my opinion, but if you want an incredible gaming CPU that also delivers very good video encoding performance, it's the most versatile CPU you can pick. In that kind of situation it would make sense to me, even more so if you want max performance right now.

On the other hand, I love TR, but if gaming is and will remain an important use case, the 9900K is far superior. Even considering the upgrade path: while Zen 2 Threadripper CPUs will be better than current TR, I don't think they will be better in gaming than Zen 2 Ryzen CPUs.
 
Last edited:
  • Like
Reactions: ChrispyjBMW

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
Hi everyone,

There are already many interesting options here, but I want to suggest another path.

What about grabbing a current Zen+ CPU and an AM4 motherboard now, then picking up a top-end Zen 2 AM4 CPU in less than a year?

Think about grabbing a ~$160-180 R5 2600 and a ~$150 X470 motherboard now, then in less than a year an R7 3700X for ~$350.
It would cost slightly more than $700 (even less, because the 2600 can be sold), and not all at once.

The 2600 is certainly the slower option in every respect. However, coming from a 2011 platform, it won't seem bad at all (in either gaming or productivity), it will be cheaper, and it will allow upgrading to a CPU that should at least match the 9900K in gaming (or trail it by only 2-3% even at 1080p) and outperform both it and the 2920X in MT tasks (assuming the 3700X is a 12C/24T CPU, which seems very likely).

Ultimately, it would be cheaper, and within a year it grants you more overall performance than either a 2920X or a 9900K could deliver. But you'll have to wait a bit, with a smaller jump (the 2600) in the meantime.

As for my two cents on the 9900K and 2920X:

The 9900K is clearly too expensive in my opinion, but if you want an incredible gaming CPU that also delivers very good video encoding performance, it's the most versatile CPU you can pick. In that kind of situation it would make sense to me, even more so if you want max performance right now.

On the other hand, I love TR, but if gaming is and will remain an important use case, the 9900K is far superior. Even considering the upgrade path: while Zen 2 Threadripper CPUs will be better than current TR, I don't think they will be better in gaming than Zen 2 Ryzen CPUs.

We have no concrete numbers on next-gen Ryzen with regard to core count, and especially pricing, which is usually set close to launch. Apart from those 'leaks', how can you possibly have enough information to formulate an upgrade path for the OP? Do you have any proof next-gen Ryzen 7 will be 12-core and that it will be $350? Do you have any proof it will perform like a 9900K for gaming? It's all based on 'hope'. While I don't necessarily disagree that next-gen Ryzen 7 may indeed be 12C and may close the gap with (or match, or beat) a 9900K, there is also the chance that Ryzen 7 remains 8C and that the gap in gaming is not totally bridged. My point is we just don't know yet, and we shouldn't pretend to know until reliable info exists, which is still months away.

With regard to the 9900K vs 2920X: if you think the 9900K is overpriced, then surely the 2920X is as well? Using Amazon prices, the 2920X is $650 and the 9900K is $520 (after coupon), so the 2920X costs 25% more yet in the vast majority of cases doesn't outperform the 9900K by 25%. In fact, I would argue that for general desktop usage (like the OP's: gaming and a bit of video editing/encoding) the 9900K is superior with its faster clocks and higher IPC.

Again, I don't necessarily disagree with you about the 9900K price; I have argued since launch that $450 would be a fairer price point, considering the 2700X's price of approximately $300. But for the OP's use case, it would serve him very well: leading performance for high-Hz gaming and video editing, and comparable encoding performance to a 2920X, all for a lower price. For the here and now, nothing else can really match it. Zen 2 may match or surpass it, but again, there are no guarantees. We can't make plans based on fantasies about 16C Ryzens at 5 GHz.
 

Heclone

Junior Member
Dec 7, 2018
19
11
81
About the 2920X: I also agree that it is overpriced, especially considering you can grab a 1950X for less money. I just didn't mention it because in this situation, where gaming is the priority (and video editing/encoding only a secondary use), I don't think TR makes much sense.

And yes, my suggestion is based on assumptions, but I consider it a fair (and realistic) bet, given that at worst it would be a bit less powerful than a 9900K, while at best it could be significantly more powerful, especially for video editing/encoding.

But you're right, it does add a bit of randomness to the equation, and the 9900K is the no-compromise choice when you aren't on a budget. I just wanted to present another possible path, one I would choose myself, though I should have emphasized more strongly that it's a bet.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
27,388
16,231
136
Actually I don't believe the OP specified the gaming resolution, but said he wanted to move to a 144Hz monitor. The only options for 144Hz are 1080P and 1440P. I guess you can call 1440P 'high res' but with a fast enough GPU (1080 Ti / 2080 / 2080 Ti) you'll see a noticeable difference between a 9900K and 2920X. Not in all games, but in CPU bound titles like BF:V, Far Cry 5 etc there is a difference.

Also, the thread title says 2920X, not 1920X, so I'm not sure where you're going with those comparisons.

At current prices the 9900K is between $530 and $560, the 2920X $650. That's before taking into account platform costs, where X399 runs approximately $150 more than Z390. Then you have quad-channel memory, which I believe costs a bit more than dual-channel (I haven't checked on this one).

I believe Zucker2K just linked to the Anandtech 9900K review showing the 9900K beating the 1920X at Handbrake x264 and HEVC encoding. The 2920X on the other hand is slightly faster at x264 but slower at HEVC, so I'll call that a wash.

I've also posted links showing the 9900K to be the fastest platform for video editing, at least when using Premiere Pro as the software.

So let's break it down:
9900K + Z390 = $750 approx
2920X + X399 = $1000 approx

The 9900K wins in gaming and video editing, and trades blows in encoding depending on the codec, while costing $250 less.

That's a clear win for the 9900K in my books.

Upgradeability? Sure, I'll give TR a tick on that one. The point is that a 9900K won't need to be upgraded for a long time, at least for gaming purposes. A 2920X on the other hand? It already bottlenecks a 2080 Ti. Good luck running an RTX 3080 on that, so it's probably a good thing that you can upgrade to a Zen 2 based TR down the track. But why go through all that extra expense in the first place, especially to get a slower system today?!

I really don't get that logic. Are people really advocating the OP pay $250 more today, to get a slower gaming system, then spend another $650 for an upgrade to a next gen TR so that he can finally 'catch up' to 9900K levels of gaming performance?!
Well, you are right on one point: the title is 2920X, not 1920X. I went off on that tangent in reply to Zucker2k's post with the benchmark. So that invalidates his argument, since the wrong chip was benchmarked, and it invalidates one of your points as well, since you referenced it.

My post was meant as a summation: first, that any high-res (1440p to 4K) gaming difference between the two chips is either not there or noticeable only in benchmarks. Second, that encoding differences CAN be noticeable, and in the majority of benchmarks I have seen, the 2920X gets the nod. Next, the platform is better for TR (you even admitted that). And I said it's really up to the OP and which way he leans.
 

kawi6rr

Senior member
Oct 17, 2013
567
156
116
Which do you advise?

Let me reiterate: you'll see a difference in benchmarks, but if you put those two chips in systems side by side you won't see a difference. I would recommend getting whichever chip better suits your needs. If you're strictly gaming, get the Intel chip, but if you're doing more than just gaming I would go with AMD.
 

ozzy702

Golden Member
Nov 1, 2011
1,151
530
136
Let me reiterate: you'll see a difference in benchmarks, but if you put those two chips in systems side by side you won't see a difference. I would recommend getting whichever chip better suits your needs. If you're strictly gaming, get the Intel chip, but if you're doing more than just gaming I would go with AMD.

Wrong. In titles like BFV you'll see massive performance differences in 0.1% frame rates and in overall lows between the two chips, assuming you're running a high-end GPU or targeting your graphics settings for high-Hz gameplay. That gap will widen with next-gen GPUs.

Zen 2 will likely reduce that performance gap, but telling people they will see no difference simply isn't true.
 
  • Like
Reactions: ChrispyjBMW

kawi6rr

Senior member
Oct 17, 2013
567
156
116
Wrong. In titles like BFV you'll see massive performance differences in 0.1% frame rates and in overall lows between the two chips, assuming you're running a high-end GPU or targeting your graphics settings for high-Hz gameplay. That gap will widen with next-gen GPUs.

Zen 2 will likely reduce that performance gap, but telling people they will see no difference simply isn't true.

Wrong! You'll see a difference in benchmarks, but running both systems side by side you won't notice a difference between the two. A friend of mine and I did put both our systems side by side a few years back; he was running an Intel i5-3570 / NVIDIA 980 Ti system, I was running an FX 8350 with an R9 290, and we could not see a difference in gameplay.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
27,388
16,231
136
Wrong. In titles like BFV you'll see massive performance differences in 0.1% frame rates and in overall lows between the two chips, assuming you're running a high-end GPU or targeting your graphics settings for high-Hz gameplay. That gap will widen with next-gen GPUs.

Zen 2 will likely reduce that performance gap, but telling people they will see no difference simply isn't true.
I beg to differ; I don't think it will be noticeable outside of benchmarks. Got a link to share?
 

DrMrLordX

Lifer
Apr 27, 2000
23,178
13,265
136
@Heclone

Zen2 isn't out yet. The OP mentioned two specific chips, so we've had to recommend one or the other. I try to stay on-topic.

If he's willing to wait, I would recommend he NOT invest in X470 but instead get an entirely new motherboard and chip when Zen2 launches. X570 should offer a much better user experience for Zen2-based chips. The one weakness of AMD's AM4 strategy appears to be that new AGESA versions are hard(er) to roll out for old AM4 boards. The OEMs have less incentive to do it, so they drag their feet. I figure that a proper UEFI for Zen2 will be out by May/June, but probably not on day one. There will be some half-arsed compatibility updates. Also rolling out new AGESA versions for X370/X470 boards may make older chips work less . . . well on those boards. My 1800x has restricted memory speed thanks to newer AGESA versions for my X370 Taichi. My best UEFI rev was 3.30 but that didn't have all the Spectre fixes sooooo meh.

Zen2 may shake up the entire desktop PC market, making both the 9900k and some TR parts irrelevant.