4K 144Hz refresh will be more available soon, and it will be the reason I upgrade from my 2011 system. I do a fair amount of encoding, but I don't mind waiting a little longer for it to finish. Nothing I am doing is mission critical or work related. If it finished an hour later I would still rather have faster gaming; I wonder what you do that is so important.
Someone else mentioned letting it run overnight. "I don't know" really is an answer. For example, I decided to re-encode my entire video collection shortly after I got my 1700 last year; lots of stuff was using up far more room than it needed to. This was multi-month work. If I had gotten a 7700K instead, it would have taken twice as much time and used a bunch more power to do it. That doesn't map exactly onto this comparison, where the gap is closer to 33% at similar power, but the point still stands: even if it's hands-off, depending on what the work is, you can gain days of work done more quickly.
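To put rough numbers on that, here's a back-of-the-envelope sketch. The throughput figures are made-up assumptions for illustration, not benchmarks of any real chip; the point is just that on a fixed backlog, the saved time scales directly with relative encode speed:

```python
# Hypothetical encode throughputs (frames of video encoded per second).
# These numbers are illustrative assumptions, not measurements.
def batch_encode_hours(total_frames, fps_throughput):
    """Hours needed to chew through a fixed backlog at a given encode speed."""
    return total_frames / fps_throughput / 3600

# Say the backlog is ~500 hours of 24fps video.
backlog_frames = 500 * 3600 * 24

slow = batch_encode_hours(backlog_frames, 30)  # assumed quad-core throughput
fast = batch_encode_hours(backlog_frames, 60)  # assumed eight-core, ~2x faster

print(round(slow), "hours vs", round(fast), "hours")   # 400 vs 200
print("days saved:", round((slow - fast) / 24, 1))     # 8.3
```

Halving the wall-clock time on a multi-month job is not "an hour later" — it's days of the machine being free again sooner, even if the work is entirely hands-off.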
1600p 144Hz is great also.
You keep stressing video work and upgrading to a 64-core CPU next year. 95% of us just upgrade our graphics card and keep the CPU/mainboard for a long time. It's hard for me to recommend either of those chips; I'm hoping AMD matches Intel's performance in games soon.
If you always run every game at ultra quality at 1440P or above then sure, you'll be mostly GPU bound. The 2080 Ti is somewhat of an exception in that it is powerful enough that even the top end Ryzen chips (and TR, by extension) are holding back the GPU. This doesn't bode well for future GPU upgrades if current gen is already starting to get bottlenecked.
At 1440p there is a really small difference between the CPUs; at 1600p and 4K there is no difference, even on a 2080 Ti.
https://www.youtube.com/watch?v=q7t0kA5VJ7o
It's bothersome because, again, it's one of those things that perpetuates itself across the internet. He looks at the numbers here: ~10% in all but one game he tested at 1080p; only one game at 1440p has a measurable difference, and even then he notes it is acting weird and should be considered an outlier. Yet it's still being held back solely for that 10% it loses at 1080p. It is what it has always been: not a great CPU for people who do 1080p gaming, and a wash at anything else.
WRT 144Hz and visual fidelity, it is quite common for gamers to adjust in-game settings to get the ideal balance between frame rate and image quality (IQ). I do this all the time because I own a mid-range GPU and ultra details generally mean sub-100fps averages and minimum fps below 60, so I run a mix of medium/high depending on the game to achieve smoother gameplay. It's a tradeoff I'm willing to accept, as the IQ difference between ultra and high often isn't that big, but the difference in performance is definitely noticeable.
I know you are right to some degree, but stepping down this setting for 3fps here and that one for 2fps there adds up, and each one seems like a bad tradeoff on its own, even if the overall effect on fidelity doesn't amount to much in the end. This is probably just my gaming preference poking in, but to me it doesn't seem worth it to spend nearly a grand on a video card and then kill fidelity to push the workload onto the CPU.
Even if you do own a high-end GPU, it doesn't mean you are then obligated to run every game at max details. Hypothetically, if 'ultra' only nets you 100fps average and 'high' allows 150fps, that might be a worthwhile tradeoff for 144Hz gamers, especially in competitive gaming.
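Framing that hypothetical in frame times makes the tradeoff concrete. The 100fps/150fps presets are the hypothetical figures from above, not measurements of any real game; the 144Hz budget is just 1000ms divided by the refresh rate:

```python
def frame_time_ms(fps):
    """Average time per frame, in milliseconds."""
    return 1000.0 / fps

refresh_budget = frame_time_ms(144)  # ~6.94 ms available per 144Hz refresh
ultra = frame_time_ms(100)           # hypothetical 'ultra' preset average
high = frame_time_ms(150)            # hypothetical 'high' preset average

print(f"144Hz budget: {refresh_budget:.2f} ms")
print(f"ultra: {ultra:.2f} ms -> over budget: {ultra > refresh_budget}")
print(f"high:  {high:.2f} ms -> within budget: {high <= refresh_budget}")
```

At 10ms per frame, 'ultra' can never saturate a 144Hz panel, while 'high' at ~6.67ms can — which is exactly why competitive players drop a notch of IQ.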
Same as above.
I don't think we are yet at the point of all CPUs being the bottleneck. There seems to be enough headroom with the higher-end CFL chips for at least another generation of GPU upgrades. Even at 1080p, an 8600K is generally enough to drive a 2080 Ti, though the 8700K/9700K/9900K are more ideal for 'future proofing' since games will inevitably become more multi-threaded.
This is another thing I can never wrap my head around. Again, there are exceptions to the rule, and when you are talking about the internet you can find 10k of these exceptions. But no one buys a video card in 2018 for games they are playing, then gets a new card in 2020 just so they can get better performance in the games they were playing in 2018. People update video cards because there is a new game out that their current card will struggle to run at the performance their setup requires, which means a whole new measurement of where the CPU bottleneck is. As time goes on, it's probably going to swing back toward compute power instead of single-threaded clock speed. I could be wrong on that. But either way, it's going to look more like exactly what we have today instead of somehow looking worse.
I also expect Zen 2 to be competitive with current CFL chips in gaming, so that bodes well for future GPU upgrades too.
I do agree with you that CPU gaming performance is unlikely to far exceed 9900K levels in the short to medium term, so CFL/Zen 2-level performance will be the ceiling until at least 2020, it seems.
If this is a measurement of 1080p gaming, I don't see how it ever really goes much higher. There are some things AMD can do about latency, cache organization, IF speeds and so on to pump up performance. But current 1080p gaming really has more to do with clock speed than IPC or anything else; complex CPU work died with the PS3 and Xbox 360. If the PS5 and next-gen Xbox use Zen cores, this probably changes, but who knows if the last 15 years have killed the art of developing AIs and some of the other CPU-heavy work. Intel will continue to push up single-core turbos for a little while, but I have to think the 9900K is about maxed out on what all-core turbo Intel can do, as the CPU starts nearing 200W to accomplish it. That house of cards is crashing down, and Intel's inability to get the clock rates and yields they want on 10nm proves it. They can go wider cores, but that's not going to help high-refresh gaming; they can join AMD in the core mania, and they are already trying, but again that's not going to help high-refresh gaming.
You keep stressing video work and upgrading to a 64-core CPU next year. 95% of us just upgrade our graphics card and keep the CPU/mainboard for a long time. It's hard for me to recommend either of those chips; I'm hoping AMD matches Intel's performance in games soon.
I stress video work not because it's the most important factor to this user or to the many users who still worry about it, but because I think people disregard it as an afterthought just because he said gaming is his priority. I noted above why it could still matter. I mention the CPU because whenever he is looking to upgrade, whether it's in 2020 or 2025, he'll have something to look for. Maybe it's something as simple as getting a new system in 2025 but wanting to use this one for something else: hosting VMs, running large queues in HandBrake, running a bunch of server tools like Exchange or SQL, or maybe turning it into a dedicated folding machine.
If he gets the 9900K, he will look for a CPU update. If he is really lucky he will find some 10900K, or whatever Intel calls the top Comet Lake i9. My personal guess is that the 9900K is the top dog, the end. That won't be terrible, but that's what it is.
If he gets the 2920X he will have an insane amount of options. Just within the current TR lineup it goes up to 32 cores; next year, 64. Who knows with Zen 3 and whether it's supported (I think it will be). But even then the options are endless. Just look at the 1950X, as Mark brings up: a 16-core CPU that was $1000 last year, then $750, and now just over $450. That's a new retail CPU. You can't ever really find that happening with Intel CPUs; the price basically stays the same until Intel stops producing it, and then several years down the road maybe, just maybe, you can find a new one at ~$100 savings.

So let's say you are the OP, looking for a job in retirement for the system, and you dedicate the machine to folding. Who knows, you might be able to get a 64-core CPU new-in-box for $500. How great would that be? Or next year with Zen 2: he gets the 2920X, and in two years the next big video card comes out, does what no other GPU has ever done, makes a complete generational leap over new games, and makes everything CPU-bound at all levels. Well, who knows what Zen 2 clocks at. My guess is top turbos hang around 4.3-4.5GHz, really closing the gap. If AMD makes even a 7% "IPC" jump, all of a sudden it's nipping at the heels of the 9900K. He waits a year and gets a good discount on a 3920 (I know they are changing the names, but basically the new equivalent CPU) or 3950, except not only are they clocking higher, they are also 24-core and 32-core CPUs. Doesn't that seem like an option worth keeping open?
Some of that is wishful thinking, and some of it might not apply to the OP. People haven't been upgrading CPUs in 10 years because the upgrades have been small (8 years of 4 cores does that to a generation) or non-existent, thanks to socket changes and at most a single generation of upgradeability. The 9900K is stuck in the old way: likely the last and highest-supported Z390 CPU. If Intel changes their mind for the first time in over a decade, there might be one, just one, CPU the OP could get years later to keep the PC puttering around. Not even worth putting the effort into upgrading it. With the 2920X, maybe he never upgrades it either. But isn't it great to look into the future, see all of these new CPUs coming out in the next couple of years, and instead of saying "well, I would need to take out the board and the CPU to do that" (and be like me and say might as well just build a new box), you say "oh wow, that new CPU looks great; maybe if I can get a really good deal on it, I just upgrade my BIOS and I am ready to go"?
I'll give you an example I have given before. My current machine is a 1700; the system before it was a 3930K. I was really happy with the 3930K. It was a great CPU, probably my third favorite purchase next to my 4400+ (favorite) and my 1700. Anyway, long story short: I needed more cores to work on a work project at home. I could probably have made do with the 3930K, but some stuff was going on with it, so I was looking to upgrade my CPU. I was going to go with a 6900K. That upgrade was going to cost me around $1500-1700: new CPU, $1000; new board, $300-$350; new memory, ~$250. I was all prepped and ready to go, had almost put in the order, when HardOCP had a teaser post in their forums basically calling Ryzen a monster. I waited for the benchmarks and ended up building a whole new computer for the cost of that upgrade. That said, if Intel had made the 6900K able to use the same boards as the 3930K, even at $1000 versus $330, I would have gone with the 6900K. How many people are still using their Sandy Bridge or Ivy Bridge CPUs, not because those are still fast enough (in a lot of cases they are) but because they are just fast enough to make the idea of swapping out those major components not worth it? CPU upgrading is a lost art. I don't want to discount it based on 10 years of upgradability self-sabotage.