System is in the sig, I am wondering when to make a move? I game at 2k which is fine for me. I want to upgrade when I would get the biggest bang for my buck.
Thanks in advance
And as far as the "moar cores" thing goes, I think that's kind of a red herring. Intel (and to a lesser extent, AMD) has grown beyond Haswell's gaming IPC. I think you can actually game faster on a 2700x than you can a stock 4790k, and possibly even faster than a "typical" 4.5-4.7 GHz overclocked 4790k. I know a Coffee Lake 8700k or 9900k at the same clockspeed will get you higher IPC than that Haswell, guaranteed.
There are some titles out there, right now, that you may struggle to hit 80 fps in with that 4790k. And that's not considering games like Total War etc. where there are other CPU-related factors.
If you are going to get a 6c or 8c CPU, look first at the relative IPC of the chip in question versus your Haswell, and base roughly 75% of your decision on that. The extra cores are just the icing on the cake.
Basically, for gaming, an OC 4790K ~= i5 8400 / OC Ryzen 5 or 7. The only chips truly worth upgrading to would be the costly ones - we are talking 8700K/9700K/9900K here, and to a lesser extent an 8600K/9600K. And that is a heck of an expense for a max ~10% gain at 1440P.
I would say that's mostly fair. The main gains would be in higher minimums, I think. I also kind of wonder if that 1070 Ti will prevent sustaining 80 fps minimums at 1440p. But I do know a 4790k will probably not pull it off even with a faster dGPU.
An 8700K would indeed provide a tangible increase in framerates, but we are still talking about less than a 20% difference at 1080P *medium* settings on a 1080 Ti: https://www.techspot.com/review/1546-intel-2nd-gen-core-i7-vs-8th-gen/page5.html
At 1440P, which the OP games at, that margin is probably going to shrink to 10% or less as it is much more GPU bound than 1080P.
I wouldn't even consider Ryzen at this point as a true 'upgrade' for gaming compared to a 4790K. As the charts above show, a 4770K/4790K @ 4.8GHz is basically equal to an i5 8400... which in turn is basically equal to an overclocked Ryzen 2600 @ 4.2GHz https://www.techspot.com/review/1627-core-i5-8400-vs-ryzen-5-2600/page8.html
Yes, the overclocked 2600 is a bit faster than the 8400, but that's because it's using DDR4-3400 against DDR4-2666 for the i5 8400. Pair both CPUs with the same memory and they will be effectively equal.
There are things that benchmarks do not measure, like how smooth the CPU feels when gaming or rendering. Newer CPUs have the advantage over old CPUs there. I think there are enhancements to the CPU architecture that are not accounted for in benchmarks.
Will my eyes actually be able to tell the difference between 130 fps and 144 fps?
No.
Not necessarily true. There is a visible advantage to being locked onto the max refresh rate of the monitor panel for monitors that do not have adaptive sync. So 130 to 144hz would be visible if 144hz is the max refresh of that panel and you're not running adaptive sync.
This isn't because 130 fps vs 144 fps is easily noticeable on its own; it's rather that when you cap your framerate at the panel's refresh rate (w/ vsync), you will see some beneficial frame timing differences. It will appear smoother due to lining up with the panel better.
This is all moot if you're running adaptive sync. That would make it nearly impossible to notice a difference between 130 and 144.
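To put rough numbers on the frame pacing point: here's a minimal sketch (all figures made up, helper names hypothetical) of what vsync does when the render rate doesn't divide evenly into the refresh rate. At 130 fps on a 144 Hz panel, some frames stay on screen for one refresh and some for two, so displayed frame times alternate instead of being uniform:

```python
import math

# Hypothetical vsync simulation (numbers made up) showing why ~130 fps on a
# 144 Hz vsynced panel looks less even than a locked 144 fps.
REFRESH_MS = 1000.0 / 144.0  # one refresh interval, ~6.94 ms

def displayed_frame_times(render_ms, n_frames):
    """With vsync, each new frame appears at the first vblank after it is ready."""
    shown_at = []
    ready = 0.0
    for _ in range(n_frames):
        ready += render_ms
        shown_at.append(math.ceil(ready / REFRESH_MS) * REFRESH_MS)
    # how long each frame actually stayed on screen
    return [b - a for a, b in zip(shown_at, shown_at[1:])]

even = displayed_frame_times(6.9, 50)            # rendering just inside every refresh
uneven = displayed_frame_times(1000.0 / 130.0, 50)  # rendering at ~130 fps

print(sorted(set(round(t, 2) for t in even)))    # single value: one refresh per frame
print(sorted(set(round(t, 2) for t in uneven)))  # mix of 1-refresh and 2-refresh frames
```

The "even" trace shows every frame on screen for exactly one refresh; the 130 fps trace mixes ~6.94 ms and ~13.89 ms displayed frames, which is the judder people perceive.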
I have no idea what you are talking about. What is this 'smoothness'?
100fps = 100fps, there isn't some magic sauce in new CPUs that makes games run 'smoother'. If they are smoother, it is because of higher framerates (particularly minimums), simple as that.
Average FPS is irrelevant to smoothness. A game running at 60fps can seem smoother than a game that averages 200fps with drops into the 20s. Newer CPUs score better on frame times, so they can make games seem smoother regardless of the average frame rate.
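A quick sketch of that arithmetic (made-up frame-time traces, hypothetical helper names) shows how a high average can coexist with ugly lows:

```python
# Why average fps can hide stutter: compare a steady 60 fps trace against a
# spiky trace that averages far higher. All frame times (ms) are invented.

def avg_fps(frame_times_ms):
    """Average fps = total frames / total seconds."""
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

def one_percent_low_fps(frame_times_ms):
    """fps implied by the slowest 1% of frames (here: the single worst frame)."""
    worst = sorted(frame_times_ms)[-max(1, len(frame_times_ms) // 100):]
    return 1000.0 / (sum(worst) / len(worst))

smooth = [16.7] * 100            # locked ~60 fps: every frame ~16.7 ms
spiky = [4.0] * 99 + [45.0]      # 99 frames at 4 ms plus one 45 ms stutter

print(round(avg_fps(smooth)), round(one_percent_low_fps(smooth)))  # → 60 60
print(round(avg_fps(spiky)), round(one_percent_low_fps(spiky)))    # → 227 22
```

The spiky trace "wins" the average by nearly 4x, but its worst frame is a 22 fps hitch, which is exactly the kind of thing frame-time measurements catch and plain averages don't.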
Yay, another person with a Qnix monitor! Had mine for years, and recently retired it. At this point, with that card you'll be GPU bound anyway regardless of what processor you went with. Hang in there for Zen 2, and whatever Lake Intel has at that time and see.
I have that monitor too, but mine was stuck at 100% brightness and I couldn't lower it. Eventually got tired of it and up/down/sidegraded to a TN 2560x1440 24" 144hz display.
Love the 144hz. Miss IPS. And I'm not sure about 27 vs 24". I kinda like 24 when gaming but 27 when using photoshop.