When a 2600K OC isn't fast enough... list some CPU-limited games

moonbogg

Lifer
Jan 8, 2011
Use 100fps as the mark to hit. Assume the person has a 2600K @ 4.5GHz and a GPU which can pull 100fps or more. What games can't the OC'd 2600K handle at around 100fps? I still think the 2600K is more than enough for high-refresh gaming in nearly all titles. Any game the 2600K can't handle, newer CPUs will struggle with as well. Prove me wrong.
 

Yuriman

Diamond Member
Jun 25, 2004
Arma III, parts of Guild Wars 2, maybe a few areas in Battlefield.
 
Feb 25, 2011
I'm sure you could find a lot of old single-threaded games where the GPU requirements are low enough that you're limited by CPU performance. Rome: Total War (not 2) comes to mind as something that bogs down on every computer I've ever owned if you set up a big enough custom battle with hundreds of plus-sized units. (No fair re-enacting Stalingrad with legionaries, basically.)

Civ V was kinda similar (very sensitive to single-threaded clock speed).
 

MajinCry

Platinum Member
Jul 28, 2015
Mod any of Bethesda's games:

Morrowind + MGE
Oblivion + Mart's Monster Mod + ENB + ugrids=9 ini tweak
Tale Of Two Wastelands (merge of Fallout 3 + Fallout New Vegas) + triple spawns + ENB + ugrids=9 ini tweak
Skyrim + ENB + triple spawns + ugrids=9 ini tweak

I doubt you'll ever stay above 30 on a fully kitted-out Oblivion, even with a superclocked i7 2600K.
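For anyone who hasn't done it, the ugrids tweak those lists keep mentioning is just an ini edit. A rough sketch for Skyrim (the buffer value is the usual community rule of thumb of (uGridsToLoad + 1)^2, not something I've benchmarked):

[General]
uGridsToLoad=9
uExterior Cell Buffer=100

The equivalent lines go in Oblivion.ini or Fallout.ini for the other games. It loads a 9x9 grid of exterior cells around the player instead of the default 5x5, which is exactly why the CPU load explodes.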
 

[DHT]Osiris

Lifer
Dec 15, 2015
Mod any of Bethesda's games; Morrowind + MGE, Oblivion + Mart's Monster Mod + ENB + ugrids=9 ini tweak, Tale Of Two Wastelands (merge of Fallout 3 + Fallout New Vegas) + triple spawns + ENB + ugrids=9 ini tweak, and Skyrim + ENB + triple spawns + ugrids=9 ini tweak

I doubt you'll ever stay above 30 on a fully kitted-out Oblivion, even with a superclocked i7 2600K.

This. Modded Beth games have a legendary capacity to crush systems. I've had mod setups that made Crysis look positively consoley in comparison.

EDIT: Oh, and just to make my own contribution: an active, well-developed, 200-dwarf Dwarf Fortress instance. There is no escaping CPU entropy in that game, and the nearest thing to 'success' you can find is your dwarves lasting long enough to fall victim to the CPU clock monster.
 

IndyColtsFan

Lifer
Sep 22, 2007
Mod any of Bethesda's games:

Morrowind + MGE
Oblivion + Mart's Monster Mod + ENB + ugrids=9 ini tweak
Tale Of Two Wastelands (merge of Fallout 3 + Fallout New Vegas) + triple spawns + ENB + ugrids=9 ini tweak
Skyrim + ENB + triple spawns + ugrids=9 ini tweak

I doubt you'll ever stay above 30 on a fully kitted-out Oblivion, even with a superclocked i7 2600K.

Wouldn't an overclocked 6700K (assume the same clock as the 2600K in the example) also fall below 30 fps on those games?
 

MajinCry

Platinum Member
Jul 28, 2015
Wouldn't an overclocked 6700K (assume the same clock as the 2600K in the example) also fall below 30 fps on those games?

Dunno. Nobody benchmarks these games, despite them being the perfect CPU benchmark; in any of them, crank ugrids up to 9, install a triple-spawns mod, and add any number of other things that look cool on the Nexus Mods site.

Go to some place with bandits/raiders/fiends, and watch your framerate go to the dogs. We need benchmarks for that stuff.
 

IndyColtsFan

Lifer
Sep 22, 2007
Dunno. Nobody benchmarks these games, despite them being the perfect CPU benchmark; in any of them, crank ugrids up to 9, install a triple-spawns mod, and add any number of other things that look cool on the Nexus Mods site.

Go to some place with bandits/raiders/fiends, and watch your framerate go to the dogs. We need benchmarks for that stuff.

I agree.

I think Moonbogg is in the same boat I'm in - we're both still on Sandy Bridge CPUs and have struggled to justify upgrading given the fact that our CPUs still seem to perform very well. On one hand, it is good not spending money but on the other hand, when this has been your hobby for 30+ years (at least in my case), it sucks NOT being able to spend money on your hobby. :D You could argue that in my case, the HEDT platform would be an upgrade due to the extra cores but I don't think the pricing justifies it, nor do I think the extra cores will materially change my experience in 90% of the things I do.
 

[DHT]Osiris

Lifer
Dec 15, 2015
Dunno. Nobody benchmarks these games, despite them being the perfect CPU benchmark; in any of them, crank ugrids up to 9, install a triple-spawns mod, and add any number of other things that look cool on the Nexus Mods site.

Go to some place with bandits/raiders/fiends, and watch your framerate go to the dogs. We need benchmarks for that stuff.

A big problem I can imagine is consistency. A lot of the FPS drops are going to come from the CPU just churning through whatever's happening with AI, physics, etc. You'd never end up with two identical runs, so you'd have to average it a bit. Those averages will probably have a lot of peaks and troughs, but it would work for a 'best guess' kind of benchmark, I guess.
 

VirtualLarry

No Lifer
Aug 25, 2001
You could argue that in my case, the HEDT platform would be an upgrade due to the extra cores but I don't think the pricing justifies it, nor do I think the extra cores will materially change my experience in 90% of the things I do.

That's basically how I felt about quad-core CPUs for the longest time, too. When I could get better single-threaded performance from a super-overclocked dual-core, and what I do isn't heavily threaded, it just made sense to stick with dual-cores for budget reasons.

Edit: Sorry, I was talking about non-gaming tasks. Since this thread is about gaming, I'll bow out.
 

MajinCry

Platinum Member
Jul 28, 2015
A big problem I can imagine is consistency. A lot of the FPS drops are going to come from the CPU just churning through whatever's happening with AI, physics, etc. You'd never end up with two identical runs, so you'd have to average it a bit. Those averages will probably have a lot of peaks and troughs, but it would work for a 'best guess' kind of benchmark, I guess.

Not true, really. All you have to do is keep the camera in the same position, and use the same save file. The former is done by entering "DisablePlayerControls" in the console, and the latter is just ctrl+c ctrl+v.

Minimums are what count; the minimum framerate is what ya cap your FPS to, unless you have a soft spot for judder and stutter due to oscillating framerates.
 

[DHT]Osiris

Lifer
Dec 15, 2015
Not true, really. All you have to do is keep the camera in the same position, and use the same save file. The former is done by entering "DisablePlayerControls" in the console, and the latter is just ctrl+c ctrl+v.

Won't you have issues with AI creating different scenarios in that instance? I don't mean different spawns/whatever, but physically doing different things, therefore using different CPU cycles?
 

DrMrLordX

Lifer
Apr 27, 2000
Forza Horizon 3, maybe?

https://forums.anandtech.com/threads/is-my-cpu-bottlenecking-forza-horizon-3.2491143/

Here's a relevant table for you:

[Image: f3_proz.png - Forza Horizon 3 CPU benchmark results]


Anyway, the 2600K @ 3.4 GHz is only doing 59 FPS min, so it's prolly not gonna hit 100 fps min @ 4.5 GHz (78.1 fps min). According to the 6700 results above, one running @ 4.8 GHz (100.2 fps min) will probably get you there, assuming that the 1080 will go there with you and otherwise perfect scaling. Regardless, you are looking at a ~20% increase in FPS moving to Skylake. That's significant.
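For what it's worth, those projected minimums are just linear clock scaling off the measured results, which is optimistic since frame rates rarely scale perfectly with frequency:

59 fps min @ 3.4 GHz x (4.5 / 3.4) ≈ 78.1 fps min @ 4.5 GHz

The 100.2 fps figure for the 6700 @ 4.8 GHz presumably comes from applying the same scaling to its measured result in the chart.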
 

MajinCry

Platinum Member
Jul 28, 2015
Won't you have issues with AI creating different scenarios in that instance? I don't mean different spawns/whatever, but physically doing different things, therefore using different CPU cycles?

You give Bethesda too much credit. Feel free to try it yourself; spawn 50 draugr and 50 bandits through the console, save, and watch your framerate. Once the battle ends, exit the game, load it back up and load the save.

Yer framerate should be damn near the same, so long as you keep your camera steady.
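If anyone actually wants to standardise that, a console batch file keeps it repeatable. A rough sketch for Skyrim; the base IDs are placeholders you'd have to look up yourself, and the lines starting with ; are just annotation:

; bench.txt - drop it in the game folder, run "bat bench" from the console
player.placeatme <draugr base ID> 50
player.placeatme <bandit base ID> 50
DisablePlayerControls

Run it, save immediately, then reload that exact save on every config you want to compare, keep the camera still, and log the framerate with whatever overlay you prefer.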
 

[DHT]Osiris

Lifer
Dec 15, 2015
You give Bethesda too much credit.

Might be more a credit to them, if varying scenarios don't have substantial impacts on the FPS of a given timeframe.

But fair enough, a modded 'thesda game would make a good test bench for CPU vs CPU, and GPU vs GPU comparisons, as well as throttling under x circumstances (high vRAM loads, heavy multithreading requirements, etc).
 

MajinCry

Platinum Member
Jul 28, 2015
Might be more a credit to them, if varying scenarios don't have substantial impacts on the FPS of a given timeframe.

But fair enough, a modded 'thesda game would make a good test bench for CPU vs CPU, and GPU vs GPU comparisons, as well as throttling under x circumstances (high vRAM loads, heavy multithreading requirements, etc).

GPU Benchmark? Set the AO quality to extreme (-1), AO scaling to 1.0, and enable indirect lighting. Bonus points if you just set everything to extreme quality.

CPU benchmark? Loads of spawns (AI, physics, draw calls, spell scripts), ugrids=9 (more AI, physics, draw calls, quest scripts, collision scripts, spawners, sound emitters, etc), and maybe ENB due to the draw call intercepts taking up a bit more CPU time.

Real single-threaded prowess is best measured with Morrowind and cranked-up MGE settings. It's compiled for x87 (SSE if you run the exeopt mod on it), is wholly single-threaded (Oblivion, by contrast, will make a little use of a second core), and runs like a dog once the draw calls start tipping towards a thousand.
 

Dave2150

Senior member
Jan 20, 2015
World of Warcraft (raiding, outdoor raids, mass outdoor PvP with many players), World of Tanks, GTA V with mods... you'd pretty much have to name the games where a Sandy Bridge doesn't bottleneck a 10-series GPU :p
 

bystander36

Diamond Member
Apr 1, 2013
I think you are going about this all wrong. Who cares what games exist that will bottleneck you if you aren't playing them? Are you bottlenecked now in the games you actually play? If not, don't upgrade. If you are, is the increase in FPS worth the upgrade cost to you?

Don't forget that in some games you may have turned settings up precisely so the GPU, not the CPU, was the limit; turn those back down when you're judging whether the CPU is actually holding you back. If you would never do that, then you don't have to worry about that game anyway.

I went from an i7 920 @ 3.8 GHz to an i7 5820K @ 4.4 GHz and saw a big improvement in many games, but I also turned settings down in order to get to a minimum of 75 FPS where possible.
 

Head1985

Golden Member
Jul 8, 2014
World of Warcraft (raiding, outdoor raids, mass outdoor PvP with many players), World of Tanks, GTA V with mods... you'd pretty much have to name the games where a Sandy Bridge doesn't bottleneck a 10-series GPU :p
+ All MMOs, Watch Dogs, Crysis 3 (the Welcome to the Jungle grass level), Rome: Total War, Total War: Rome II, Total War: Attila, Empire: Total War (just all Total War games), and pretty much all modded Bethesda games: Fallout 3, Fallout 4, Morrowind, Oblivion, Skyrim, Skyrim SE.
Mafia III was pretty hard on the CPU before the update, too.
 

amenx

Diamond Member
Dec 17, 2004
All this CPU talk on gaming performance... again focused on 1080p. How about CPU performance in games running at 1440p? Review sites to this day stick to 1080p as the only valid resolution for evaluating CPU gaming performance, for the supposed (and silly) reason that any resolution above that shows minimal difference. Maybe, but I want to see evidence of that across several titles at 1440p. For some people that may make the difference between choosing a CPU upgrade or a monitor upgrade.
 

moonbogg

Lifer
Jan 8, 2011
I would argue that any game that runs badly on a 2600K @ 4.5GHz will also run less than ideally on any CPU out there. If a game gets 40fps on a 2600K, then it's not going to be awesome on a 6700K either. It means the game is poorly threaded and runs like crap. The CPUs are not so far apart as to make a world of difference. If a game is bottlenecked by an OC'd 2600K, then a 6700K may barely be adequate itself. If a faster CPU than Skylake came out today, then people tomorrow would be saying how the 6700K is a bottleneck because the new CPU gets more FPS in certain situations.
I claim that the 2600K @ 4.5GHz and above is still an ideal gaming CPU. Sandy Bridge is beautiful. Say it.
 

RussianSensation

Elite Member
Sep 5, 2003
^Replace 2600K and 6700K with "i7 920/930 OC and 2600K". Using the argument you have outlined, you can just as easily claim that all games which drop to 40-50 FPS on an i7 920 OC compared to an i7 2600K OC are poorly optimized.

It makes no sense to me to buy 2015-2016 $650 GPU flagships (esp. in SLI) and a $1200 100-144Hz 1440p monitor, and then complain that a January 2011 CPU (!!!) is a bottleneck. Ya, that's how old the 2600K is.

If you held out this long and plan on getting 1080Ti(S), just get Kaby Lake 7700K or Skylake-X in 2H 2017, or discounted 6700K. The CPU bottleneck will only increase with a shift to 2017 GP102 1080Ti (or w/e it's called) and in 2018 to Volta.

However, if performance of Sandy is satisfactory for you, and you plan on skipping 2017 GPUs straight to Volta in 2018, then just hold out to Ice Lake.

I am actually not that thrilled about Kaby Lake or Skylake-X. The former is going to be like Devil's Canyon was to 4770K. The latter is an August 2015 Skylake architecture launching 2 (!!!) years late(r). When I did my research on Intel's roadmaps back in 2014-2015, I knew it was time to dump my Sandy while it had value and move to Skylake since everything until Ice Lake would be more of the same. That's exactly what's happening. I figured I can enjoy 6700K for 2-3 years before Ice Lake, so why wait?

Now you are going to run into a dilemma. If you buy Kaby Lake, Cannonlake, Coffee Lake, or Skylake-X, every single one of these will likely lack DMI 4.0, PCIe 4.0, and PCIe 3.0 x8 M.2. All of them will be last-gen architecture once Ice Lake launches. You won't feel as satisfied buying a 2-year-old Skylake-X architecture in 2H 2017 when you are basically paying 2017 prices for 2-year-old tech, just with more cores. And again, these will be "obsoleted" by Ice Lake in as little as 12-18 months from Q3-Q4 2017.

Until December 4th, MicroCenter has a 6700K for only $259.99. This is a great stop-gap between now and Ice Lake. i7 7700K will cost $329-349 and it will be short-lived since Cannonlake and 300 series chipsets are launching Q4 2017-Q1 2018. I guess it just depends on how long you want to keep your next CPU platform. If you want to use it for 5-7 years like Sandy, then I'd lean towards 6-core Skylake-X. If you don't mind reselling parts in 2-3 years, the deals on 6700K and Z170 boards are great right now!
 

IndyColtsFan

Lifer
Sep 22, 2007
^Replace 2600K and 6700K with "i7 920/930 OC and 2600K". Using the argument you have outlined, you can just as easily claim that all games which drop to 40-50 FPS on an i7 920 OC compared to an i7 2600K OC are poorly optimized.

It makes no sense to me to buy 2015-2016 $650 GPU flagships (esp. in SLI) and a $1200 100-144Hz 1440p monitor, and then complain that a January 2011 CPU (!!!) is a bottleneck. Ya, that's how old the 2600K is.

If you held out this long and plan on getting 1080Ti(S), just get Kaby Lake 7700K or Skylake-X in 2H 2017, or discounted 6700K. The CPU bottleneck will only increase with a shift to 2017 GP102 1080Ti (or w/e it's called) and in 2018 to Volta.

However, if performance of Sandy is satisfactory for you, and you plan on skipping 2017 GPUs straight to Volta in 2018, then just hold out to Ice Lake.

I am actually not that thrilled about Kaby Lake or Skylake-X. The former is going to be like Devil's Canyon was to 4770K. The latter is an August 2015 Skylake architecture launching 2 (!!!) years late(r). When I did my research on Intel's roadmaps back in 2014-2015, I knew it was time to dump my Sandy while it had value and move to Skylake since everything until Ice Lake would be more of the same. That's exactly what's happening. I figured I can enjoy 6700K for 2-3 years before Ice Lake, so why wait?

Now you are going to run into a dilemma. If you buy Kaby Lake, Cannonlake, Coffee Lake, or Skylake-X, every single one of these will likely lack DMI 4.0, PCIe 4.0, and PCIe 3.0 x8 M.2. All of them will be last-gen architecture once Ice Lake launches. You won't feel as satisfied buying a 2-year-old Skylake-X architecture in 2H 2017 when you are basically paying 2017 prices for 2-year-old tech, just with more cores. And again, these will be "obsoleted" by Ice Lake in as little as 12-18 months.

Until December 4th, MicroCenter has a 6700K for only $259.99. This is a great stop-gap between now and Ice Lake. i7 7700K will cost $329-349 and it will be short-lived since Cannonlake and 300 series chipsets are launching Q4 2017-Q1 2018. I guess it just depends on how long you want to keep your next CPU platform. If you want to use it for 5-7 years like Sandy, then I'd lean towards 6-core Skylake-X. If you don't mind reselling parts in 2-3 years, the deals on 6700K and Z170 boards are great right now!

Also, for those without an MC close by, Fry's was advertising the 6700K for $269. Not sure how long that deal lasts.

I hear what you're saying. I've been debating the 6700K ever since it launched. I held out because my 2600K still performed well and I wanted to see what BW-E offered. Well, IMO, BW-E was a huge disappointment, and I didn't see any major issues with my 2600K in the intervening months. That, and the fact that my gaming isn't what it used to be, makes me want to hold out for Ice Lake. Though I hope I am wrong, I am pretty sure I'm going to be disappointed in SKL-X, so unless my system dies, Ice Lake looks more and more likely for me. I won't be surprised if Cannonlake, Coffee Lake, and even Ice Lake are delayed further, however.
 