I run HWMonitor at all times with CPU and GPU usage data, including individual threads, and I also switch various OC and underclock settings on the fly, so I know what the bottleneck is. I will upgrade to a strong CPU and GPU in half a year, but for now I have a fairly low-end rig: an i3-3220 (3.3 GHz) and an overclocked AMD HD 6850.
I have a 2560x1440 monitor, and I game on it.
I run Oblivion with heavy CPU-intensive mods, so I get below 100% GPU usage with 4x adaptive AA.
I play X3: Albion Prelude, which never gets above 60% GPU usage with 8x AA; however, the ship models and laser bullet trajectories hog the CPU down to 20 fps in large battles.
Old games, such as System Shock 2, NFS 4 and 5, Rune, and Thief 2, run very well at 1920x1440 with 8x SSAA. That is the brute-force AA that renders the scene at a multiple of the output resolution and downsamples it; it gives great clarity on the interior of textures and on all the foliage.
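To put rough numbers on why only older games can afford this, here is a quick back-of-the-envelope sketch in Python (assuming, as a simplification, that GPU cost scales with the number of shaded samples per frame):

```python
# Rough sample-count math for supersampling (SSAA).
# Assumption: GPU cost scales roughly with shaded samples per frame.
width, height = 1920, 1440
ssaa = 8  # 8x SSAA = 8 shaded samples per output pixel

native_pixels = width * height
shaded_samples = native_pixels * ssaa

print(f"native pixels per frame:  {native_pixels:,}")   # 2,764,800
print(f"shaded samples per frame: {shaded_samples:,}")  # 22,118,400
# ~22 million samples per frame - fine for a late-90s renderer,
# far too heavy for a modern AAA engine on an HD 6850.
```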
I also play semi-old games (Guild Wars 1, Unreal Tournament 2004, Dungeon Siege 2) with 4x SSAA at close to 100% GPU usage, but staying at 60 fps 90% of the time.
Half-Life 2 Cinematic Mod v12, a graphical improvement mod with 4K textures and more complex, detailed environments, and the HL2 Black Mesa mod both give 60 fps with 2x AA.
Regarding semi-new games, it's OK: I get around 45 fps in Skyrim on high details with AA off, which actually runs better than Oblivion, which sucks everything out of the CPU.
GTA 4 and TDU 2 run at 50 fps with AA off and medium details.
Fallout 3 on max settings with 2x AA and graphical mods hits max usage on both the CPU (the game threads across about 1.5 cores) and the GPU, averaging 55 fps with 40 fps minimums (caused by the CPU).
Anno 1404 is GPU-bottlenecked at first, giving 45 fps at the start of a game; however, that becomes irrelevant midgame, as large cities make it CPU-bottlenecked and it easily drops to 30 fps.
It's similar with StarCraft 2: I get 40 fps because of the GPU bottleneck most of the time, but in large battles, where fps matters most, I get below that because the CPU can't keep up with the unit count.
Tomb Raider 2013 runs at 40 fps on high settings, which is quite playable; I definitely wouldn't need an R9 290X to get 60 fps, as some would quickly recommend.
Indie games usually run at 60 fps, like Amnesia: The Dark Descent at medium-high settings. Defense Grid and the War for the Overworld beta dip below 60 fps, but that is actually because of a CPU bottleneck.
I know what you are thinking: what about BF4, or Crysis 3? "Try that and enjoy your 10 fps!" My point is, why can't people enjoy all the older games at a high resolution? As a personal note, I do not enjoy almost any of the new AAA titles, like Assassin's Creed, BF4, Thief 2014, Batman: Arkham Knight, CoD, or the new NFS titles.
I see that as soon as something above 1080p is suggested, people go crazy with GPU suggestions. To give an exact number, 2560x1440 gives about 40% fewer fps than 1920x1080; check it with a calculator against the benchmarks in any GPU review. To put it the other way around, 1920x1080 is about 65% faster, i.e., 50 vs 30 fps.
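That figure lines up with simple pixel-count math (assuming, as a first approximation, that fps scales inversely with the number of rendered pixels):

```python
# Pixel-count math behind the "~40% fewer fps at 1440p" figure.
# Assumption: fps scales roughly inversely with rendered pixels.
qhd = 2560 * 1440   # 3,686,400 pixels
fhd = 1920 * 1080   # 2,073,600 pixels

ratio = qhd / fhd            # ~1.78x more pixels at 1440p
fps_drop = 1 - fhd / qhd     # ~0.44 -> up to ~44% fewer fps
speedup = qhd / fhd - 1      # ~0.78 -> up to ~78% faster at 1080p

print(f"1440p has {ratio:.2f}x the pixels of 1080p")
print(f"worst case: {fps_drop:.0%} fewer fps at 1440p")
print(f"equivalently: 1080p up to {speedup:.0%} faster")
# Real benchmarks land a bit below this worst case (~40% / ~65%),
# since not all GPU work scales with resolution.
```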
As crazy as it sounds, even with a lowly HD 6850 at 2560x1440, a CPU upgrade will benefit me more than a GPU upgrade in the games I play or intend to play. I want to raise the minimums in Oblivion and X3 from 20 fps to 30 fps, and to get 60 fps in War for the Overworld when it's released.
A faster single-threaded CPU will also decrease loading times. I have an SSD, and I monitor what the CPU and drive are doing during load times; a single CPU thread is actually busy at 100% usage decompressing game data.
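For anyone who wants to check this on their own machine, here is a minimal sketch of the kind of per-core monitoring I'm describing, using Python's psutil purely as an illustration (I use HWMonitor for the same thing):

```python
# Minimal per-core usage logger; run this while a game is loading.
# If one core sits near 100% while the disk is mostly idle, the
# load time is bound by single-threaded decompression, not the SSD.
import psutil

for _ in range(15):  # sample once per second for ~15 seconds
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    disk = psutil.disk_io_counters()
    busiest = max(per_core)
    print(f"cores: {per_core}  busiest: {busiest:.0f}%  "
          f"disk reads: {disk.read_bytes:,} bytes cumulative")
```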
