What are the best optimized PC games?


lamedude

Golden Member
Jan 14, 2011
1,206
10
81
Anything by id (except Rage's launch). Seeing some 80s PC ports makes you really appreciate Keen's smooth scrolling.
 

zaza

Member
Feb 11, 2015
130
1
0
I'm going to have to pretend that you haven't implied that Skyrim, with perfect V-sync no less, is smooth. Otherwise we've reached an impasse. So moving along, I'm taking perfect V-Sync "optimized" to mean completely smooth under perfect V-Sync. Looking through the games I currently have installed, there is only a tiny handful which I can say are perfectly smooth:

Max Payne 2 (MAX-FX 2)
Unreal, Unreal tournament (UE1)
UT2003/04 (UE2)
Unreal 2 (UE2)
Unreal Tournament 3 (UE3)
Far Cry (CryEngine 1)
Dead Space 3 (Visceral)

These games are very close to it:

FEAR: Would have been in the top list, and used to be one of my go-to examples of a perfectly smooth engine, but then Nvidia did something to their drivers a year or two ago and now there is some hitching. It's highly predictable, always happening in the same areas.

Battlefield 3: As mentioned before, stuttered at 60fps/60Hz until gametime.maxvariablefps 59.95 was applied. Hitches every now and then (pretty much a non-issue).

Assassin's Creed: DX10 is a stutterfest; DX9 is smooth with pre-rendered frames set to 1. Mouse input has pixel skipping even on the lowest sensitivity, resulting in pseudo-stutter.

Mafia II: At 60Hz/fps the engine is mostly smooth aside from occasional clusters of hitches that feel like caching hitches (go back through the same area and it's smooth). You can induce it by Alt+Tabbing out and back into the game, which seemingly flushes a bunch of data, and the game will stutter for a short while.

Age of Mythology: Very fine hitches intermittently.

The problem is that all this is highly dependent on hardware and software, so your experiences could be completely different. For example, on GTX 500 series cards UT3 hitches (albeit very subtly). UT3 also hitches if you have pre-rendered frames and/or the in-game "1 thread frame lag" setting set a certain way. Assassin's Creed DX10 used to not stutter, but some software change (probably in Nvidia's drivers) ruined it. Crysis has a stutter bug that only affects Nvidia users, and only appears in a select few areas in two maps (although I would have excluded Crysis anyway because the CPU bottleneck makes perfect V-Sync impossible in many areas).

There are so many variables, and the PC is an ever changing platform. It's hard to recommend anything with certainty.

This is my third attempt at this post. The first two descended into rambling tirades about all the games/engines I know of that aren't smooth. It's far harder to find examples of smooth engines than stuttering ones.

I'm seeing a pattern here. All the well optimized PC games are first person shooters.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
And when I say optimized I mean not just framerate and bugs, I mean those little stutters/freezes/whatever you call them that happen in game. If those are totally absent then the game is perfectly optimized.

These 3 games I know of are perfectly optimized:
Hard Reset
Need for Speed Most Wanted (2012)
Skyrim (without mods)

Someone give me more optimized games to play, please!

This is not what optimized means.

Optimized code is code which achieves a specific output from a specific input in the least number of CPU cycles; unoptimized code is code which takes some input and gives an output but could be improved to do it faster.
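
To make the distinction concrete, here's a toy sketch (Python, function names made up purely for illustration): both functions take the same input and return the same output, but one does it in far fewer operations. That difference is optimization, regardless of how fast either one feels in absolute terms.

import timeit

def sum_of_squares_slow(n):
    # unoptimized: O(n) loop over every value
    total = 0
    for i in range(n):
        total += i * i
    return total

def sum_of_squares_fast(n):
    # optimized: closed-form formula for 0^2 + 1^2 + ... + (n-1)^2
    return (n - 1) * n * (2 * n - 1) // 6

assert sum_of_squares_slow(10000) == sum_of_squares_fast(10000)
print(timeit.timeit(lambda: sum_of_squares_slow(10000), number=1000))
print(timeit.timeit(lambda: sum_of_squares_fast(10000), number=1000))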

Low frame rates may be attributable to badly optimized code, but that's not necessarily the case; they might be low for other reasons, the primary one being that what you're calculating is complex and takes a lot of CPU cycles to begin with.

You're conflating performance with optimization.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I'm going to have to pretend that you haven't implied that Skyrim, with perfect V-sync no less, is smooth. Otherwise we've reached an impasse. So moving along, I'm taking perfect V-Sync "optimized" to mean completely smooth under perfect V-Sync. Looking through the games I currently have installed, there is only a tiny handful which I can say are perfectly smooth:

Without mods, Skyrim does run pretty smoothly on Nvidia hardware. Of course, Skyrim was used as a case study of how AMD cards produced runt frames, so if you were using AMD or modded up, you likely had a different experience.
 

Zodiark1593

Platinum Member
Oct 21, 2012
2,230
4
81
I'm seeing a pattern here. All the well optimized PC games are first person shooters.

You would probably be correct. Most highly optimized engines are produced by AAA developers, as they have the manpower and resources to do so, and it is most often shooters that sell best, or are at least the safest bet to produce. An AAA shooter that runs like crap on most mainstream gaming systems would of course generate bad publicity, as would poor visuals nowadays.

Also, indie developers of even fantastic games rarely possess the manpower to produce their own engines, let alone optimize them as well as possible.

I am hoping for good things from Unreal Engine 4 in the future, as far as indie development goes anyway.
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
You would probably be correct. Most highly optimized engines are produced by AAA developers, as they have the manpower and resources to do so, and it is most often shooters that sell best, or are at least the safest bet to produce. An AAA shooter that runs like crap on most mainstream gaming systems would of course generate bad publicity, as would poor visuals nowadays.

Also, indie developers of even fantastic games rarely possess the manpower to produce their own engines, let alone optimize them as well as possible.

I am hoping for good things from Unreal Engine 4 in the future, as far as indie development goes anyway.

An FPS almost by definition has to run smoothly or it would be useless. We can and do accept less-than-smooth frame rates from other genres because those games are still playable, albeit with annoyances.

What I don't understand is how I'm reading that a game like Fallout 4 uses Gamebryo, which was built for Xbox 360/PS3-era consoles, and that its engine is poorly multithreaded on PC.

Those consoles had 3 and 7 cores respectively and any attempt to "max out the power of the console" would have necessarily required intense multithreading. In my mind the entire game (no pun intended) of optimizing for a console was to figure out how to get 3 or 7 underpowered cores to function like 1 or 2 powerful cores. Multiple weak cores was a more efficient use of silicon which allowed consoles to hit lower price points while offering high theoretical performance.

Yet when these games are ported to PC, all of a sudden they are "poorly optimized for multiple cores." We find the game pegging a single core to its limit. All of a sudden these games are, for all intents and purposes, single-threaded.

If anything consoles went multicore before mainstream PCs did. So why is it that those older games perform so poorly on today's multiple powerful cores when they ran fine on yesterday's multiple weak cores?

I'm really at a loss to explain this.
 

BSim500

Golden Member
Jun 5, 2013
1,480
216
106
Cmon you guys are holding out. Give me some gamez!!
There are literally hundreds / thousands of games that run smoothly, from Doom to Deus Ex to Dragon Age Origins to Dishonored to Don't Starve, etc. You need to spell out exactly what games / genres you like to give people a chance of coming up with something relevant. Old or new? Just FPS or other genres? Are old games which originally stuttered on old hardware at time of launch but run fine now due to patches / better hardware acceptable? Are you talking about only performance-related stutter, or are you including non-performance-related stutter specific to certain engines? Stutter on both GPU brands or just one? Stutter in general, or on your system in particular?

As people have said, stutter and optimization are not interchangeable. Some recent engines stutter occasionally no matter what due to the way lazy devs code predominantly for the consoles and then leave console defaults in the PC port that "stream" data as if it were a last-gen console reading directly off an optical disc. I've found modern Unreal Engine 3/4 games to sporadically stutter no matter what, whereas earlier UE1 (Deus Ex) and UE2 (Bioshock 1-2) games were silky smooth even on much slower hardware. The same goes for the Unity engine: even low-budget indie first-person horror games will often run horribly due to the awful Unity defaults being left in. That side of things is down to optimization. But other stutter / micro-stutter is often I/O related, i.e. a game may stutter more on a 5,400rpm HDD than on an SSD, or it may be down to Crossfire / SLI issues. Or you've got something running in the background. Or it's simply "that kind of game", like Oblivion / Skyrim, where you'll never really get 100% flawless 60fps minimums due to the nature of open-world games that divide the world into invisible cells and then "stream in" a large amount of data all at once when loading new cells as you travel across the map. Some games stutter badly with in-game VSync, but disabling that and forcing VSync in the GFX driver works fine.
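
To illustrate that cell-streaming point, here is a toy Python sketch (not any real engine's code; the 50 ms load time is made up): if the new cell's data is loaded on the same thread that renders frames, that frame takes as long as the disk read, which is exactly the hitch described above; pushed to a background thread, frame times stay flat.

import threading, time

def load_cell():
    time.sleep(0.05)                       # pretend disk read + decompress takes 50 ms

def frame_blocking():
    load_cell()                            # main thread stalls: one ~66 ms frame = visible hitch
    time.sleep(0.016)                      # normal ~16 ms of rendering work

def frame_streamed():
    threading.Thread(target=load_cell).start()   # stream the cell in on a background thread instead
    time.sleep(0.016)                      # rendering stays at ~16 ms per frame

for frame in (frame_blocking, frame_streamed):
    start = time.perf_counter()
    frame()
    print(frame.__name__, round((time.perf_counter() - start) * 1000), "ms")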

Random stutter is often highly system specific. E.g. my secondary rig is an HTPC / light gaming rig. It used to have a 7790 GFX card in it that stuttered like crazy during the first 5 seconds of gameplay in every single UE3 game (Bioshock Infinite, Dishonored, etc). Since then, the card has been side-graded to a 750Ti (barely +10% faster) and all that stutter has disappeared. The stutter wasn't a "performance on a low-end card" issue as I initially believed; it was down to default game engine settings differing for AMD vs Nvidia. And such engines can often be tweaked via config settings files.

Eg for UE3 engine games ini file settings:-
bUseBackgroundLevelStreaming=True
DisableATITextureFilterOptimizationChecks=True/False
UseMinimalNVIDIADriverShaderOptimization=True/False

For Deus Ex Human Revolution:-
In the registry, locate "AllowJobStealing" and change to 0
For NVIDIA owners : locate "\Graphics\AtiForceFetch4" and change to 0

Skyrim:-
There are far too many ini tweaks to list (bUseThreaded*, decals, shadows, shadow map resolution, LOD / view distance, etc), but the key is to use common sense. By all means install some graphical mods, but don't cram everything in there and then complain about stutter. E.g. on low-VRAM cards, use 2K instead of 4K HD texture packs. Install one decent weather & lighting combo rather than half a dozen conflicting ones (personally, I use ELFX + PureWeather but no ENBs). I also use injected SMAA instead of in-game MSAA / FXAA. End result: people have been impressed at how well it runs on a 750Ti (near-constant 60fps), whereas some GTX 970s are plagued with sub-40fps stutter purely from the owner getting so carried away with extreme modding that the engine ends up over-saturated.

Playing around with stuff like the above can significantly reduce stutter. And you get to play some great games instead of writing them off on the back of unrealistic expectations of "infallibly perfect 60fps min during every second of game-time". Someone said earlier that Age of Mythology stuttered. I've had that game since it came out (and still have the 2002 retail discs) and haven't seen any such stutter that wasn't related to stuffing in too many units on a slow CPU (the game is single-threaded). You said Bioshock Infinite is still broken. Mine isn't. Early game stuttered like crazy at specific "stream in" invisible checkpoints on the map. One patch improved it, but the later patch has actually eliminated most of it. Now the game runs better on an i3 post-patch than it did on an i7 pre-patch.

Other examples: if you play with VSync on, then Triple Buffering may make the difference when the GPU slips from 60fps to 58fps: either a 2fps drop (TB on) or a fall to 30fps (TB off). D3DOverrider may fix such problems in games that don't support Triple Buffering. Some people close their web browsers (or at least Flash-heavy tabs) before gaming, others don't. Some GFX card + driver combos can glitch by briefly dropping GPU clock frequencies to idle whilst gaming (one of several reasons why I ditched the 7790 for a 750Ti); many others don't. Your choice of 4-8x MSAA anti-aliasing may hit a VRAM limit before someone else's choice of FXAA / SMAA (which may use 200-300MB less VRAM, giving them 10-15% more headroom on a 2GB card), etc. So many different variables like that can mean that a game that stutters for you on your system may not stutter for someone else on theirs.
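
For anyone wondering where that "2fps drop vs a fall to 30fps" comes from, here's the rough arithmetic (assumptions: a 60Hz display, strict V-Sync, and a GPU that needs just over one refresh interval per frame):

import math

refresh = 1.0 / 60          # 16.7 ms per refresh at 60Hz
frame_time = 1.0 / 58       # the GPU needs ~17.2 ms per frame

# Double buffering: a frame that misses a refresh waits for the next one,
# so each frame occupies two whole refresh intervals -> 30fps.
double_buffered_fps = 1.0 / (math.ceil(frame_time / refresh) * refresh)

# Triple buffering: the GPU keeps rendering into a third buffer instead of
# stalling, so the delivered rate stays close to what the GPU can produce.
triple_buffered_fps = 1.0 / frame_time

print(round(double_buffered_fps), "fps vs", round(triple_buffered_fps), "fps")   # 30 vs 58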
 

JimmiG

Platinum Member
Feb 24, 2005
2,024
112
106
Definitely at release, but since then Skyrim has had a lot of optimizations, both within the game itself and in drivers. So it's pretty well optimized now.

Skyrim was horrible at launch. I remember they shipped it with some debug flag accidentally set or whatever, causing it to run much slower, especially on slower CPUs. Some modder then managed to hack it for a 50% frame-rate increase; this was later fixed with a patch. The biggest reason it runs so well now is that it came out 4 years ago. Most people are now running hardware that would have been unbelievably powerful back then.

Some games that run great and manage to look good are IMO The Vanishing of Ethan Carter, Elite Dangerous and Euro Truck Simulator 2.
 

BSim500

Golden Member
Jun 5, 2013
1,480
216
106
Explain the Triple Buffer thing. I remember Koroush Ghazi recommending D3DOverrider enforced Triple Buffering for Skyrim, but I never understood why. The game clearly already uses some kind of triple buffering when V-Sync is enabled, just watch what the framerate does when it drops below a 60Hz refresh. It doesn't drop to 30fps. They couldn't have not included TB in a game released with V-Sync forced on by default.
Sorry, I should have worded that better. I didn't mean Skyrim in particular has a Triple Buffering problem, just that some games in general can be fixed via D3DOverrider, which may solve one or two of the OP's other "problem games".
 

Oyeve

Lifer
Oct 18, 1999
22,005
863
126
The only game in recent history that has never crashed, hiccuped, lagged, or frozen has been Mad Max. I have around 120 hours in that game and it has worked perfectly since the day-one install.
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
An FPS almost by definition has to run smoothly or it would be useless. We can and do accept less-than-smooth frame rates from other genres because those games are still playable, albeit with annoyances.

What I don't understand is how I'm reading that a game like Fallout 4 uses Gamebryo, which was built for Xbox 360/PS3-era consoles, and that its engine is poorly multithreaded on PC.

Those consoles had 3 and 7 cores respectively and any attempt to "max out the power of the console" would have necessarily required intense multithreading. In my mind the entire game (no pun intended) of optimizing for a console was to figure out how to get 3 or 7 underpowered cores to function like 1 or 2 powerful cores. Multiple weak cores was a more efficient use of silicon which allowed consoles to hit lower price points while offering high theoretical performance.

Yet when these games are ported to PC, all of a sudden they are "poorly optimized for multiple cores." We find the game pegging a single core to its limit. All of a sudden these games are, for all intents and purposes, single-threaded.

If anything consoles went multicore before mainstream PCs did. So why is it that those older games perform so poorly on today's multiple powerful cores when they ran fine on yesterday's multiple weak cores?

I'm really at a loss to explain this.

The answer is one part different hardware, one part limited API, and one part laziness. The 360 and the PS3 used IBM PowerPC and Cell CPU architectures, respectively, different than AMD and Intel's x86 architecture used in PCs (extremely so in the Cell's case). Multithreading techniques used on the consoles wouldn't necessarily work on PC, especially if the API wouldn't cooperate. Developers were able to make multithreading optimizations work on consoles because they had very flexible, "close to the metal" access to how the hardware was rendering the game. On PC, DirectX eschewed such close access in favor of broad compatibility and ease of programming, making it much harder for developers to get those optimizations to work on PC, if it was possible at all. Compounding these problems was probably a sense of laziness. PC players don't have to rely on developers to eke out every last bit of performance in the same way console players do, after all. If a game doesn't run well on a player's PC, they have the option to upgrade. They aren't locked in to a single set of specifications and can have a great experience if they pay enough money, so why should the developer put the extra time and resources in on their end?
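
A back-of-envelope way to see why such a port ends up "pegging a single core": if the slice of each frame that old DirectX effectively kept single-threaded (draw-call submission, say) is large, extra cores barely help. This is just Amdahl's law; the 60% figure below is made up purely for illustration.

def speedup(cores, serial_fraction):
    # Amdahl's law: only the parallel portion of the frame scales with core count
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

for cores in (1, 2, 4, 8):
    print(cores, "cores ->", round(speedup(cores, 0.60), 2), "x faster")
# 1 -> 1.0x, 2 -> 1.25x, 4 -> 1.43x, 8 -> 1.54x: from Task Manager it looks "single threaded"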

A couple things have happened with the 8th generation of consoles to counteract this. The consoles use x86 processors, and PC developers are starting to use more close-to-the-metal APIs like Mantle, Vulkan, and DirectX 12. This should make bringing over multithreading optimizations from consoles much easier. But the laziness part...that will come down to each individual developer. Some developers won't rest until they've eked all the performance they can out of optimizations for PC, and some don't care. Bethesda...has never really belonged to the former group, unfortunately.

For the record though, Fallout 4 does not use Gamebryo. It uses the "Creation Engine", which was first used with Skyrim. It's probably descended from Gamebryo, but is not itself Gamebryo.
 

zaza

Member
Feb 11, 2015
130
1
0
The only game in recent history that has never crashed, hiccuped, lagged, or frozen has been Mad Max. I have around 120 hours in that game and it has worked perfectly since the day-one install.

I look forward to playing it :)
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
The answer is one part different hardware, one part limited API, and one part laziness. The 360 and the PS3 used IBM PowerPC and Cell CPU architectures, respectively, different than AMD and Intel's x86 architecture used in PCs (extremely so in the Cell's case). Multithreading techniques used on the consoles wouldn't necessarily work on PC, especially if the API wouldn't cooperate. Developers were able to make multithreading optimizations work on consoles because they had very flexible, "close to the metal" access to how the hardware was rendering the game. On PC, DirectX eschewed such close access in favor of broad compatibility and ease of programming, making it much harder for developers to get those optimizations to work on PC, if it was possible at all. Compounding these problems was probably a sense of laziness. PC players don't have to rely on developers to eke out every last bit of performance in the same way console players do, after all. If a game doesn't run well on a player's PC, they have the option to upgrade. They aren't locked in to a single set of specifications and can have a great experience if they pay enough money, so why should the developer put the extra time and resources in on their end?

A couple things have happened with the 8th generation of consoles to counteract this. The consoles use x86 processors, and PC developers are starting to use more close-to-the-metal APIs like Mantle, Vulkan, and DirectX 12. This should make bringing over multithreading optimizations from consoles much easier. But the laziness part...that will come down to each individual developer. Some developers won't rest until they've eked all the performance they can out of optimizations for PC, and some don't care. Bethesda...has never really belonged to the former group, unfortunately.

For the record though, Fallout 4 does not use Gamebryo. It uses the "Creation Engine", which was first used with Skyrim. It's probably descended from Gamebryo, but is not itself Gamebryo.

It would seem that DirectX not exposing the hardware would be a main factor. But it also seems that the Xbox 360 "suffered" from that problem also:

http://www.eurogamer.net/articles/digitalfoundry-directx-360-performance-blog-entry

If they could get their games to max out 3 cores on the Xbox 360 with DirectX, surely they could have done something with 4 x86 cores. That leaves developer laziness and a total disdain for the PC market as the main reasons. It's almost as if, after they make a game, they just do the minimum possible to "make it run" on a PC and take in some marginal revenue.

Thankfully things are looking better for PC and it seems developers are starting to take it more seriously.


Regarding the topic:

For a well-optimized PC game today, try Dirt Rally. The game is PC-only for the time being, meaning they are optimizing fully for PC first and will gimp it for the consoles afterwards - as it should be done. It is in Early Access, and if you have a problem with that business model then don't look. But this game is turning out to be an argument for the Early Access model done right. It looks beautiful and runs very well on my PC.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Explain the Triple Buffer thing. I remember Koroush Ghazi recommending D3DOverrider enforced Triple Buffering for Skyrim, but I never understood why. The game clearly already uses some kind of triple buffering when V-Sync is enabled, just watch what the framerate does when it drops below a 60Hz refresh. It doesn't drop to 30fps. They couldn't have not included TB in a game released with V-Sync forced on by default.

If you use multiple GPUs, you always get some form of triple buffering, like it or not. Are you, or were you, using triple buffering with Skyrim?

Also, triple buffering is only useful with V-sync on when you are not reaching your refresh rate. It doesn't change anything without V-sync, and if you have it on at your refresh rate, it adds a frame of latency.
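
A rough illustration of that last point (assumptions: 60Hz, the GPU outrunning the refresh rate, and the common flip-queue style of "triple buffering" where finished frames wait in line rather than being replaced):

refresh_ms = 1000 / 60                      # 16.7 ms per refresh at 60Hz
for name, queued_frames in (("double buffering", 1), ("triple buffering", 2)):
    # each finished frame waits roughly one refresh interval per queued frame
    # before it reaches the screen (worst case, very approximately)
    print(name, "->", round(queued_frames * refresh_ms, 1), "ms before display")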