Using outdated hardware, why do new games look worse than old ones?

dguy6789

Diamond Member
Dec 9, 2002
A phenomenon I have noticed: take an older card that used to be high end and run an older game that it can just barely play smoothly at max details. That game will look better than a recent game with the graphics turned down far enough to be playable on the same card.

Example: Radeon 9700 Pro. It could run Half-Life 2, Doom 3, and Far Cry at 1024x768 with max details and pull 30fps or higher. It looked amazing in its day and still looks very decent. If I were to run Crysis or something newer on this 9700 Pro, turning the settings down enough for the card to handle it would almost certainly make the game look worse than the three aforementioned titles.

The video card doesn't lose the ability to handle graphics over time, but games look worse and worse on it over time. Is this due to lazy developers or what?

Just something I was thinking about.
 

InflatableBuddha

Diamond Member
Jul 5, 2007
I think this has to do with older cards' lack of support for newer graphical standards (e.g. SM 3.0). As new graphics cards come out, they're not just faster, but they support new features that only newer games use.

Thus, on the newer games, you need to turn down settings and disable some of the newer graphical features since older cards don't support them.
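To make it concrete, here's a minimal sketch of the kind of capability check a Direct3D 9-era engine might run at startup. GetDeviceCaps and the D3DPS_VERSION macro are real D3D9 API; the Load* functions are hypothetical stand-ins for whatever shader loaders a given engine actually uses:

[code]
// Sketch: picking a shader path from the card's reported caps (Direct3D 9).
// GetDeviceCaps and D3DPS_VERSION are real D3D9 API; the Load* functions
// below are hypothetical stand-ins for an engine's shader loaders.
#include <d3d9.h>

void LoadSM3Shaders()    { /* HDR, long shaders, dynamic branching */ }
void LoadSM2Shaders()    { /* what a Radeon 9700 Pro reports */ }
void LoadFixedFunction() { /* DX8-class fallback, if the game ships one */ }

void SelectShaderPath(IDirect3DDevice9* device)
{
    D3DCAPS9 caps;
    device->GetDeviceCaps(&caps);

    if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0))
        LoadSM3Shaders();
    else if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        LoadSM2Shaders();
    else
        LoadFixedFunction();   // or refuse to run, as many newer games do
}
[/code]

A 9700 Pro reports pixel shader 2.0, so any SM 3.0-only effect in a newer game is simply never loaded for it, no matter where the sliders are set.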
 

dguy6789

Diamond Member
Dec 9, 2002
Makes sense. I guess it really does come down to developers not feeling like coding for SM 2.0, or whatever older feature path the card supports that a newer game drops.
 

brittlenet

Junior Member
May 9, 2002
In a more general sense, the developers aren't optimizing image quality for the low end, because it's the high-end sexiness that drives good reviews, which lead to sales.
 

dflynchimp

Senior member
Apr 11, 2007
The reason's pretty obvious. Current games are designed to be run with at least a reasonable amount of eye candy turned on. If every customer turned their graphical settings down, it would defeat the producer's original intent of pushing the graphical envelope (and discrete GPU sales).

You could also look at things this way.

Each game has several components, or layers, of rendering: it starts with polygons, then meshes and textures go over them, and finally shaders are applied on top. For in-game interactivity such as physics, the engine must be designed to run either at full power when resources suffice, at medium speed with some features disabled, or with the system turned off entirely to save resources; a rough illustration follows below.
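As a sketch of those tiers (the names and numbers here are invented for the example, not pulled from any real engine), the trade-off table ends up looking something like this:

[code]
// Sketch: a hypothetical quality-tier table of the sort described above.
// Tier names and numbers are invented for illustration.
#include <cstdio>

struct FeatureTier {
    const char* name;
    int   physicsSubsteps;   // 0 = interactive physics effectively off
    bool  ragdolls;
    bool  dynamicShadows;
    float textureDetail;     // fraction of the full mip chain kept resident
};

static const FeatureTier kTiers[] = {
    { "High",   4, true,  true,  1.00f },  // full power, all layers enabled
    { "Medium", 2, true,  false, 0.50f },  // some features disabled
    { "Low",    0, false, false, 0.25f },  // stripped down to stay playable
};

int main() {
    for (const FeatureTier& t : kTiers)
        std::printf("%-6s substeps=%d ragdolls=%d shadows=%d texdetail=%.2f\n",
                    t.name, t.physicsSubsteps, (int)t.ragdolls,
                    (int)t.dynamicShadows, t.textureDetail);
}
[/code]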

On the most basic level, modeling (polygon count and geometric calculations), we know that a game such as Crysis sports a base polygon count many times denser than that of Half-Life 2, Doom 3, or Far Cry at any given resolution. By default this puts more strain on the hardware, regardless of specific optimizations.
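A quick back-of-envelope shows why density matters even before shaders enter the picture. The ~325M triangles/s figure is the peak setup rate ATI quoted for the 9700 Pro; the per-scene triangle counts are illustrative guesses, not measured numbers:

[code]
// Back-of-envelope: how scene polygon counts eat into a card's triangle budget.
// 325M tris/s is ATI's quoted peak setup rate for the 9700 Pro; the per-scene
// triangle counts are illustrative guesses, not measured numbers.
#include <cstdio>

int main() {
    const double setupRate = 325e6;  // triangles per second, theoretical peak

    struct Scene { const char* era; double trisPerFrame; };
    const Scene scenes[] = {
        { "2004-era scene (hundreds of thousands of tris)", 3e5 },
        { "2007-era scene (millions of tris)",              2e6 },
    };

    for (const Scene& s : scenes) {
        // fps ceiling if triangle setup were the only bottleneck (it never is)
        std::printf("%s -> setup-limited ceiling ~%.0f fps\n",
                    s.era, setupRate / s.trisPerFrame);
    }
}
[/code]

Triangle setup alone rarely limits the frame rate, but every one of those extra triangles also has to be shaded and textured, which is where the card actually runs out of steam.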

Texture sizes have also exploded in recent years as GPUs are endowed with more and more RAM. The 9700 Pro had a paltry (by today's standards) 128 MB of DDR memory, which also sported much lower bandwidth, so games developed in the same era were coded to maximize but not exceed those specs. A game such as Crysis, designed to take advantage of 512 MB or more of GDDR3/GDDR4/GDDR5 memory, not only has quadruple (sometimes more) the memory buffer available to load textures into, but also far more bandwidth headroom courtesy of the faster graphics RAM and the PCI-E x16 (and 2.0) interface.

Obviously, for a game designed to indulge in excess spending of system resources to run on an older system with a fraction of the computing power, severe compromises have to be made that weren't part of its intended programming. This probably results in some textures not even being loaded for lack of space, and for a lot of models that rely on many small, intricate textures for their eye candy, that is devastating.
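Some equally rough math on the texture side (the texture count and resolutions here are assumptions for illustration, not figures from any shipping game):

[code]
// Back-of-envelope: VRAM cost of a game's texture set at two asset sizes,
// against the 9700 Pro's 128 MB versus a 512 MB card. The texture count and
// resolutions are assumptions for illustration, not from any shipping game.
#include <cstdio>

// Bytes for a square DXT1-compressed texture (0.5 bytes/texel) including a
// full mip chain, which adds roughly a third on top of the base level.
double textureBytes(int size) {
    return size * (double)size * 0.5 * (4.0 / 3.0);
}

int main() {
    const int    textureCount = 400;           // assumed unique textures in view
    const double MB           = 1024.0 * 1024.0;

    std::printf("512x512 assets:   ~%4.0f MB total (fits 128 MB, with room "
                "for framebuffers)\n", textureCount * textureBytes(512) / MB);
    std::printf("2048x2048 assets: ~%4.0f MB total (blows past even 512 MB; "
                "mips or maps must be dropped)\n",
                textureCount * textureBytes(2048) / MB);
}
[/code]

That's the mechanism behind textures not even being loaded: the engine drops the top mip levels or whole maps to fit the budget, which is exactly the blur you see when you force a new game onto an old card.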

Physics and shadows didn't even used to be a major issue. I remember when Doom 3 came out with its GPU-killing "Ultra graphics" setting, and the shadows and shading were considered revolutionary. Soon afterwards those effects became a commodity, then quickly outdated in light of soft shadows, enhanced ambient lighting, and all the other shading gimmicks we see today. Physics was also a big selling point of Half-Life 2, although by today's standards it was pretty simplistic and wouldn't hold a candle to Nvidia's PhysX (although Havok has come a long way since then). These operations are several orders of magnitude more resource-heavy in their current forms than in their initial 9700 Pro-era states.

The answer is that older games were built around older graphics utilities and hardware, and can obviously be expected to run on them with as little wasted resource as possible. But when older hardware is paired with modern code, some of it far outside the old hardware's design envelope or completely beyond what its architects envisioned, a lot of it simply won't run, and the slim-fast version of a modern game ends up looking worse than the older games.