Tomb Raider 2013 still looks fantastic

Red Hawk

Diamond Member
Jan 1, 2011
3,237
144
106
So I recently played the heck out of Star Wars Jedi Fallen Order, thoroughly enjoyed it. That game got compared to a lot of other games, one of them being the modern Tomb Raider games for the map interaction, exploration and set pieces. I bought Tomb Raider 2013 back when it came out in 2013, never played it much though. But one thing I always appreciated about it was the advanced effects on display. It came out at the tail end of the 360/PS3's life cycle, when some developers were getting their feet wet with DirectX 11 and trying out the effects that would be used on the XB1/PS4. And I gotta say, the added detail is greatly appreciated. The lighting is really well executed and perfectly complements the early game sequences at night, with lots of torches and fire illuminating the environments. Environmental detail and foliage are nice and dense. Geometry and tessellation on objects and characters are enhanced. High quality shadows and depth of field are used. The vaunted "TressFX" on Lara's hair is a nice touch.

Does it look as refined as a modern game? Certainly not; by comparison Jedi Fallen Order has a more advanced lighting model, plus water and cloth physics. It has hair effects applied to more than just the main character (and without any proprietary code from Nvidia or AMD). Characters and objects do look a bit more plastic-y than in modern games, as the game predates the advent of physically based rendering (PBR) for materials. Textures are lower res. However, that serves to give the game more headroom for the built-in supersampling mode. It's something AMD was pushing at the time, also seen in Sleeping Dogs (another game published by Square Enix, coincidentally). It lets you clean the jaggies out of the image without blurring things up with post-process AA, and without resorting to graphics driver tricks like Virtual Super Resolution. It goes particularly well with the TressFX hair, keeping it clean and distinct. Of course it was prohibitively performance intensive at the time, but modern GPUs take it in stride. On my 5700 XT I get between 70 and 90 FPS with everything maxed out, and the game looks really nice. It holds up a lot better than games from the same time that settled for DirectX 9 level effects no better than the consoles'...I'm looking at you, Mass Effect 3.
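To put rough numbers on why that supersampling was so expensive back in 2013, here's a back-of-the-envelope sketch (my own illustration, assuming shading cost scales roughly with pixel count; the game's exact SSAA factors may differ):

```python
# Supersampling at 2x per axis quadruples the number of shaded pixels.
# Toy arithmetic, assuming cost scales roughly linearly with pixel count.

def pixels(width, height):
    return width * height

native = pixels(1920, 1080)           # 1080p render target
ssaa_2x = pixels(1920 * 2, 1080 * 2)  # 2x supersampling per axis

print(ssaa_2x // native)  # 4 -> roughly 4x the shading work per frame
```

That 4x factor is why a mode that crippled 2013-era GPUs is an easy ask for something like a 5700 XT today.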

What are some games you guys think hold up as ahead of their time thanks to developers going the extra mile with enabling cutting edge effects on PC? (No, Crysis doesn't count. :p)
 

Stuka87

Diamond Member
Dec 10, 2010
4,768
501
126
I just played through it for the first time (got it for free on Steam) and I really enjoyed it. It still holds up very well. I ran it fully maxed with supersampling without a hitch. One of the nice things about playing games well after they came out is that better hardware is available.

Probably going to go and play the second and third one when they go on sale.
 

Zenoth

Diamond Member
Jan 29, 2005
5,046
89
91
Specifically on PC? So console ports we're talking about? :)

Well, there's a good bunch to be honest, and it's also quite a subjective thing. It may depend on how you approach this. For example, I could say something like Battlefield 2 still looks good today, but technologically speaking there's no way it "looks better" (or has more engine update shenanigans) than Battlefield 3. The same with Crysis 1 compared to, say, Crysis 3 (or heck, even the second one). But if I say that - considering the game's age - it still looks good today, then I don't compare it to a sequel released 2, 3 or 4 years later with obviously 'better', more complex visuals from an updated version of the engine that ran the original game, or even beyond that with new industry standards (for example, when the sequel happens to be from a newer generation of hardware / consoles).

So I approach it like this: considering the game's age, it "still looks good" probably because it was ahead of its time back when it was released (which in and of itself can still be debated; it's subjective):

(Also trying to stay 'on PC' here, otherwise this could be a very long list if I included consoles and started from my 8-bit years)

There's one particular 'moment' in the early 2000s that I believe marked the rapid arrival of new technology that suddenly made everything considerably more complex than pretty much anything before (then again, yes, that sort of situation has happened numerous times in the industry, especially when we started to move to full 3D gaming, but that's splitting hairs for this thread). And that would be 2004, with the following games:

Half-Life 2

Base game, no Episodes. Back in 2004 it was groundbreaking mostly due to the game's physics, but visually it was also very good, with only the first Far Cry and DOOM 3 to compete with. To this very day, I still very clearly remember the Half-Life 2 demonstration at E3 2003. It blew my mind and remained the one and only such 'tech demo' for an upcoming video game to have had that effect on me, all the way until I saw Cyberpunk 2077's 45-minute-long E3 2018 demo. For no less than 15 years, no other game shown at E3 or previewed anywhere else floored me as much. The closest one to have had at least some "wow" factor was Crysis prior to the game's release, but not at the level of Half-Life 2's demo; not a chance.


That, my friends, was in 2003. Let that sink in.

Far Cry

Basically Crysis before Crysis, to some extent; it was ahead of its time. The 'tech demo' made for this one (X-Isle: Dinosaur Island) was built to show the capabilities of NVIDIA's GeForce 3 series (dating back to 2001).


It's not that easy, so many years later, to fully grasp what it would have truly felt like to watch something like that back in 2001. I, for one, don't believe I saw this particular tech demo until some years later. However, having been a PC gamer myself since the Pentium 4 and Diablo 2 years, I can fully understand what it might have been like to see something like that as a glimpse of what the 'future of gaming' could provide. It was ahead of its time.

DOOM 3

Arguably the benchmark for visuals back in 2004, and it remained as such for a few years after. It was (mostly) the lighting and shadows that really took everyone by surprise (for looking that good in a video game). The character models and animations were pretty impressive as well (and the texturing, and the lighting on textures, with shader effects and so on). Similarly to Half-Life 2, DOOM 3 was preparing new tech for the industry to carve its place in, and it showed its first demos around 2002 or 2003 (at E3). That stuff was very impressive at the time.

In retrospect... looking at this (especially starting at the 2:58 mark when it goes into first person), if I didn't know it was from 2002, I wouldn't have guessed it. Point being, it was easily ahead of its time by a couple of years (especially in comparison to pretty much anything made prior to it, until the others came out the same year and competed with it as well).


It all lasted pretty much until Crysis came out. But as you pointed out in your post, Crysis 'doesn't count', because it's the obvious answer to this subject for many here. Even if one didn't enjoy the game per se, it's nearly impossible to deny what it meant for the industry to have a game pushing things so hard that it took not one but two graphics card generations to finally be able to run the damn thing at high resolutions, maxed out, with anti-aliasing (that one was the big deal at the time) and at a smooth, fast frame rate (then again... it was never very well optimized).

With this said, back to Half-Life 2, Far Cry and DOOM 3. There's a common thread among all three of them. They were released the same year, and they were all developed with new and unique technology. Suddenly, in one year, the industry had not one but three new engines: CryEngine, Source and id Tech 4, each offering new, never-before-seen visual and physics effects. I know that the term "modern gaming" can be interpreted differently and we all have our own definition of it. For some, "modern games" merely means 'any game that came out in the past 2 or 3 years max'. For others, it's 'any 3D game that came out this year'. For some, it's 'when DX9 came out', or 'when we moved from 2D to 3D', or 'when Call of Duty became a series' (not really). Heck, some would say that modern gaming is when DLCs became a thing. It's a very broad term. What would "old school gaming" even be today, considering 2D platformers and indie games using very old-school game-making methods have been making a comeback in recent years? Would those still be considered 'modern' just because they've been released recently? It's up for grabs in how you want to see it.

However, for me, "modern gaming" started in 2004. Since that year, as far as I can remember, there's been no other single year during which so many advances arrived, to that extent, from three different engines. In my view, it was a pivotal moment in gaming history, where I can put my finger on the clear-cut seam between actual "old" video games and "modern" video games. But then again, it's my personal view on it.

And to come back to the thread's main subject, that's why I believe those three games still 'look good' to this day: it's about as far back in time as I can go and still say "it still looks good", without too much exaggeration, even when they're put side by side with games today. That doesn't mean that "still looks good" equals "beats everything today", please understand, lol. It simply means that I can look at those games without having to force myself to think they still look good when 'in fact' they'd look terribly aged. The irony is that MANY games released after 2004 have aged very badly even though these three in particular still hold their ground. DOOM 3, I'd say, is probably the one that aged best of the three, while Far Cry looks perhaps a bit too 'plastic' today (especially the human characters and enemies, with their skin textures and the wet shader effects making them look like moving wet action figures), and Half-Life 2 simply has very low resolution textures by now (but mods can fix that very easily without even changing the character models).

I could of course mention many others, most of which would be considerably older; or a decent bunch released a bit after the first Crysis came out. But for the sake of simplicity and not typing a novel on this I'll keep it at the three 'main' games I mentioned.
 
Last edited:
  • Like
Reactions: Red Hawk

GodisanAtheist

Platinum Member
Nov 16, 2006
2,161
543
136
Isn't VSR just supersampling? Not sure how it or DSR (Nvidia's version of game-agnostic SSAA) are driver tricks.

Source games hold up really well over time. Dark Messiah looked remarkably good (honestly, it would look OK next to something like Skyrim, which released 5 years later) thanks to how Source handled lighting and shadows.

Rage is another game that was ahead of its time visually. Released in 2011, it could easily hold its own against titles released in 2015 or 2016. Incredible looking game that runs super slick on modern hardware.
 

Red Hawk

Diamond Member
Jan 1, 2011
3,237
144
106
Zenoth said:
Specifically on PC? So console ports we're talking about? :)
[...]
Well, as PC and console development have converged, each console generation has sort of served as the baseline for what to expect, and any games that stand out really do so on the basis of having features that only PCs support and go beyond what's available on consoles.

What I find interesting about your picks is that they sort of come from the start of that era. In the 90s, lots of 3D games, particularly FPSes or "Doom/Quake clones" as they were called, were developed with the latest PC technology as the target. They were sometimes ported to consoles, with significant downgrades. 3D console games were sort of their own thing that rarely came over to PC. This all started changing, in large part because of the original Xbox bridging the gap between console and PC developers, and also because middleware like Unreal supported both PC and consoles, making the porting process easier. Doom 3, Far Cry, and Half-Life 2 were sort of the last hurrah of major games developed exclusively for PC, which had to wait to be ported to the next console generation. So it's no surprise that they stand out as ahead of their time. The only really significant games since then to target PC are Crysis and The Witcher 2 -- both advanced for their time, though even they managed to be ported to their contemporary console generations, and each continued as a multiplatform series from that point.

And of course I don't mean that any of these games actually look as good as or better than games coming out right now, just that they were ahead of their time and hold up better than other games of the era. What I meant by "modern Tomb Raider games" is the rebooted Tomb Raider series that started with the 2013 game and continued with Rise of the Tomb Raider and Shadow of the Tomb Raider, as opposed to the old Tomb Raider games from before that. With regard to "modern gaming" in a broad sense, I would again trace that to the point when PC and console development really started converging. Which, as it happens, is right around 2004.

Isn't VSR just supersampling? Not sure how it or DSR (Nvidia's version of game-agnostic SSAA) are driver tricks.

Source games hold up really well over time. Dark Messiah looked remarkably good (honestly, it would look OK next to something like Skyrim, which released 5 years later) thanks to how Source handled lighting and shadows.

Rage is another game that was ahead of its time visually. Released in 2011, it could easily hold its own against titles released in 2015 or 2016. Incredible looking game that runs super slick on modern hardware.
VSR is a form of supersampling, yes. But VSR accomplishes that supersampling by "tricking" the game into seeing higher available resolutions than the actual resolution of the monitor. You select a higher resolution in the game's settings, and the driver downsamples the resulting image to the monitor's actual resolution. This can cause nuisances in some games, with the UI shrinking down as it would on a 4K monitor even though it's really still on a 1080p monitor. The in-game supersampling in Tomb Raider renders the 3D image at the higher resolution before the UI is added, downsamples it to the set monitor resolution, then applies the UI on top. So it's a bit more seamless in that regard than VSR, when it's available.
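Here's a toy sketch of that compositing-order difference (my own illustration, not any real graphics API): a HUD element drawn at a fixed pixel size occupies a fraction of whatever canvas the game lays the UI out on, and whatever gets baked in before the downscale shrinks along with the frame.

```python
# Apparent on-screen size of a fixed-size HUD element after the frame is
# scaled down to the monitor's native resolution. Toy model; numbers and
# function name are illustrative only.

def apparent_hud_px(ui_canvas_height, native_height, hud_px=100):
    """A hud_px-tall element covers hud_px/ui_canvas_height of the canvas;
    after scaling to native_height, that's this many physical pixels."""
    return hud_px * native_height / ui_canvas_height

native = 1080
virtual = 2160  # 2x-per-axis supersample exposed as a fake monitor resolution

# In-game SSAA: only the 3D scene is rendered at 2160p and downsampled;
# the UI is composited afterwards on the real 1080p canvas, at full size.
in_game = apparent_hud_px(ui_canvas_height=native, native_height=native)

# VSR/DSR: the game believes the monitor is 2160p, so the UI is laid out
# on the 2160p canvas and gets scaled down along with everything else.
vsr = apparent_hud_px(ui_canvas_height=virtual, native_height=native)

print(in_game)  # 100.0 -> HUD at its intended size
print(vsr)      # 50.0  -> HUD at half size, as if on a real 4K monitor
```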
 

BFG10K

Lifer
Aug 14, 2000
21,621
454
126
Tomb Raider 2013 looks beautiful overall but its super-sampling is a blurry abomination that's clearly implemented incorrectly.
 

Red Hawk

Diamond Member
Jan 1, 2011
3,237
144
106
Tomb Raider 2013 looks beautiful overall but its super-sampling is a blurry abomination that's clearly implemented incorrectly.
I do remember seeing criticism of the game's implementation of SSAA when it first came out. It looks fine to me, though. Do you have any side-by-side comparisons that illustrate what you're talking about?
 

BudAshes

Lifer
Jul 20, 2003
12,235
1,159
126
It's too bad I find tomb raider games utterly excruciating to actually play.
 

BFG10K

Lifer
Aug 14, 2000
21,621
454
126
I do remember seeing criticism of the game's implementation of SSAA when it first came out. It looks fine to me, though. Do you have any side-by-side comparisons that illustrate what you're talking about?
You don't even need screenshots. Just switch it on to see the immediate blur-fest, or switch it off to immediately see a clearer image. Try any daylight scene to make it even more obvious.
 

Mai72

Diamond Member
Sep 12, 2012
9,803
774
126
Yeah. Played Tomb Raider a month ago when it was free on Steam.

I thought the game was only 1-2 years old. It looks that good IMO. The gameplay was really good as well.
 

Stuka87

Diamond Member
Dec 10, 2010
4,768
501
126
It's really best to just use DSR for Tomb Raider 2013. I play at 1080p currently, so I just set the game up to 1440p and end up with a nice sharp image. The game is old enough that even mid-range cards run it easily.
 

Red Hawk

Diamond Member
Jan 1, 2011
3,237
144
106
So I finished Tomb Raider and started Rise of the Tomb Raider (it happens to be on sale on Steam right now for under $10). And it looks fantastic too. It takes everything Tomb Raider 2013 did and goes further with it.

One thing I think is really cool is that Rise of the Tomb Raider was an early adopter of DirectX 12. There were some growing pains, like performance actually degrading on Radeon cards vs DX11 until asynchronous compute was patched in later on. But that was all ironed out, and now playing the game on my 5700 XT in DX12 consistently benchmarks several FPS ahead of DX11. That's much appreciated, especially since I still like to crank up the SSAA, which puts it right on the 60 FPS borderline. Now, raw frame rates can be deceptive -- I've recently played Battlefront 2, which has high reported framerates in DX12 mode, but frametimes are an unmitigated stuttery disaster. But frametimes seem to be butter smooth in ROTTR's DX12 renderer when monitoring them with MSI Afterburner.
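As a toy illustration of why I watch frametimes and not just the counter (made-up numbers, nothing measured from ROTTR or Battlefront 2): two runs can report the same average FPS while one of them hitches constantly.

```python
# Same average frametime, very different frame delivery. The average hides
# the stutter; the worst-case frametime exposes it. Numbers are invented.

def avg_fps(frametimes_ms):
    return 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))

def worst_frametime(frametimes_ms):
    return max(frametimes_ms)

smooth   = [16.7] * 100      # steady ~60 FPS
stuttery = [8.4, 25.0] * 50  # alternating fast/slow frames, same average

print(round(avg_fps(smooth)))     # 60
print(round(avg_fps(stuttery)))   # 60 -- identical on the FPS counter...
print(worst_frametime(smooth))    # 16.7 ms
print(worst_frametime(stuttery))  # 25.0 ms -- ...but far spikier delivery
```

Tools like MSI Afterburner surface exactly this kind of per-frame data, which is how the Battlefront 2 DX12 stutter shows up despite its high reported framerate.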

People may have ragged on ROTTR at the time, but it was all a necessary stepping stone in marching their technology forward. Apparently the next game, Shadow of the Tomb Raider, recommends that players use the DirectX 12 renderer. And of course DX12 is going to be essential on the next-gen consoles and for implementing ray tracing. SOTTR is one of the first games to relegate DX11 to legacy GPU support.

Now if only they would release a version of ROTTR without Denuvo...
 
Last edited:

DAPUNISHER

Super Moderator and Elite Member
Moderator
Aug 22, 2001
21,711
1,668
136
Thanks man, this one is still sitting in my backlog. Never gotten around to playing it.
 
