
[gamegpu.ru] Tomb Raider Benchmarks GPU / CPU

Grooveriding

Diamond Member
Benchmarks are starting to come in; here are the results over at Gamegpu.ru.

http://gamegpu.ru/action-/-fps-/-tps/tomb-raider-test-gpu.html

[Chart: GPU benchmarks, FXAA, 1920]

[Chart: GPU benchmarks, FXAA, 2560]

[Chart: GPU benchmarks, FXAA + TressFX, 1920]

[Chart: GPU benchmarks, FXAA + TressFX, 2560]

[Chart: GPU benchmarks, SSAA + TressFX, 1920]

[Chart: GPU benchmarks, SSAA + TressFX, 2560]

[Chart: CPU scaling]


Looks like Nvidia has some driver work ahead of them. Hopefully they have this sorted by the time I have my Titan system finished.
 
Looks like it will run well on most budget cards.

I have no interest in this game, though, even though I got it for free with my video card.

All the Tomb Raider games ever released were terrible.
 
The 7870 is as fast as a 680 in this game, wow. Kind of strange to see the role reversal here; Nvidia didn't have access to the final game code until late, so their drivers are lagging on this title. I hope that will clue people in to the fact that the respective vendors are not necessarily falling down on driver support if a game doesn't run properly out of the gate.

That's the downside of these sponsorships, unfortunately. On the plus side, no vendor check and lockout! I can almost guarantee that if this were an Nvidia title, TressFX would not be allowed to work on Radeon hardware, and the excuse would be that Nvidia can't provide support, Nvidia is not a charity, and AMD should get going on their dev relations program. ^_^ I can only imagine the hype machine if TressFX were Nvidia tech. Gotta give NV credit, they know how to market themselves.

The game itself looks like a heck of a lot of fun. People with Nvidia cards are saying that the latest beta drivers are causing crashing issues, so you might want to revert until NV releases a fix.
 
Simple solution: Nvidia owners don't have to buy the game until it's fixed/optimized. Not exactly getting the shakes from not buying it on day 1.
 
Poor 680, getting beaten down by the 7870.

The 7970 is nearly double the speed of the 6970, while the GHz Edition card is a good deal quicker yet. Yeah, this generation sucks...
 
Nvidia (ManuelG) responds:

We are aware of major performance and stability issues with GeForce GPUs running Tomb Raider with maximum settings. Unfortunately, NVIDIA didn’t receive final code until this past weekend which substantially decreased stability, image quality and performance over a build we were previously provided. We are working closely with Crystal Dynamics to address and resolve all game issues as quickly as possible.
In the meantime, we would like to apologize to GeForce users that are not able to have a great experience playing Tomb Raider, as they have come to expect with all of their favorite PC games.

https://forums.geforce.com/default/...e-a-look-at-tomb-raider/post/3752526/#3752526
 
Actually, the other result that jumps out is that in the CPU scaling the i3 once again lands near the bottom. Seems that two cores plus HT (even with excellent IPC) is no longer enough. I must look to see if someone benches a few modern games and engines with SB/IB Celerons and Pentiums. The FX-4300/FX-6300 (while power hungry) seem to perform fairly well in comparison in a lot of recent games, and in this case they're even running at stock.

But shouldn't CPU scaling results be run at a lower resolution? Otherwise, all those 100+ FPS results are just showing the limits of the 690 at those settings.
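
To illustrate the point, here's a toy model (all numbers made up, not taken from gamegpu.ru's charts): the delivered frame rate is roughly the minimum of the CPU-bound and GPU-bound rates, so a GPU ceiling flattens real CPU differences.

```python
# A minimal sketch, with hypothetical numbers: delivered frame rate
# is capped by whichever pipeline is slower.

def delivered_fps(cpu_fps, gpu_fps):
    """The slower of the CPU-bound and GPU-bound rates wins."""
    return min(cpu_fps, gpu_fps)

# Hypothetical CPU-bound rates (what each chip could drive with an
# infinitely fast GPU) and hypothetical GPU ceilings at two resolutions.
cpu_bound = {"fast hex-core": 160.0, "budget quad": 115.0}
gpu_ceiling = {"1920x1080 ultra": 110.0, "1280x1024 ultra": 240.0}

for res, ceiling in gpu_ceiling.items():
    for cpu, rate in cpu_bound.items():
        print(f"{res} | {cpu}: {delivered_fps(rate, ceiling):.0f} fps")

# At 1920x1080 both CPUs report ~110 fps (the GPU's ceiling) and look
# identical; at 1280x1024 the real gap (160 vs 115) becomes visible.
```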
 
The stratification of game titles by vendor that many of us were mildly worried about has begun.
 

I would wait until NVIDIA has a chance to close a significant portion of the gap when they get time to release optimized drivers.

What I am more interested in is whether a few more Gaming Evolved releases like this will shift the stigma of "bad drivers" to NVIDIA. AMD has taken crap for years because they could not optimize TWIMTBP titles until after their release, so their day-1 drivers had issues.

With the momentum shifting to Gaming Evolved, will people be as negative towards NVIDIA as they were to AMD for the same issue?
 
Man, if you take out the Tahiti and Pitcairn chips these lists make sense. Dunno WTF happens when you put them back.
 
I wonder how much Gaming Evolved is helping their CPUs in these recent games. Very interesting results we are seeing here and in Crysis 3.
 

That's the point: more and more games will have launch issues with one brand or the other. Not good for the gaming consumer. Then there is the issue of introducing features that make it difficult for the other brand to optimize: the invisible Crysis 2 ocean (edit: more comparable to a PhysX feature) and now Lara Croft's hair (granted, without vendor lock-in).
 
I've been trying to say this for years; the polarization of PC gaming is worrisome. It's basically turning into: well, this game is Nvidia certified, this one is AMD certified. If you don't have the right hardware, too bad for you; expect to be miserable trying to play the game.

On the plus side, at least this title WILL work with all the eye candy on Nvidia hardware. AMD could have easily pulled an Nvidia and just disabled the L'OréalFX altogether on GeForce cards.
 
I don't get why the CPU load increases the more cores are enabled (with a few exceptions). Shouldn't it be the other way around?
 
They really should do the CPU tests at something like 1280x1024; at 1080p it is completely GPU limited. At a lower resolution we could see a true picture of how the CPUs perform comparatively, and I seriously doubt the AMD FX line would keep up.

Does Tomb Raider support hexa-cores and Hyper-Threading?
 
Great showing from AMD, although it does seem to have hit a ceiling in the CPU tests. Oh well, 107+ fps still means the GPU is gonna be working the hardest in most cases.
 
Comparing at lower resolutions serves benchmark purposes only. It isn't relevant from a gaming standpoint.

Huh? Benchmarking the CPU at 1080p is worthless because it isn't a CPU test. It is a GPU test.

At a lower resolution it will no longer be GPU limited and therefore will actually be RELEVANT as a CPU test. You're not even testing the CPU at 1080p at high detail settings.
 
I really don't see why you'd benchmark below 1366×768 for gaming purposes, and that would be mostly for mobile GPUs. That's the lowest resolution you would reasonably expect people with dedicated GPUs to care about. There are enough quirks with game engines that simply running at the lowest resolution possible doesn't provide much useful information.
 
Looks nice.

But the performance hit is way too big... 20 fps.

2x TressFX at the 2560x1440 res of my monitor would kill the gameplay for me.

Perhaps it's time to CrossFire. : )

As for Nvidia, shit... they need to update their drivers soon!!
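
For a sense of scale, here's a quick sketch of what a flat fps drop costs in frame time. Only the 20 fps figure comes from the post above; the 60 fps baseline is a made-up assumption for illustration.

```python
# Hypothetical frame-time cost of a "20 fps" TressFX hit,
# assuming a 60 fps baseline purely for illustration.
base_fps = 60.0
hit_fps = 20.0

base_ms = 1000.0 / base_fps                 # ~16.7 ms per frame
tressfx_ms = 1000.0 / (base_fps - hit_fps)  # 25.0 ms per frame

print(f"TressFX adds ~{tressfx_ms - base_ms:.1f} ms per frame")
# The same 20 fps drop from a lower baseline costs far more frame time,
# which is part of why the hit feels bigger at 2560x1440 than at 1080p.
```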
 
Oh my goodness. I'm referring to the CPU benchmark test that this website performed: running lower resolutions to eliminate the GPU as a factor in determining CPU performance is a normal practice. I really don't feel an FX CPU would be anywhere near a 3970X. Not even close. Yet this test makes them seem even, by benchmarking the CPU at a GPU-limited resolution...
 