TechSpot benched Tomb Raider

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
Interesting.
[Image: Ultimate_2560.png]
 

ICDP

Senior member
Nov 15, 2012
707
0
0
Not really interesting. The new patch fixes the performance issues with Nvidia cards.

It is a great looking game though.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
I also noted that the DOF setting caused the biggest hit in my playing. (GTX 660)

That said, Nvidia has plenty of work to do on its drivers and AMD's new TressFX tech seems overly intensive.
When we played at 1920x1200 on ultimate, the Radeon HD 7970 GHz Edition sustained 53fps and that figure increased 47% when we disabled TressFX. Most gamers probably won't be able to justify that kind of performance hit. And it's worth noting that TressFX isn't the reason for Nvidia's poor showing -- that seems largely due to depth of field (DOF), which we discovered after a lengthy session of trial and error.
After testing each item, we found that changing DOF to normal while leaving everything else on ultra produced a 48% boost from 54fps to 80fps -- and that's with TressFX enabled as it is by default on ultra quality.
If we set the DOF level to high while on the ultra preset, the GTX 680 gave 47fps at 1920x1200, which is just 2fps less than the HD 7970 -- far more practical results.
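As a quick sanity check on those percentages, here are the article's own numbers run through a few lines of Python (nothing re-measured, purely illustrative arithmetic):

Code:
# Illustrative arithmetic only -- the fps values are the ones quoted in the article.
tressfx_on = 53                     # HD 7970 GHz @ 1920x1200 ultimate, TressFX on
tressfx_off = tressfx_on * 1.47     # "increased 47%" -> roughly 78 fps

dof_ultra = 54                      # GTX 680, ultra preset with ultra DOF
dof_normal = 80                     # same settings with DOF dropped to normal
boost = (dof_normal - dof_ultra) / dof_ultra   # ~0.48, i.e. the quoted 48% boost

print(f"HD 7970 with TressFX off: ~{tressfx_off:.0f} fps")
print(f"GTX 680, DOF ultra -> normal: +{boost:.0%}")

Both figures check out against the article's prose.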
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
When we played at 1920x1200 on ultimate, the Radeon HD 7970 GHz Edition sustained 53fps and that figure increased 47% when we disabled TressFX. Most gamers probably won't be able to justify that kind of performance hit. And it's worth noting that TressFX isn't the reason for Nvidia's poor showing -- that seems largely due to depth of field (DOF), which we discovered after a lengthy session of trial and error.

The Nvidia cards only performed as expected when we used the high quality preset, and the changes when shifting down from ultra to high include texture quality, level of detail, depth of field and SSAO. After testing each item, we found that changing DOF to normal while leaving everything else on ultra produced a 48% boost from 54fps to 80fps -- and that's with TressFX enabled as it is by default on ultra quality.

If we set the DOF level to high while on the ultra preset, the GTX 680 gave 47fps at 1920x1200, which is just 2fps less than the HD 7970 -- far more practical results. Folks playing with an Nvidia card should definitely watch for a driver update that fixes this issue. Of note, texture quality, level of detail, SSAO and texture filtering had virtually no impact on performance while tessellation accounted for only 4fps.
I found this bit quite telling. Is this how Nvidia's 256-bit bus, 2GB midrange is able to keep up with Tahiti? Depth of field cheating... erm, optimization
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I also noted that the DOF setting caused the biggest hit in my playing.

Yes, not TressFX. Interesting, as DOF is one of those settings where turning it down a notch shouldn't make a huge difference in immersion. They don't say whether or not it has the same effect on AMD cards, though.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
Yes, not TressFX. Interesting, as DOF is one of those settings where turning it down a notch shouldn't make a huge difference in immersion. They don't say whether or not it has the same effect on AMD cards, though.
Well, in my original post I stated that I played with Hair on normal as well. My card is just not that powerful.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
Yeah, I would expect some fixes from NV on this one.

Is the game good, being rated a TOP 5 of all time and all?
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I found this bit quite telling. Is this how Nvidia's 256-bit bus, 2GB midrange is able to keep up with Tahiti? Depth of field cheating... erm, optimization

No. It's supposed to be the benchmark they're cheating at.

Well, anywhere they're faster they're cheating. That should cover it. :p
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
No. It's supposed to be the benchmark they're cheating at.

Well, anywhere they're faster they're cheating. That should cover it. :p

I think that alongside all this smoothness testing, frame-time testing and X% percentile stuff, there should also be some IQ testing. Something stinks :sneaky:
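(For anyone wondering what the X% percentile part actually boils down to, here's a minimal sketch in Python -- the frame times below are made up purely for illustration, not from any real capture:)

Code:
# Minimal sketch: percentile frame time from a list of per-frame times in ms.
# The numbers are invented for illustration only.
frame_times_ms = [16.7, 16.9, 17.1, 16.8, 33.4, 16.7, 16.6, 45.2, 16.8, 16.7]

def percentile_frame_time(times_ms, pct=99):
    """Frame time (ms) that pct percent of frames come in at or under."""
    ordered = sorted(times_ms)
    index = max(0, min(len(ordered) - 1, round(pct / 100 * len(ordered)) - 1))
    return ordered[index]

print(f"99th percentile frame time: {percentile_frame_time(frame_times_ms):.1f} ms")
print(f"average fps: {1000 * len(frame_times_ms) / sum(frame_times_ms):.1f}")

A run with a handful of long frames can still post a healthy average fps while the high-percentile frame times give the stutter away, which is why the metric gets used alongside plain averages. None of it says anything about image quality, though, so the IQ point stands.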
 

zebrax2

Senior member
Nov 18, 2007
974
66
91
Their charts don't make sense at first glance. I was thinking "why does 34 exceed 38, and why does it go past the 50 line?" Then I realized that the numbers at the bottom indicate avg + min fps.
 

Will Robinson

Golden Member
Dec 19, 2009
1,408
0
0

ICDP

Senior member
Nov 15, 2012
707
0
0
These benches were taken using an older version of the game. The new patch fixes the issues with Nvidia cards. Please stop using irrelevant benchmarks to try and show Nvidia in a bad light. We need less trolling, not more, thanks.
 

Chiropteran

Diamond Member
Nov 14, 2003
9,811
110
106
These benches were taken using an older version of the game. The new patch fixes the issues with Nvidia cards. Please stop using irrelevant benchmarks to try and show Nvidia in a bad light. We need less trolling, not more, thanks.

If there is a patch that "fixes" the performance for Nvidia, where are the benchmarks that show it? The article is dated today; how out of date could it possibly be?
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
I also noted that the DOF setting caused the biggest hit in my playing.

I always turn DOF features off anyway. They aren't "realistic" in any way and just serve to blur part of the screen. They're an attempt to introduce a major flaw in cameras into a game which has no need to suffer from that flaw. Your eyes never encounter depth of field effects as they are used in games. Granted, you *do* experience depth of field as part of your normal vision, but it is *never* present in what you are focusing on, and given that all but the center of your vision is amazingly blurry anyway, we just don't need it in games. It would make sense in a VR head-tracking type system where your eyes are always on the center of the screen, but when you're playing a game and look off center and it's arbitrarily blurry, it's just stupid. If I were whatever the character is and moved my vision over there, it wouldn't be blurry at all; my eyes would focus on the new location.

The only place these silly DOF effects make sense are when your character is looking through optics, like binoculars, a scope, a camera, etc.

It might also be OK with eye tracking that always shifted the DOF focus to wherever you are looking on the screen, but again, I know of no one who plays games and only ever focuses dead center on the screen, never looking to the side at all. Yet that's what they always simulate with "DoF".
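(For reference, the camera artifact being described above is usually modelled with a thin-lens "circle of confusion". A rough, generic sketch is below -- not Tomb Raider's or any particular engine's actual implementation, and the focal length, aperture and distances are arbitrary:)

Code:
# Generic thin-lens circle-of-confusion approximation: how much a point at
# distance d (metres) gets blurred when the camera is focused at focus_dist.
# Everything off the focal plane is blurred regardless of where the player
# is actually looking, which is the complaint above.

def coc_diameter(d, focus_dist, focal_len=0.05, aperture=0.02):
    """Approximate blur-circle diameter on the sensor plane, in metres."""
    if d <= focal_len or focus_dist <= focal_len:
        return float("inf")  # degenerate setup, not meaningful
    return aperture * abs(d - focus_dist) / d * focal_len / (focus_dist - focal_len)

for depth in (2.0, 10.0, 50.0):   # scene points; the camera is focused at 10 m
    print(f"{depth:5.1f} m -> blur diameter {coc_diameter(depth, focus_dist=10.0) * 1000:.3f} mm")

An eye-tracked variant would simply move focus_dist to whatever depth sits under your gaze, which is the one case where the effect would behave like real vision.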
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Many DOF implementations are indeed horrible. There is one notable exception, though: I really like the in-game DOF effect in The Witcher 2. It makes everything look very nice. That is probably the best DOF implementation I've seen, and it improves the appearance quite a bit.

As mentioned, however, many DOF implementations suck. I haven't really looked at TR's implementation much, but the DOF in Metro 2033 sticks in my mind as completely horrible. It made the game look significantly worse in wide-open areas and caused a huge drop in performance. In fact, I feel like the devs of Metro 2033 had a giant checklist of features to add to the game, which they did, but they spent very little time making them look good in-game. So many of the graphical effects in that game just... don't help.

I guess DOF is one of those features that looks great when done well and can make the game look worse with a poor implementation.
 

ICDP

Senior member
Nov 15, 2012
707
0
0
If there is a patch that "fixes" the performance for Nvidia, where are the benchmarks that show it? The article is dated today; how out of date could it possibly be?

The article is dated today, but it can take many days to conduct tests. This benchmark was taken using 1.0.716.5. The latest patch release is 1.0.718.4, and it fixes Nvidia performance and lockup issues.


Patch release notes.

We have just made public a new version of the PC version of Tomb Raider, build 1.0.718.4. This patch will be applied by Steam automatically when you next start the game. If your game does not update, please restart the Steam client.

This update addresses a variety of issues that we either found out about shortly before release or immediately after.

Fixes include:

- Addressed some stability and startup issues on machines that have both Intel and NVIDIA graphics hardware.
- Fix for players being unable to progress related to the boat in the beach area.
- Some fixes for crashes on startup and when selecting Options.
- Some small improvements to TressFX hair rendering.
- Fixes for various graphics glitches, including certain effects not being visible in fullscreen mode.
- Fixed a problem that caused some users to not be able to use exclusive fullscreen.
- Added support for separate mouse/gamepad inversion for aiming, as well as support for x-axis inversion.
- Fixes related to the benchmark scene and benchmark mode.
- Various other small fixes.

While we expect this patch to be an improvement for everyone, if you do have trouble with this patch and prefer to stay on the old version, we have made a Beta available on Steam, Build 716.5, that can be used to switch back to the previous version. Please note, however, that you can only play multiplayer with people who share your version.

We expect further patches to follow fairly soon, addressing further issues we see being brought up by players, and are actively monitoring these and other forums looking for issues.
 

Chiropteran

Diamond Member
Nov 14, 2003
9,811
110
106
The article is dated today, but it can take many days to conduct tests. This benchmark was taken using 1.0.716.5. The latest patch release is 1.0.718.4, and it fixes Nvidia performance and lockup issues.

I get that that's what you are saying, but I'd still like to see the new patch benchmarked.

Look at the FX-8150's initial performance and the claims that some magical Bulldozer patch would fix its performance; the patch came and the effective difference in performance was 1% or less.

I'm curious what the actual performance numbers look like after this supposed "fix".
 

-Slacker-

Golden Member
Feb 24, 2010
1,563
0
76
Wait, so these results are from 90-second runs of real gameplay? So then why aren't the Nvidia cards equal here? I thought the vast difference in performance only occurred with the benchmark tool.

ICDP said:
This benchmark was taken using 1.0.716.5

How do you know that? The game's version isn't mentioned in the article.
 

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
benchmark: http://www.techspot.com/review/645-tomb-raider-performance/
(with avg./minimum frame rate also included in the test)

(TR as one of the top 5 games of all time: http://www.metacritic.com/game/pc/tomb-raider)

15 min / 34 avg on my GTX 670 at 1080p Ultra + FXAA? Man, I must be getting old, 'cause it feels a lot smoother for me... ;) That review looks totally off.

Also... top 5 all time? What? It has a score of 86... While that's high, sure, it's nowhere near the top...

http://www.metacritic.com/browse/games/release-date/available/pc/metascore?view=condensed

Not even the first page...
 

ICDP

Senior member
Nov 15, 2012
707
0
0
Wait, so these results are from 90-second runs of real gameplay? So then why aren't the Nvidia cards equal here? I thought the vast difference in performance only occurred with the benchmark tool.



How do you know that? The game's version isn't mentioned in the article.

The version number used is in the actual charts.