
PCGH Tomb Raider TressFX Benchmarks

csbin

Senior member
Feb 4, 2013
831
325
136
http://www.pcgameshardware.de/AMD-Radeon-Hardware-255597/Tests/Tomb-Raider-PC-Grafikkarten-Benchmarks-1058878/

Tomb Raider: TressFX in detail

The Tomb Raider developers use DirectCompute for the real-time physics simulation of TressFX hair. Each of the many individual strands is treated as a chain with dozens of links, so factors such as gravity, wind, or head movement affect Lara's hair realistically. There is also collision detection, so overlapping strands do not penetrate each other or solid surfaces such as Lara's head, clothing, or body. Hair that has been swirled around slowly returns to its original state, just as in real life. All of this naturally costs computing power: because the calculations run as compute shaders, the graphics chip is loaded more heavily. Here is TressFX in action.
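The strand-as-chain model described above can be sketched as a simple Verlet integration with distance constraints. This is a CPU toy model under assumed parameters (segment length, time step, head sphere); the real TressFX implementation runs as a DirectCompute shader with additional shape constraints:

```python
# Toy model of one TressFX-style hair strand: a chain of particles
# advanced by Verlet steps and held together by distance constraints.
# SEGMENT_LENGTH, DT, and the head sphere are illustrative values.

GRAVITY = (0.0, -9.81, 0.0)
SEGMENT_LENGTH = 0.01        # assumed rest length between links (metres)
DT = 1.0 / 60.0              # one 60 Hz frame
HEAD_CENTER = (0.0, 1.05, 0.0)
HEAD_RADIUS = 0.04           # sphere standing in for Lara's head

def add(a, b): return tuple(x + y for x, y in zip(a, b))
def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def scale(a, s): return tuple(x * s for x in a)
def length(a): return sum(x * x for x in a) ** 0.5

def step_strand(pos, prev, root, wind=(0.0, 0.0, 0.0), iterations=4):
    """Advance one strand by one time step.

    pos/prev: current and previous particle positions (lists of 3-tuples);
    root: anchor point on the scalp, link 0 is pinned to it.
    Returns the new (pos, prev) pair.
    """
    # 1. Verlet integration: velocity is implicit in (pos - prev).
    accel = add(GRAVITY, wind)
    new_pos = [add(p, add(sub(p, q), scale(accel, DT * DT)))
               for p, q in zip(pos, prev)]
    new_pos[0] = root  # the root link follows the head

    # 2. Distance constraints keep each link at its rest length,
    #    relaxed over a few Gauss-Seidel sweeps.
    for _ in range(iterations):
        for i in range(len(new_pos) - 1):
            d = sub(new_pos[i + 1], new_pos[i])
            dist = length(d) or 1e-9
            corr = scale(d, 0.5 * (dist - SEGMENT_LENGTH) / dist)
            if i == 0:
                # Root is pinned, so the full correction goes to link 1.
                new_pos[1] = sub(new_pos[1], scale(corr, 2.0))
            else:
                new_pos[i] = add(new_pos[i], corr)
                new_pos[i + 1] = sub(new_pos[i + 1], corr)

    # 3. Simple collision: push particles out of the head sphere
    #    (the game also collides strands with clothing and body).
    for i in range(1, len(new_pos)):
        d = sub(new_pos[i], HEAD_CENTER)
        dist = length(d) or 1e-9
        if dist < HEAD_RADIUS:
            new_pos[i] = add(HEAD_CENTER, scale(d, HEAD_RADIUS / dist))
    return new_pos, pos
```

Running this per strand, per frame, for thousands of strands is what makes the effect a compute-shader job rather than a CPU one.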

An integrated benchmark can be started from the main menu; it plays a camera flight around Lara Croft and thus showcases the advantages of AMD's TressFX technology. This is a worst case, since Lara's hair is shown at length. The frame rate suffers greatly from TressFX in this test; in actual gameplay the effect is usually smaller, although in the numerous cutscenes a close-up of Lara can still cause a slowdown. Let us first look at this worst case in the form of the integrated benchmark. Because the benchmark is currently proving unreliable (it produces different results with identical settings, and the reported minimum value is incorrect), we record the frame rates with Fraps during the test:
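Recording with a tool like Fraps amounts to capturing per-frame times and reducing them to average and minimum rates. A minimal sketch, assuming the log has already been read into a list of cumulative millisecond timestamps (not Fraps' exact CSV layout):

```python
def fps_stats(timestamps_ms):
    """Compute (average FPS, minimum FPS) from cumulative per-frame
    timestamps in milliseconds, the kind of log a frame-time capture
    tool such as Fraps can produce (exact file format not assumed)."""
    frame_times = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    if not frame_times:
        raise ValueError("need at least two timestamps")
    total_s = (timestamps_ms[-1] - timestamps_ms[0]) / 1000.0
    avg_fps = len(frame_times) / total_s
    min_fps = 1000.0 / max(frame_times)  # rate of the single slowest frame
    return avg_fps, min_fps

# e.g. timestamps [0, 10, 10, 10, 20] ms apart: one 20 ms frame drags
# the minimum down even though most frames ran at 100 fps.
```

Because the minimum is determined by one single frame, it is far noisier run-to-run than the average, which is one reason a benchmark's reported minimum value can look wrong.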





Tomb Raider: "Real World" benchmarks with graphics cards

Since the built-in benchmark has little to do with the actual game, we precede the practical benchmarks with a typical gameplay scene. We settled on a scene near the beginning of the adventure: Lara, controlled by the player, fights her way through a cave and sees daylight for the first time. Our "Cliffs" outdoor scene offers relatively long view distances, particle effects, and a brief close-up of Lara's hair, but it is played entirely by us and contains no cutscene. "Cliffs" is thus not a worst case but a typical, demanding gameplay scene. Incidentally, the integrated benchmark shows a similar scene among the same rocks.



 

-Slacker-

Golden Member
Feb 24, 2010
1,563
0
76
Since the built-in benchmark has little to do with the actual game, we precede the practical benchmarks with a typical gameplay scene. We settled on a scene near the beginning of the adventure: Lara, controlled by the player, fights her way through a cave and sees daylight for the first time. Our "Cliffs" outdoor scene offers relatively long view distances, particle effects, and a brief close-up of Lara's hair, but it is played entirely by us and contains no cutscene. "Cliffs" is thus not a worst case but a typical, demanding gameplay scene. Incidentally, the integrated benchmark shows a similar scene among the same rocks.
Oh, so built-in benchmarks have little to do with the game now, is it? Is this going to become a thing, where testers just run around in the game with no regard for synchronizing movement and timing across all cards to ensure a fair testing premise? If only there were a method to do that without having to worry about discrepancies between tests... oh yeah, there is: it's called a built-in benchmark.

Hell, if they're not going to use that, at least make it so the test doesn't depend on the tester's ability to play the sequence exactly as they did on previous runs: save in a spot overlooking a GFX-intensive area and benchmark from there without touching the keyboard or mouse.
 

sontin

Diamond Member
Sep 12, 2011
3,120
33
91
Oh, so built-in benchmarks have little to do with the game now, is it? Is this going to become a thing, where testers just run around in the game with no regard for synchronizing movement and timing across all cards to ensure a fair testing premise? If only there were a method to do that without having to worry about discrepancies between tests... oh yeah, there is: it's called a built-in benchmark.

Hell, if they're not going to use that, at least make it so the test doesn't depend on the tester's ability to play the sequence exactly as they did on previous runs: save in a spot overlooking a GFX-intensive area and benchmark from there without touching the keyboard or mouse.
Built-in benchmarks are synthetic. Using them to judge in-game performance makes no sense.

Tomb Raider is a good example:
AMD's graphics cards are much faster in the built-in benchmark because they designed it in a way that screws the competition's performance.

Here from the site pclab.pl:

http://pclab.pl/art52447.html

Even without TressFX the 7970GHz is much faster than the GTX680...

Really, that Tomb Raider built-in benchmark is one of AMD's better cheats.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
200
106
Built-in benchmarks are synthetic. Using them to judge in-game performance makes no sense.

Tomb Raider is a good example:
AMD's graphics cards are much faster in the built-in benchmark because they designed it in a way that screws the competition's performance.

Here from the site pclab.pl:

http://pclab.pl/art52447.html

Even without TressFX the 7970GHz is much faster than the GTX680...

Really, that Tomb Raider built-in benchmark is one of AMD's better cheats.
I don't know why you're trying so hard to convince us.

Andrew Burnes said:
We are aware of performance and stability issues with GeForce GPUs running Tomb Raider with maximum settings. Unfortunately, NVIDIA didn't receive final game code until this past weekend which substantially decreased stability, image quality and performance over a build we were previously provided. We are working closely with Crystal Dynamics to address and resolve all game issues as quickly as possible.



Please be advised that these issues cannot be completely resolved by an NVIDIA driver. The developer will need to make code changes on their end to fix the issues on GeForce GPUs as well. As a result, we recommend you do not test Tomb Raider until all of the above issues have been resolved.



In the meantime, we would like to apologize to GeForce users that are not able to have a great experience playing Tomb Raider, as they have come to expect with all of their favorite PC games.
You need to tell this guy from nVidia that there are no performance issues.
 

sontin

Diamond Member
Sep 12, 2011
3,120
33
91
Nobody is talking about the problems with nVidia cards.

Maybe the guy from nVidia means the built-in benchmark by "performance issues".
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
20
81
Some in-game benchmarks are indeed utterly stupid. I remember the same thing in HA, where the camera rotates to give a full 360-degree view that you can't recreate in actual play. I believe in-game benchmarks should be tied more closely to actual gameplay, though they do have one distinct advantage: they are repeatable. NV seems to suffer from more than just performance problems in this game; game-crashing bugs shouldn't happen at all.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,560
40
91
Years ago, some people were pointing out that exclusive features, exclusive deals with game devs, and whatnot would just create a situation where one would have to:

a) have an AMD GPU and a NVIDIA GPU;
b) buy the GPU that had the games you wanted;
c) buy whatever GPU you wanted and then buy games of your GPU brand at launch and games of the other GPU brand later after the performance and problems are fixed.

Other people accused that first group of being jealous fanboys, said it was all in the name of a better PC gaming experience, and argued that the other company should do the same.

Now the other company is playing the same game.

Unfortunately the first group was right.
 

TakeNoPrisoners

Platinum Member
Jun 3, 2011
2,600
1
76
Years ago, some people were pointing out that exclusive features, exclusive deals with game devs, and whatnot would just create a situation where one would have to:

a) have an AMD GPU and a NVIDIA GPU;
b) buy the GPU that had the games you wanted;
c) buy whatever GPU you wanted and then buy games of your GPU brand at launch and games of the other GPU brand later after the performance and problems are fixed.

Other people accused that first group of being jealous fanboys, said it was all in the name of a better PC gaming experience, and argued that the other company should do the same.

Now the other company is playing the same game.

Unfortunately the first group was right.
Well good thing it isn't to the point where you have to have a card from both companies to play all the games coming out. It just hurts the ego of those who care about such things. Real gamers can just play the game like they always do.
 

badb0y

Diamond Member
Feb 22, 2010
4,002
25
91
Built-in benchmarks are synthetic. Using them to judge in-game performance makes no sense.

Tomb Raider is a good example:
AMD's graphics cards are much faster in the built-in benchmark because they designed it in a way that screws the competition's performance.

Here from the site pclab.pl:

http://pclab.pl/art52447.html

Even without TressFX the 7970GHz is much faster than the GTX680...

Really, that Tomb Raider built-in benchmark is one of AMD's better cheats.
 

Xarick

Golden Member
May 17, 2006
1,199
0
76
How come AMD is a lot faster in the benchmark but not in the game? Something odd there.
 

Insomniator

Diamond Member
Oct 23, 2002
6,279
171
106
Regardless of who is faster, is nice hair worth a 28% drop in performance? What is the difference between this technology and just 'manually' rendering nicer hair, if the performance hit is so steep?

It's like having a 'feature' that made dirt and grass act realistic on the ground but cut performance by 4/5. Well yeah... in 5 years the same thing is naturally gonna happen, it just won't be called some special feature.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
0
0
Years ago, some people were pointing out that exclusive features, exclusive deals with game devs, and whatnot would just create a situation where one would have to:

a) have an AMD GPU and a NVIDIA GPU;
b) buy the GPU that had the games you wanted;
c) buy whatever GPU you wanted and then buy games of your GPU brand at launch and games of the other GPU brand later after the performance and problems are fixed.

Other people accused that first group of being jealous fanboys, said it was all in the name of a better PC gaming experience, and argued that the other company should do the same.

Now the other company is playing the same game.

Unfortunately the first group was right.

Let's make believe there was no developer work or resources spent by AMD on this title!

Possibly a direct port -- how wonderful! No DirectX 11 goodies for anyone, no TressFX for anyone, just a port.
 

-Slacker-

Golden Member
Feb 24, 2010
1,563
0
76
FPS varies depending on how you play the game, i.e. your position in the level and camera rotation speed/position. When you test games that way, you're going to get glaring differences in results.

Since we judge video cards by tiny performance differences of a few FPS, "real world benching" is absolutely not a good option unless it does not depend on player input: for example, if you just load a save and start benching without moving around. Though that would also be a sub-par method, since average frame rates in those circumstances are unrepresentatively high and will not give you a fair impression of how the game usually runs.
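The run-to-run spread at issue here can be illustrated with a toy simulation (all numbers invented for illustration): even with identical hardware and settings, manual runs produce slightly different average FPS because per-frame times vary with the player's path and camera motion.

```python
import random

def simulate_run(seed, n_frames=1000, base_ms=16.7, jitter_ms=4.0):
    """One simulated manual benchmark run. Per-frame times jitter
    around a base value, standing in for differences in the player's
    route and camera motion between runs. Returns average FPS."""
    rng = random.Random(seed)
    total_ms = sum(base_ms + rng.uniform(-jitter_ms, jitter_ms)
                   for _ in range(n_frames))
    return 1000.0 * n_frames / total_ms

# Five "identical" runs still land on slightly different averages.
runs = [simulate_run(seed) for seed in range(5)]
spread = max(runs) - min(runs)  # nonzero from pure noise alone
```

Real manual runs differ in content (route, combat, effects), not just noise, so their spread is larger than this statistical floor, which is exactly why a few-FPS gap between cards can drown in it.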

So yeah ... pick your poison.
 

tviceman

Diamond Member
Mar 25, 2008
6,672
439
126
www.facebook.com
Titan!!!!

LOOKS SUCH A BARGAIN! ILL TAKE 3!!!
Did you not read the news over the past several days, with Nvidia saying that the performance issues with Kepler in this game will be fixed via both drivers and game patches, or are you just trolling and spouting idiotically ill-formed opinions like always?
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,194
1
76
Did you not read the news over the past several days, with Nvidia saying that the performance issues with Kepler in this game will be fixed via both drivers and game patches, or are you just trolling and spouting idiotically ill-formed opinions like always?
We'll see if it pans out, but considering how Kepler-based GPUs have performed in recent DX11 games with advanced DX11 features, it could easily just be damage control.

I'm not saying they perform badly, either; just that GCN seems to run this stuff better.
 

blackened23

Diamond Member
Jul 26, 2011
8,556
0
0
Did you not read the news over the past several days, with Nvidia saying that the performance issues with Kepler in this game will be fixed via both drivers and game patches, or are you just trolling and spouting idiotically ill-formed opinions like always?
Agreed, I'm sure any performance issues with Kepler will be fixed. Dragon Age II had similar issues (performance anomalies on NV cards) at launch, and I won't get into my opinion of that game (not a good one), but the performance problems were fixed shortly after. Tomb Raider is a big title and NV will not ignore it.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,128
0
71
*GCN seems to run recent, heavily developed Gaming Evolved titles better than Kepler.

Let us not pretend the obvious isn't obvious.
 

sontin

Diamond Member
Sep 12, 2011
3,120
33
91
We'll see if it pans out, but considering how Kepler-based GPUs have performed in recent DX11 games with advanced DX11 features, it could easily just be damage control.

I'm not saying they perform badly, either; just that GCN seems to run this stuff better.
The only thing GCN runs better is the built-in benchmark. In-game, Kepler is on par, a little slower or a little faster, even with RegressFX.

BTW: A GTX 680 is on par with the 7970 GHz in Far Cry 3 and Crysis 3. :whistle:
 

notty22

Diamond Member
Jan 1, 2010
3,376
0
0
We'll see if it pans out, but considering how Kepler-based GPUs have performed in recent DX11 games with advanced DX11 features, it could easily just be damage control.

I'm not saying they perform badly, either; just that GCN seems to run this stuff better.
You mean Crysis 3 and Battlefield 3? And they look pretty good in the ArmA alpha. Or do you mean Dirt Showdown, LOL
 

Midwayman

Diamond Member
Jan 28, 2000
5,384
177
106
Regardless of who is faster, is nice hair worth a 28% drop in performance? What is the different between this technololgy and just 'manually' rendering hair nicer if the performance hit is so steep?
Yes, it's worth it, at least as long as you can still get >30 fps. It looks a lot better in game. There is no difference between this and having the CPU do the same thing; it's just less of a performance hit when it runs on the GPU. Or were you asking how it's different from solid, couple-of-bones-at-most hair?
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,194
1
76
The only thing GCN runs better is the built-in benchmark. In-game, Kepler is on par, a little slower or a little faster, even with RegressFX.

BTW: A GTX 680 is on par with the 7970 GHz in Far Cry 3 and Crysis 3. :whistle:
Good, it's more expensive.
 
