Bit-tech benches Tomb Raider: Legend

otispunkmeyer

Jun 14, 2003
10,442
0
0
this is a TWIMTBP game, and well, it's the usual story lol..... ATI does better.

http://www.bit-tech.net/gaming/2006/04/11/tomb_raider_legend_review/1.html

Why does this always happen? I think pretty much every TWIMTBP game runs better on ATI, and then the small selection of ATI-branded games runs just as well, if not better, on NV?

"So what's going on with NVIDIA you ask? Quite honestly, a balls-up of substantial proportions. The game is part of NVIDIA's The Way It's Meant To Be Played (TWIMTBP) scheme, which normally means some Green engineers have combed through the graphics code to ensure smooth running on NVIDIA-based graphics cards. The TWIMTBP logo is on the box, and we're subjected to the familiar animated "NVIDIA" whisper every time we load the game. However, the drama goes deeper.

When installing the game, users are presented with a splash screen, which says:

"The Tomb Raider: Legend development team and NVIDIA have worked closely together to provide a state-of-the-art gaming experience on the PC platform. Tomb Raider: Legend has the ability to use next-generation GPU features such as Direct X Shader Model 2 and 3, Hardware Shadow Support, and XHD Rendering at resolutions of up to 2560x1600. Eidos recommends having the latest NVIDIA GeForce graphics processor in your PC to enjoy Tomb Raider: Legend with all features enabled."

Pictured are two screenshots, showing Next Generation Content enabled and disabled. Naturally, the next-gen screenshots look far better and users are left with the impression that they will have a better gaming experience on NVIDIA hardware. Imagine our surprise when, having enabled Next Generation Content, the game suffers from routine pauses not just during action-packed sequences, but basic running and turning manoeuvres.

No matter what we did, we could not cure the problem. We tried multiple driver revisions. We dropped the resolution to a paltry 800x600. When even the insane grunt of GeForce 7900 GTX SLI failed to improve things, it was clear NVIDIA has driver problems with this game. For any random game, this would be an annoyance and you'd wait for a patch or fix. Since Legend is a TWIMTBP game, this performance issue is inexcusable, but given the splash screen extolling the virtues of playing the game on NVIDIA hardware (at up to 2560x1600 no less) this problem is downright reprehensible."



UPDATE

84.43 driver is available on nzone here

supposedly adds SLI profile for TR:Legends, and addresses some of the stuttering problems
 
Mar 19, 2003
18,289
2
71
They couldn't even get the 7800GT playable with the next gen content? I guess I don't feel so bad that my 6800GT was owned then :p

I've been wanting an X1900XT anyway...just gotta come up with the money for it now ;)
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
It's just another example of the total joke TWIMTBP is and illustrates nVidia's lacking drivers yet again.

If you play games that aren't actively benchmarked (like this one) and/or at settings that aren't benchmarked nVidia's driver has a tendency to horribly fall over.

ATi's drivers are far more consistent and just work well in far more games, whether or not they're benchmarked and regardless of the settings you run them at.

What's even worse is that nVidia releases official drivers only about once every four months so problems can take years to get fixed, if they ever are.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
How does bit-tech know that nVidia's poor performance with "next generation" enabled is driver related or that it will even be resolved?

Also why is it they always use an OC model to compare against a base XTX? Makes their gameplay evaluations that much more difficult to figure out considering the cheapest BFG 7900 GTX OC I could find on Froogle is $540: http://www.tankguys.biz/geforce-7900gtx-512mb-p-1558.html vs. $450 for the cheapest XTX available at Newegg. It would make much more sense if they put two cards with close to the same market value against each other (7900 GTX vs XTX) rather than a much costlier OC model.
 

Shortass

Senior member
May 13, 2004
908
0
76
Originally posted by: BFG10K
It's just another example of the total joke TWIMTBP is and illustrates nVidia's lacking drivers yet again.

If you play games that aren't actively benchmarked (like this one) and/or at settings that aren't benchmarked nVidia's driver has a tendency to horribly fall over.

ATi's drivers are far more consistent and just work well in far more games, whether or not they're benchmarked and regardless of the settings you run them at.

What's even worse is that nVidia releases official drivers only about once every four months so problems can take years to get fixed, if they ever are.

It isn't like ATI's had a clean slate concerning driver problems, though they have been rather good as of late (even if on games like Doom III it took a while for the playing field to even out).

Still, this is kind of silly. My 9800 Pro feels older than death now that I'm playing Oblivion, and while just a few months ago I thought ATI had been smoked it seems that NVidia's showings are far less appealing to me. When I upgrade this summer, unless things have radically changed (again) I'll certainly go for the x1900xt. Sorry NVidia, you lose this round again. Maybe in 4 years.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Chalk one more up for ATi and next gen support (this and their nice Oblivion patch for AA and HDR).
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
How does bit-tech know that nVidia's poor performance with "next generation" enabled is driver related or that it will even be resolved?

That is a good point actually.

We know they have the speed: with Next Generation Content disabled, the GeForce 7900 GTX is some 25% faster than the Radeon X1900XTX - a delta no doubt a direct result of NVIDIA's Developer Relations team. And once they release a new driver with proper Legend support, we should see the Next-Gen speed where it should be. We will update these results when NVIDIA does so.

That statement has all kinds of contradiction going on... First NV gains 25% without Next-Gen content because of their relationship, and then they don't have driver support for Next-Gen... Interesting speculation from bit-tech with nothing to back it up.
 
otispunkmeyer

Jun 14, 2003
10,442
0
0
Originally posted by: 5150Joker
How does bit-tech know that nVidia's poor performance with "next generation" enabled is driver related or that it will even be resolved?

Also why is it they always use an OC model to compare against a base XTX? Makes their gameplay evaluations that much more difficult to figure out considering the cheapest BFG 7900 GTX OC I could find on Froogle is $540: http://www.tankguys.biz/geforce-7900gtx-512mb-p-1558.html vs. $450 for the cheapest XTX available at Newegg. It would make much more sense if they put two cards with close to the same market value against each other (7900 GTX vs XTX) rather than a much costlier OC model.


I think they said that they even went down to 800x600 with SLI 7900 GTXs and it did the same thing. That says to me that it's not the cards lacking the grunt, just the driver being ****** up somewhere.

for a TWIMTBP game, you'd think the drivers were kosher

plus they even say they don't test apples to apples.

they just take some cards, do some tests, and tell you what they think are the best settings for each card. Not always the best method, I agree, but their way definitely complements other sites' reviews. Sites like Xbit tend to just crunch the numbers; bit-tech gives you some gameplay feedback.... which is nice

they don't set out to find a winner, they set out to tell people how, in their opinion, each card they have used plays the game
 

Sable

Golden Member
Jan 7, 2006
1,130
105
106
Originally posted by: otispunkmeyer
Originally posted by: 5150Joker
How does bit-tech know that nVidia's poor performance with "next generation" enabled is driver related or that it will even be resolved?

Also why is it they always use an OC model to compare against a base XTX? Makes their gameplay evaluations that much more difficult to figure out considering the cheapest BFG 7900 GTX OC I could find on Froogle is $540: http://www.tankguys.biz/geforce-7900gtx-512mb-p-1558.html vs. $450 for the cheapest XTX available at Newegg. It would make much more sense if they put two cards with close to the same market value against each other (7900 GTX vs XTX) rather than a much costlier OC model.


I think they said that they even went down to 800x600 with SLI 7900 GTXs and it did the same thing. That says to me that it's not the cards lacking the grunt, just the driver being ****** up somewhere.

for a TWIMTBP game, you'd think the drivers were kosher

plus they even say they don't test apples to apples.

they just take some cards, do some tests, and tell you what they think are the best settings for each card. Not always the best method, I agree, but their way definitely complements other sites' reviews. Sites like Xbit tend to just crunch the numbers; bit-tech gives you some gameplay feedback.... which is nice

they don't set out to find a winner, they set out to tell people how, in their opinion, each card they have used plays the game
Yeah, at 800x600 you're not telling me that the 7900GTX can't pump out decent rates with the next gen content enabled. Driver trouble is definitely indicated. Still pretty shoddy though.

I'm wondering what resolution/AA level I'm gonna have to drop to with my 7800GT to get the next gen stuff on. (Assuming this IS driver related and they fix it.)
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: nitromullet
How does bit-tech know that nVidia's poor performance with "next generation" enabled is driver related or that it will even be resolved?

That is a good point actually.

We know they have the speed: with Next Generation Content disabled, the GeForce 7900 GTX is some 25% faster than the Radeon X1900XTX - a delta no doubt a direct result of NVIDIA's Developer Relations team. And once they release a new driver with proper Legend support, we should see the Next-Gen speed where it should be. We will update these results when NVIDIA does so.

That statement has all kinds of contradiction going on... First NV gains 25% without Next-Gen content because of their relationship, and then they don't have driver support for Next-Gen... Interesting speculation from bit-tech with nothing to back it up.

Yeah this could be ATI's pixel shading power on the X1900 series finally giving them the advantage.

And to substantiate that claim, someone should throw an X1800 into the mix to see just how much faster the X1900 is.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Maybe the shader load/complexity is killing the 7900's, or maybe someone from the developers forgot not to use dynamic branching in a TWIMTBP game. Maybe even the NV driver team got an IF statement backwards, being too distracted by um... the game.
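Purely to illustrate what "dynamic branching killing the 7900s" would mean (my own toy sketch, nothing from bit-tech or either driver team): SM3.0 hardware takes branch decisions per batch of pixels, and if any pixel in a batch needs the expensive path, the whole batch pays for it. The G7x parts reportedly branch at far coarser granularity than R580, so a per-pixel IF that is supposed to skip work ends up skipping almost nothing. A little C model with made-up numbers:

#include <stdio.h>
#include <stdlib.h>

enum { NUM_PIXELS = 1024 * 768, CHEAP = 4, EXPENSIVE = 40 };   /* arbitrary cost units */

/* Estimated shading cost for one frame in which roughly 10% of pixels (randomly
 * scattered) want the expensive branch, on hardware that makes one branch
 * decision per batch_size pixels. */
static long frame_cost(int batch_size)
{
    long cost = 0;
    for (int base = 0; base < NUM_PIXELS; base += batch_size) {
        int expensive_batch = 0;
        for (int i = 0; i < batch_size; ++i) {
            if (rand() % 10 == 0) {        /* this pixel needs the long path... */
                expensive_batch = 1;       /* ...so the whole batch takes it */
                break;
            }
        }
        cost += (long)batch_size * (expensive_batch ? EXPENSIVE : CHEAP);
    }
    return cost;
}

int main(void)
{
    printf("fine branch granularity   (16-pixel batches):   %ld\n", frame_cost(16));
    printf("coarse branch granularity (1024-pixel batches): %ld\n", frame_cost(1024));
    return 0;
}

With fine batches, most of the frame still gets the cheap path; with coarse batches, nearly every batch contains at least one expensive pixel, so the branch buys you almost nothing.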

At any rate, you can expect some kind of fix, especially if this game shows up as a benchmark on a number of sites. I'd be curious to know what the issue is with NV cards in this game.
 

nib95

Senior member
Jan 31, 2006
997
0
0
There's actually a new patch that's been released for the game that apparently nearly doubles the fps, or more, with Nvidia cards.
Someone above is right, it's some sort of driver conflict error.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Not to crap on the thread...

"A quick note about the Xbox 360 version of Tomb Raider: Legend. In short, it looks stunning. It includes all of the above Next Generation Content and arguably looks slightly better somehow. We had no problems running at 720p HD (1280x720) and found the gamepad controls to be a more natural way of throwing Lara around the screen than keyboard and mouse.

The Xbox 360 version of the game may cost £49.99 compared with £29.99 for the PC edition but it is obvious that no PC costing "only" £279.99 could possibly hope to run the game at the same level of detail or speed."

Also,

Playing at 1024x768 or 1280x1024 with high detail level and even anti-aliasing should be within the grasp of most people. However, crank up the Next Generation Content and you will need to invest a four-figure sum in a PC to play it at a decent resolution and framerate.

So much for the "consoles suck" argument.

I personally think that in shader intensive games ATI owns. TR: Legends is just another example.

So now we have ATI > Nvidia in:

Far Cry, BF2, COD2, FEAR, Oblivion, TR: Legends. Another repeat of X850XT > 6800U. Add AA+HDR support and high Q AF, and X1900XT is looking better and better.
 

RobertR1

Golden Member
Oct 22, 2004
1,113
1
81
This is sad for nvidia. BTW, that game looks like crap without the "next gen content." Interesting note about the Xbox 360. I'd certainly pick this game up for the Xbox 360 rather than the PC.
 

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
Err, I just got the game and played it with my Xbox 360 USB controller. Didn't like the keyboard controls :! Pretty good game I must say, but I finished it in 7hrs : )

It's short, but the best Tomb Raider game I must say : )

 

Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
Originally posted by: RussianSensation
Not to crap on the thread...

"A quick note about the Xbox 360 version of Tomb Raider: Legend. In short, it looks stunning. It includes all of the above Next Generation Content and arguably looks slightly better somehow. We had no problems running at 720p HD (1280x720) and found the gamepad controls to be a more natural way of throwing Lara around the screen than keyboard and mouse.

The Xbox 360 version of the game may cost £49.99 compared with £29.99 for the PC edition but it is obvious that no PC costing "only" £279.99 could possibly hope to run the game at the same level of detail or speed."

Also,

Playing at 1024x768 or 1280x1024 with high detail level and even anti-aliasing should be within the grasp of most people. However, crank up the Next Generation Content and you will need to invest a four-figure sum in a PC to play it at a decent resolution and framerate.

So much for the "consoles suck" argument.

I personally think that in shader intensive games ATI owns. TR: Legends is just another example.

So now we have ATI > Nvidia in:

Far Cry, BF2, COD2, FEAR, Oblivion, TR: Legends. Another repeat of X850XT > 6800U. Add AA+HDR support and high Q AF, and X1900XT is looking better and better.

It seems you forgot that you have to play the Xbox 360 on a TV, didn't you?

And not just some crappy 15" screen, but an HDTV so you can enjoy the high resolutions that PC users do. After doing that, try factoring in a reasonably priced one at maybe $2000-$3000!

I've actually found that the 1.1 patch for TRL doesn't do crap all. My 7800GTX can barely play it at 1680x1050 widescreen with everything turned on. It's actually doing far worse than Oblivion, and that's far better looking!

I'm also getting some crappy audio problems with clicking, stuttering, hissing and whatnot. And from what I've read around the web, it's happening to everyone who has Creative cards... which means... well, quite a lot of ppl!

 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Far Cry, BF2, COD2, FEAR, Oblivion, TR: Legends.

All but one of those are XB360 games.... hmmmm. Until the PS3 is near, expect all next-gen content to be ATi-optimized, particularly for the R580 parts, as those are what devs have been working with until relatively recently (the second and third gen 360 titles will benefit the R600 more).

And not just some crappy 15" screen, but an HDTV so you can enjoy the high resolutions that PC users do. After doing that, try factoring in a reasonably priced one at maybe $2000-$3000!

They aren't including the price of the monitor in their PC specs, and HDTVs can be used for other things. Also, my widescreen 30" HDTV cost me less than $900. Sure, the 50" SXRD that is going in my living room is $3.5K, but that is an awful lot less than a 50" monitor :)

Maybe the shader load/complexity is killing the 7900's,

@800x600 running SLI? No. If you look at the average framerate, the 7900 is faster @1600x1200 than the x1900xt is @1920x1200 - clearly, after reducing the resolution by 75%, nV SHOULD be screaming fast - there is clearly a driver bug.
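For anyone wondering where the 75% comes from, it's just pixel count (my arithmetic, not bit-tech's):

\[
\frac{800 \times 600}{1600 \times 1200} = \frac{480\,000}{1\,920\,000} = 0.25
\]

i.e. 800x600 pushes only a quarter of the pixels of 1600x1200, a 75% reduction per frame.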

It's just another example of the total joke TWIMTBP is and illustrates nVidia's lacking drivers yet again.

If you play games that aren't actively benchmarked (like this one) and/or at settings that aren't benchmarked nVidia's driver has a tendency to horribly fall over.

Nothing at all like GITG. Take a look at how ATi was throttling nVidia in B&W2..... well...... after you waited a couple of months for the game to be patched twice and new drivers anyway :p
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Ben, are you sure ATI "needs" XB360 game support to show its relative superiority in shader-intensive games?.. (and still we haven't seen a serious dynamic branching implementation..)
Dunno about TR: Legend, it's probably a driver issue..
But Nvidia seems to be behind in the vast majority of d3d9 games, and this is not a new phenomenon to anyone.. It goes back a long time..
I don't agree with the driver part that many mention, I believe both companies have equally "good" or "bad" drivers nowadays..
It's sad though to see a game being advertised as TWIMTBP and lagging behind at its launch..
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Ben, are you sure ATI "needs" XB360 game support to show its relative superiority in shader-intensive games?

Latest FEAR numbers. Worst case scenario for nV is the 7900GTX loses out to the X1900XTX by 2FPS. That is a stock-clocked GTX, not one of the OC models. The only other shader-intensive game that ATi has a real edge in that isn't on the 360 is SC:CT - a 4FPS difference at most. Much like B&W2 was for ATi, nVidia had driver issues that were holding their parts back, which everyone who pulls for team red wants to forget about.

In terms of hardware implementation, the 7900GTX and x1900xtx have nigh equal shader power; branching would be the only area where ATi may take a big lead (and even then it depends on exactly how it is implemented). ATi's huge edge in shader performance is a myth at the hardware level. Anyone who really looked over the publicly available documentation has known this for quite a while. So I would say at this point it appears that yes, ATi does need the 360 to show its shader supremacy.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Nothing at all like GITG. Take a look at how ATi was throttling nVidia in B&W2..... well...... after you waited a couple of months for the game to be patched twice and new drivers anyway
Sure but again this seems to be the exception, not the norm. There are a lot of TWIMTBP games that run better on ATi even to this day.

As for B&W2, look at nVidia's latest driver readme, under known issues:

• GeForce 7800 GTX/GT, GeForce 6800, SLI: Water reflections are corrupted in Black & White 2.

In addition, multiple other problems with B&W2 have been fixed, problems that likely existed when the initial batch of benchmarks was run:

• GeForce 7800 GT/GTX, SLI: Stuttering occurs with Black & White 2

Just how legit were nVidia's past numbers in this game?

I also notice there are quite a few examples of the driver falling over when 2560x1600 is used which again isn't a resolution that was benchmarked before.

Like I said before, ATi's drivers are far more robust and stand up far better under a wide range of scenarios. nVidia's aggressive application detection and per-app optimization is coming back to bite them in the butt, just like I argued it would in the past.
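To spell out what "per-app optimization" means here (a purely conceptual C sketch of mine, not actual driver code, and the exe names are illustrative guesses): the driver keys tuned settings off the executable it sees, and a game or setting nobody profiled falls through to generic defaults, which is exactly the unbenchmarked-title problem:

#include <stdio.h>
#include <string.h>

struct app_profile {
    const char *exe_name;   /* matched against the running game's executable */
    int sli_mode;           /* 0 = no SLI profile (single-GPU fallback), 1 = AFR, 2 = SFR */
    int tuned_shaders;      /* hand-tuned shader/settings swaps enabled? */
};

/* Hypothetical profile table; the real driver's database is opaque to us. */
static const struct app_profile profiles[] = {
    { "fear.exe",     1, 1 },   /* heavily benchmarked title: tuned profile exists */
    { "3dmark06.exe", 1, 1 },
    /* Tomb Raider: Legend missing at launch -> no entry */
};

static struct app_profile lookup(const char *exe)
{
    for (size_t i = 0; i < sizeof profiles / sizeof profiles[0]; ++i)
        if (strcmp(profiles[i].exe_name, exe) == 0)
            return profiles[i];
    /* Unrecognised executable: generic, untuned code paths. */
    struct app_profile generic = { exe, 0, 0 };
    return generic;
}

int main(void)
{
    struct app_profile p = lookup("trl.exe");   /* hypothetical TR:L exe name */
    printf("sli_mode=%d tuned_shaders=%d\n", p.sli_mode, p.tuned_shaders);
    return 0;
}

A driver built around lookups like this flies in the profiled cases and falls over everywhere else, which is the pattern being described above.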

My X800XL had problems that could be counted on one hand but I need a driver list document to keep track of the issues my 7800GT has. The only reason I put up with it is because it's silent and because 16xAA looks gorgeous when you can use it.

Latest FEAR numbers.
Remember that most benchmarks are run under quality mode which can cause visible shimmering in certain situations. Running under high quality generally cures it but can cause performance to tank.
 

gi0rgi0

Golden Member
Dec 5, 2004
1,240
0
0
I'm surprised the game looks better on the X360. Over on the 360 forums, people with both high-end PCs and 360s said the 360 demo didn't look as good as the PC demo. Like dripping water and stuff. Was that just cuz it was a demo then?
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: otispunkmeyer

UPDATE

84.43 driver is available on nzone here

supposedly adds SLI profile for TR:Legends, and addresses some of the stuttering problems

You can make a profile, which I'm sure the reviewers did. Reports from another forum are that they didn't help frames at all. Which stinks for NV users. Good thing is.. the game isn't very long, and the stuttering is claimed to be fixed. But not the overall low frames. But bit-tech is supposed to re-review it, so we'll see. Hopefully it brings performance up.

edit, update from bit-tech:
http://www.bit-tech.net/news/2006/04/12/nvidia_fixes_tomb_raider_legend/