[gamegpu.ru] Tomb Raider Benchmarks GPU / CPU

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
Sorry, I can't see the link; Chrome or one of my extensions must think it is evil.
 

Greenlepricon

Senior member
Aug 1, 2012
468
0
0
Oh my goodness. I'm referring to the CPU benchmark test that this website performed: running lower resolutions to eliminate the GPU as a factor in determining CPU performance is a normal practice. I really don't feel an FX CPU would be anywhere near a 3970X. Not even close. Yet this test makes them seem even by benchmarking the CPU at a GPU-limited resolution...

FX has done pretty well in recent games that can use the threads, but overall I don't think it's a big deal. I haven't heard anything about how this game is threaded, but a 690 gave up at just over 100 fps, so I'm assuming the CPUs (at stock, nonetheless) still have a little power left in them. I mean, even the old Phenom IIs made it up there. I don't think the CPU will be the issue in 99% of cases.
 

Dravonic

Member
Feb 26, 2013
84
0
0
Oh my goodness. I'm referring to the CPU benchmark test that this website performed: running lower resolutions to eliminate the GPU as a factor in determining CPU performance is a normal practice. I really don't feel an FX CPU would be anywhere near a 3970X. Not even close. Yet this test makes them seem even by benchmarking the CPU at a GPU-limited resolution...

Yes, for benchmark purposes it makes sense. But again, from a gaming standpoint, what is the use of knowing a 3970X will give you 400 fps at 640x480 and an FX-8350 will give you 300? Since it's useless information, why should they even bother to gather it?

It makes sense to benchmark them at real world resolutions to find out if any of them is a bottleneck. That is relevant to gaming.
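
To make the bottleneck argument concrete, here's a minimal sketch with hypothetical numbers, treating the delivered frame rate as the minimum of a CPU-limited rate (roughly resolution-independent) and a GPU-limited rate (which falls as resolution rises):

```python
# Toy bottleneck model with hypothetical numbers: the frame rate you
# see is capped by whichever component is slower.
cpu_fps = {"3970X": 400, "FX-8350": 300}      # roughly resolution-independent
gpu_fps = {"640x480": 900, "1920x1080": 90}   # drops as resolution rises

for res, g in gpu_fps.items():
    for cpu, c in cpu_fps.items():
        print(f"{cpu} @ {res}: {min(c, g)} fps")
# 640x480 exposes the CPU gap (400 vs 300 fps); at 1920x1080 both
# CPUs deliver the same GPU-limited 90 fps.
```

Low-res runs expose the CPU gap; real-world resolutions show whether that gap ever matters.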
 

Imouto

Golden Member
Jul 6, 2011
1,241
2
81
Oh my goodness. I'm referring to the CPU benchmark test that this website performed: running lower resolutions to eliminate the GPU as a factor in determining CPU performance is a normal practice. I really don't feel an FX CPU would be anywhere near a 3970X. Not even close. Yet this test makes them seem even by benchmarking the CPU at a GPU-limited resolution...

Now seriously, what's the point of super-low-res benches? It's like synthetic benches, or even some built-in game benches.

I'd like tech sites to be honest about stuff like this. I won't run my games at 64x WTFAA or couple my TNT2 with a 1440p monitor.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
626
126
The inference is supposed to be that if one CPU gives you 400 fps versus 300 fps on another in non-GPU-limited scenarios, then in some other, CPU-limited game you might get better performance. But I find this type of thing basically useless. Why not just bench the game at resolutions people play? Isn't that what actually matters?
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
I'm actually starting to think an 8320 instead of Haswell might be a good idea. People are saying AVX2 won't matter in games, but I thought it would... I wish I knew :(
 

Greenlepricon

Senior member
Aug 1, 2012
468
0
0
I'm actually starting to think an 8320 instead of Haswell might be a good idea. People are saying AVX2 won't matter in games, but I thought it would... I wish I knew :(

Sadly, we probably won't know for another year :(. Games like this and Crysis are making me happy about getting an 8320, but I was so disappointed most of last year with my decision to get an AM3+ mobo early on. Now it's finally getting to be worth something. They're all great processors, but I'm sure Intel doesn't want to let this slide. Worth waiting and watching, in my opinion.
 

Souv

Member
Nov 7, 2012
125
0
0
Stupendous job, AMD. GCN is balanced between gaming and compute, good for both gaming and scientific work (FP64, Bitcoin mining, etc.), whereas Kepler is crippled in compute. Hell, even Fermi is better at compute than Kepler.

Another good compute example is Hitman: Absolution: http://www.techspot.com/review/608-/page2.html

This game is a must-play. Tomb Raider is getting great reviews from everywhere (unlike the recently released Crysis 3, which got mixed reviews). It's a must-buy for gamers.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
Nvidia's performance will be improved partly through drivers and partly through a developer patch:

NVIDIA GPU Performance Issues in the new Tomb Raider
The new Tomb Raider reboot was released last night and has so far been praised as a good game that brings back the nostalgia of the old Tomb Raider games. However, in our testing today, we have become aware of performance issues on NVIDIA GPUs while playing this game. We reached out to NVIDIA for answers and some information for you: if you are experiencing slower-than-expected performance, it is not just you. We asked NVIDIA what the best driver for the game is right now, and NVIDIA's response is as follows:

"Hi Brent. You can try using the latest beta driver however, please note that we are aware of major performance and stability issues with GeForce GPUs running Tomb Raider with maximum settings. Unfortunately, NVIDIA didn't receive final code until this past weekend which substantially decreased stability, image quality and performance over a build we were previously provided. We are working closely with Crystal Dynamics to address and resolve all game issues as quickly as possible. In the meantime, we would like to apologize to GeForce users that are not able to have a great experience playing Tomb Raider, as they have come to expect with all of their favorite PC games."

We then followed up asking about an ETA for the new driver, and got this response:

"Hi Brent, this isn't solely a driver issue. The developer needs to make some changes on their end to fix their issues on GeForce GPUs. This will require code changes to the game."

So there you go: it is not only going to take a driver update, but also a patch for the game to get the most out of NVIDIA GeForce GPUs. I myself am experiencing slow performance (i.e., 20s FPS) with GTX 680 SLI at maxed-out in-game settings with FXAA. There is definitely a performance issue that needs resolving. The only thing you can do right now, if you are experiencing slow performance, is turn TressFX off (back to Normal) and try lowering in-game settings, starting with SSAO, to try and improve performance. We will have a full evaluation of this game, its performance and IQ next week, with tips on how to get the most out of it performance-wise.

http://www.hardocp.com/
 

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
I've heard that using the older 310 driver works, or that turning off tessellation stops it from crashing.

It also works fine on the Fermi series from what I've heard, and I'd bet it probably works fine on the GTX Titan.

I had thought this was going to be a DX9 game, because Wikipedia once said it used the same engine as Underworld.

In any event, I think it is good to see that one vendor doesn't play it as well as the other, because that means AMD hardware is more distinct from NV hardware than I had thought.
 

Olikan

Platinum Member
Sep 23, 2011
2,023
275
126
FX has done pretty well in recent games that can use the threads, but overall I don't think it's a big deal.

Yeah... I wonder if the PS4 and Xbox 720, both with 8 cores, will actually make developers use more cores.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I think it is good to see that one vendor doesn't play it as well as the other

Huh? If you're talking performance differences, let's face it: that will happen sometimes. However, with regard to stability, which is apparently a big issue, I kinda think developers should look out for PC gamers as a whole. For their customers it shouldn't be a matter of what hardware they have; they should have a stable gameplay experience regardless of GPU brand.

Now, performance differences will happen; I'm referring to crashes and stability issues, and I don't for a second think those are good for anyone. Square Enix should look out for their customers as a whole with extensive play testing, IMO. The fact that all GTX 600 cards are crashing periodically in the game is an incredible failing on the part of Square Enix. This should not happen, regardless of which GPU brand sponsors the title!
 

MBrown

Diamond Member
Jul 5, 2001
5,726
35
91
I'm not sure how they were able to get 65 fps at 1080p with a 7950 on VHQ (I'm assuming they mean ultra settings), but I sure can't. I get around 50 to 60. It's not that much of a difference, but I like to run vsync, so I turn some of the settings down.

What settings are you guys running this game at?
 

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
Hope I like it; this looks like a worthwhile game. 2013 is starting off pretty strong, with BioShock Infinite up next.
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
I've been trying to say this for years: the polarization of PC gaming is worrisome. It's basically turning into "this game is Nvidia certified, this one is AMD certified." If you don't have the right hardware, well, too bad for you; expect to be miserable trying to play the game.

On the plus side, at least this title WILL work with all the eye candy on Nvidia hardware. AMD could have easily pulled an Nvidia and just disabled the L'OréalFX altogether on GeForce cards.

Yeah, as much as I like how The Way It's Meant To Be Played and now Gaming Evolved get cutting-edge effects into games, I can't really condone a situation where one brand is clearly at a purposeful disadvantage, by either program.
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
I'm actually starting to think an 8320 instead of Haswell might be a good idea

Not so sure about that. I think Intel will still lead the way, but at the same time AMD will improve the most; the delta won't be as big as in the last few years.
AMD has finally realised how important it is to have game developers behind them.
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
Now seriously, what's the point of super-low-res benches?

We all know that this method is used to assess CPU performance, but I agree that we need all sites to start testing real-world scenarios. I don't game at 640x480; I game at high-def resolutions, so it would be much more beneficial for me to have access to benchmarks at resolutions similar to mine.
 
Feb 19, 2009
10,457
10
76
I like the hair FX, but I have to see how much of a performance hit it takes to enable. SSAA is also overkill. Some features just hammer GPUs way too much for the minor gains.
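
For reference, the usual way to quantify that hit once numbers are available is a simple percentage loss; a trivial sketch with hypothetical frame rates:

```python
# Hypothetical frame rates; substitute real benchmark numbers.
fps_off, fps_on = 60.0, 42.0
loss_pct = (fps_off - fps_on) / fps_off * 100
print(f"Enabling the feature costs {loss_pct:.0f}% performance")  # 30%
```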
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I would wait until NVIDIA has a chance to close a significant portion of the gap when they get time to release optimized drivers.

What I am more interested in is whether a few more Gaming Evolved releases like this will shift the stigma of "bad drivers" to NVIDIA. AMD has taken crap for years because they could not optimize TWIMTBP titles until after release, so their day-one drivers had issues.

With the momentum shifting to Gaming Evolved, will people be as negative towards NVIDIA as they were towards AMD for the same issue?

No. It will be AMD's fault. nVidia marketing will make sure of that.
 

KompuKare

Golden Member
Jul 28, 2009
1,224
1,582
136
Square Enix should look out for their customers as a whole with extensive play testing, IMO. The fact that all GTX 600 cards are crashing periodically in the game is an incredible failing on the part of Square Enix. This should not happen, regardless of which GPU brand sponsors the title!

Of course, if what Anarchist420 mentioned is true (that the older 310 drivers don't crash), then it is always possible that the latest drivers are causing the crashes and instability. Maybe NV rushed those through but didn't test them that well, or some other variable separates those who are getting crashes from those who aren't (that is, if not everyone is getting consistent crashes).

The point being that while it's always possible the developer didn't do a good job, there are other possibilities.

As regards CPU performance, my concern isn't whether the i5K/i7K are better than the FX-83x0, but rather whether the FX-4300/FX-6300 now make sense versus the i3. Intel has the high end pretty much to itself, but I'm always more interested in the low-to-mid range. The halo effect of the i5/i7 seems to mean that everyone automatically recommends an i3 or below over all AMD alternatives.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
Hm, quite different results from PCGH:
http://www.pcgameshardware.de/AMD-R...mb-Raider-PC-Grafikkarten-Benchmarks-1058878/

[Four benchmark charts from PCGH]


The integrated benchmark is not really representative of gameplay performance. The 680 holds its own against the 7970 GE without TressFX, and against the 7970 with TressFX; but when you enable 4xOGSSAA, AMD can put its higher GFLOPS to use and pulls ahead accordingly.

Titan is 20% faster than the 7970 GE, but clocks were fixed at 876 MHz for Titan. Let it boost to 1006 MHz and it's 40% faster (not shown).
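
For what it's worth, the arithmetic behind that figure, assuming performance scales roughly linearly with clock (an approximation, since real boost behavior varies):

```python
# Rough check of the Titan numbers under an assumed linear clock scaling.
fixed_advantage = 1.20     # Titan vs 7970 GE with Titan fixed at 876 MHz
clock_ratio = 1006 / 876   # letting boost run to 1006 MHz: ~1.15x
print(f"~{(fixed_advantage * clock_ratio - 1) * 100:.0f}% faster")  # ~38%, i.e. roughly 40%
```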
 
Feb 19, 2009
10,457
10
76
So according to PCGH, NV has no problems with Tomb Raider, and they won't have to blame the game's poor code (requiring a patch) or their lack of driver support because it's a GE title. All is fine?