Borderlands 2 GPU/CPU benchmarks [TechSpot/HardOCP/Others]


boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
You were complaining about the performance hit of it though, weren't you? Why not just use the SMAA injector? It looks better than FXAA and has really no performance hit that I can tell. Heck, I am averaging about 60 fps on max settings and high PhysX at 1080p with my wimpy system.

Yes, but I'll take the drops over aliasing every day of the week. I can't stand SMAA/FXAA/MLAA etc.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Using PhysX on High makes it more GPU-limited, especially when using an NVIDIA GPU, and since this game uses Unreal Engine 3 it scales with more cores (I'd say up to 4) and more cache.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
Btw has anyone else noticed this:

When fps are low for no apparent reason (with PhysX high), for example when looking at a piece of cloth, and you alt-tab out of the game and go back in, fps are quite a bit higher?
 

WMD

Senior member
Apr 13, 2011
476
0
0
GCN isn't running very well yet, but different bench, different results.


It's not a very demanding game at all for the GPU; a stock 5850 will run it great maxed at 1080p. OC it and it's still 60+ fps.

Good to know that. My 5850 running overclocked at 1680x1050 should still manage 70+fps. May have to postpone my upgrade until the 8xxx series.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
Good to know that. My 5850 running overclocked at 1680x1050 should still manage 70+fps. May have to postpone my upgrade until the 8xxx series.

Don't forget that that's with the massively overclocked i7 3930k @4.8GHz - 6 core/12 threads. Every other processor below that and every clock speed decrement decreases the FPS significantly. I would venture a guess with a 5850 you're not running a 3930k @4.8. ;)
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Yes, but I'll take the drops over aliasing every day of the week. I can't stand SMAA/FXAA/MLAA etc.
I don't know what you could possibly see wrong with SMAA in Borderlands 2. It does a way better job than FXAA, and I doubt your AA method looks better in a way that is noticeable at all without an in-depth screenshot analysis. So yeah, I would rather play my game without framerate drops that are noticeable than worry about some AA method that is not.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
FXAA and SMAA do little to combat aliasing in motion. Screenshots are unfit to properly judge AA methods. Just try it, force SGSSAA with the C1 bits and see for yourself.

I'll try SMAA now, but I know how that's gonna turn out ;)

Edit:
SMAA looks good in this game, but it doesn't completely smooth the finer lines and details.
It does look very good together with 1.5x1.5 downsampling, though. In fact, it looks better than SGSSAA, as the contour lines are smoothed better by the downsampling. It also performs a bit better.
 
Last edited:

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
So the Nvidia cards are rendering AND running PhysX while the AMD equipped systems are only rendering and the CPU is running PhysX.
I think Railven is right. Techspot did a real shoddy job.

http://forums.anandtech.com/showpost.php?p=34000618&postcount=10

The worst part is that PhysX being CUDA-exclusive for GPU acceleration is nothing new... it's been like that since NVIDIA took over AGEIA and will be like that for as long as CUDA is around... GCN can't suddenly start to run CUDA...
 

WMD

Senior member
Apr 13, 2011
476
0
0
Don't forget that that's with the massively overclocked i7 3930k @4.8GHz - 6 core/12 threads. Every other processor below that and every clock speed decrement decreases the FPS significantly. I would venture a guess with a 5850 you're not running a 3930k @4.8. ;)

No worries. I am running i5 2500K at 4.9ghz with 4GB of Gskill ripjaws at 2133mhz.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
The worst part is that PhysX being CUDA-exclusive for GPU acceleration is nothing new... it's been like that since NVIDIA took over AGEIA and will be like that for as long as CUDA is around... GCN can't suddenly start to run CUDA...

Another thing I've noticed is that when I manually choose "CPU" for my PhysX choice, the in-game PhysX level defaults to LOW and is unchangeable. Sort of grayed out.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Another thing I've noticed is that when I manually choose "CPU" for my PhysX choice, the in-game PhysX level defaults to LOW and is unchangeable. Sort of grayed out.

Edit the *.ini files indicated in the TechSpot article and you'll experience the glory of PhysX via CPU, which translates into a HORRIBLE gaming experience, haha.
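If anyone wants to try it without digging through the article, the change is roughly the following. I'm going from memory here, so treat the exact file and section names as approximate and double-check them against the TechSpot piece. In Documents\My Games\Borderlands 2\WillowGame\Config\WillowEngine.ini, find the PhysX entry and set:

PhysXLevel=2

where 0 = Low, 1 = Medium and 2 = High. Forcing 1 or 2 while PhysX runs on the CPU is exactly what produces the slideshow.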

I really wish the author would acknowledge the comments left (seems I wasn't the only one calling him out) and edit his article. People are going to reference it wrongly, as I've already seen twice on these forums.
 

Makaveli

Diamond Member
Feb 8, 2002
4,947
1,533
136
Doesn't it make more sense to sell the 6950 and pick up a GTX670 in this case? $150 from the sale of the 6950 + $200 = almost a GTX670. This way you still get PhysX in BL2/Batman AC and faster performance elsewhere too.

That is a great idea, but I'm kind of conflicted at the moment.

My intention was probably to pick up a 7970 card at some point in time.

I may still pick up a GTX650 card for something like $140; I'm just not sure where the performance level is with these NV cards and PhysX when using one as a secondary card.

Another thing to consider is that I never use stock cooling and my current card is whisper quiet. Decisions, decisions.
 
Last edited:

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
Edit the *.ini files indicated in the TechSpot article and you'll experience the glory of PhysX via CPU, which translates into a HORRIBLE gaming experience, haha.

I really wish the author would acknowledge the comments left (seems I wasn't the only one calling him out) and edit his article. People are going to reference it wrongly, as I've already seen twice on these forums.

What happens when setting PhysX to high and then setting it to CPU in the control panel (in the game it still says high, but greyed out, as the other guy mentioned)? I got slowdowns to 35 fps, which suggests to me that something was being calculated on the CPU. With GPU PhysX, I got 45 fps.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
What happens when setting PhysX to high and then setting it to CPU in the control panel (in the game it still says high, but greyed out, as the other guy mentioned)? I got slowdowns to 35 fps, which suggests to me that something was being calculated on the CPU. With GPU PhysX, I got 45 fps.

From my understanding, half of the PhysX effects are run on the CPU, such as debris and cloth. I could run PhysX High on my modded system before I got the mod to work and had a fine 60 FPS, UNTIL I got shot and bled on the floor and my FPS slowly started to crawl.

The liquid PhysX effect is very taxing. Go shoot up a few goons, or use a gel-based weapon if you have one, and report your FPS then. I went from 60 FPS (vsync on) to a low of 22 FPS.

With the mod fixed, on a GTX 460, the lowest I dropped to was 55 FPS. Also, I don't use an insane resolution or AA like I've seen you use.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
What happens when setting PhysX to high and then setting it to CPU in the control panel (in the game it still says high, but greyed out, as the other guy mentioned)? I got slowdowns to 35 fps, which suggests to me that something was being calculated on the CPU. With GPU PhysX, I got 45 fps.

Some features don't have a CPU path anymore, so you will miss out on the effects that require the most computation.
At least that was how it was the last time I played with the PhysX SDK... but it has been a while.

Tearable cloth, for example, can drag any CPU to its knees.
 

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
So I bought a GTX670 just for this game :) It's the ASUS DirectCU non-TOP version and I absolutely love the card!! My previous Toxic HD5850 was no slouch but this thing is just awesome :D

Borderlands 2 runs all maxed (PhysX high) at 1080p usually 50-60FPS. I cap it at 60FPS, no point in having the card produce more anyway. However there are some places where the FPS drops to under 30 (I've seen as low as 25...), especially when there's a nice vista (like the very first "city" you find). My CPU runs at 3.2GHz but I guess the Core2 architecture on my Q9450 is showing its age here? When the game slows down this much, GPU usage usually doesn't go above 50%... I've seen it as low as 35% D:

However, 99% of time the game runs brilliantly - I really really like the debris and particles and all the other cool PhysX effects. Sure, they are eye candy only, but so are all the other settings too... AA, texture, LOD, etc.

I don't regret getting the GTX670 one bit :) It allowed me to enjoy Borderlands 2 more than I would with the Radeon. Plus all the other games run better too ofc (what a difference in Skyrim for example!).
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Some guy at GameGpu disabled 4 cores of his Phenom II X6 1045T + GTX570 setup. He said with 6 Phenom II cores enabled, he was getting 50% utilization on his GTX570 but with 2 cores on the Phenom II, frames nearly doubled. Just a heads up for Phenom II X4/X6 owners as a possible fix to improve some of that CPU performance.

Looks like this game doesn't care at all for more than 2 cores. What it wants is lots of fast CPU cache, fast clock speeds and strong IPC per core. The reason the overclocked i7 3930K is so fast has nothing to do with its 6-core/12-thread specification but most likely its 12 MB cache. The i7 3960X has 15 MB and should be even faster. Phenom X2 > X4/X6. Seems like this game is just another typical DX9-coded console port that doesn't take any advantage of modern multi-core CPUs (without PhysX).
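For Phenom II X4/X6 owners who don't want to disable cores in the BIOS, pinning the game to two cores should give the same effect. This is just standard Windows stuff, nothing BL2-specific: either Task Manager > right-click Borderlands2.exe > Set Affinity and leave only two cores ticked, or launch it from a command prompt with an affinity mask:

start "" /affinity 3 Borderlands2.exe

(3 is the hex mask for cores 0 and 1; run it from the game's Binaries folder or give the full path to the exe.) Worth testing before writing off a Phenom II.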

 
Last edited:

SPBHM

Diamond Member
Sep 12, 2012
5,065
418
126
Some guy at GameGpu disabled 4 cores of his Phenom II X6 1045T + GTX570 setup. He said with 6 Phenom II cores enabled, he was getting 50% utilization on his GTX570 but with 2 cores on the Phenom II, frames nearly doubled. Just a heads up for Phenom II X4/X6 owners as a possible fix to improve some of that CPU performance.

Looks like this game doesn't care at all for more than 2 cores. What it wants is lots of fast CPU cache, fast clock speeds and strong IPC per core. The reason the overclocked i7 3930K is so fast has nothing to do with its 6-core/12-thread specification but most likely its 12 MB cache. The i7 3960X has 15 MB and should be even faster. Phenom X2 > X4/X6. Seems like this game is just another typical DX9-coded console port that doesn't take any advantage of modern multi-core CPUs (without PhysX).


TechSpot's results for the PII X2 are far worse than the X4/X6...

Anyway, here's a Core i3 running the game (with PhysX on high too)...
http://www.youtube.com/watch?v=WWjOgWnt5W8
 

Rhoxed

Golden Member
Jun 23, 2007
1,051
3
81
Interesting results.
For anyone interested, I am playing this on my sig system:
1090T @ 4.1GHz, triple 4850s @ stock
1920x1080, everything maxed (PhysX on low)
Avg: 64 - Min: 44 - Max: 88

Then I edited the .ini file and forced PhysX to medium:
Avg: 55 - Min: 36 - Max: 79

(didn't bother with High since medium is JUST playable)

A couple of notes: CPU load and GPU load both stayed the same after the .ini change, CPU @ ~28% and GPUs @ ~85%.

The game loses Crossfire after any cinematic or load screen (25 fps avg); a simple alt-tab would bring back the other 2 cards. Enabling PhysX through the .ini somehow fixed this; the cards no longer drop after load screens.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Some guy at GameGpu disabled 4 cores of his Phenom II X6 1045T + GTX570 setup. He said with 6 Phenom II cores enabled, he was getting 50% utilization on his GTX570 but with 2 cores on the Phenom II, frames nearly doubled. Just a heads up for Phenom II X4/X6 owners as a possible fix to improve some of that CPU performance.

Looks like this game doesn't care at all for more than 2 cores. What it wants is lots of fast CPU cache, fast clock speeds and strong IPC per core. The reason the overclocked i7 3930K is so fast has nothing to do with its 6-core/12-thread specification but most likely its 12 MB cache. The i7 3960X has 15 MB and should be even faster. Phenom X2 > X4/X6. Seems like this game is just another typical DX9-coded console port that doesn't take any advantage of modern multi-core CPUs (without PhysX).



That is curious. Any idea why the other cores hurt performance? Were the threads jumping around for some unknown reason? I mean, the extra cores should just sit idle if not needed. Seems odd to me.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
So the Nvidia cards are rendering AND running PhysX while the AMD equipped systems are only rendering and the CPU is running PhysX at its lowest setting.
I think Railven is right. Techspot did a real shoddy job.

http://forums.anandtech.com/showpost.php?p=34000618&postcount=10

EDIT: bold above

It's expected that a 336 SP GTX460 should be faster for PhysX than a quad-core i7 CPU. That's not the real story here. The real story is that BL2 was supposed to be the flagship game for NV this fall but it ended up very CPU-limited: a modern 2012 FPS game that scales almost linearly with CPU clock speed, especially on the Intel Core i side, and only uses 2 cores. In fact, on the GPU side you can get 60 fps on a $180 GTX560Ti, and on the Kepler side a $230 GTX660 with PhysX at 1080p is even faster. In other words, in 3-6 months from now when more GPU-intensive games launch, no one is going to care how fast a GTX680 ran BL2 because a $230 GTX660 mid-range card could do it without any trouble. ^_^ Most people don't care if a game runs at 80 vs. 70 fps, but it matters a lot more when one GPU chugs at 40 vs. 50 fps in a GPU-intensive game. That's where you really need the extra grunt.

Really, it's games like these that make people delay GPU upgrades since they can easily play them on much older and slower videocards (which is great since it saves them $). It's not NV's fault of course (and it doesn't make BL2 a bad game in any way!) but the fact that the graphics have hardly improved from BL1, a 2009 game, and that you can play this game rather easily on a GTX560Ti is a sad state of affairs due to the 7-year-old current console generation cycle.

I am still waiting for another new game to join the list of Crysis 1 / Warhead, BF3, Witcher 2 EE and Metro 2033 and help to elevate PC graphics to another level. BL2, while a highly rated/great game, is not one of those games that is getting us closer to next generation PC graphics.
 
Last edited:

railven

Diamond Member
Mar 25, 2010
6,604
561
126
It's expected that a 336 SP GTX460 should be faster for PhysX than a quad-core i7 CPU. That's not the real story here.

But you said this in another thread:

Not even close. It appears that a modern Intel CPU can handle the entire PhysX of BL2 with a Radeon HD7970 dedicated to graphics. You get the same performance as having a GTX680 doing graphics + PhysX.

The review was wrong and you've yet to address your mistake. As someone pushing cards, you need to address false information when found.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Yeah, the review has people actually thinking that the PhysX level is set to High (it's really locked to the LOW setting, which is the CPU default for BL2) and that the CPU running it alongside a 7970 is just as fast as a GTX680 doing everything (rendering and PhysX) set to high.

Turns out, and anybody please correct me if I am mistaken, that the GTX680 doing rendering and PhysX set to HIGH was actually equal to or outperforming a 7970 with the CPU running PhysX set to LOW.

Do I have this right? Because those doing the .ini hack as Railven suggested are reporting tanking fps when PhysX is set to high and the CPU has to run it.

And LOL at "The real story".