Borderlands 2 GPU/CPU benchmarks [TechSpot/HardOCP/Others]


railven

Diamond Member
Mar 25, 2010
6,604
561
126
I can tell you haven't really looked into this subject. Since I've been working on getting it to run on my end, I can answer some of these questions for you:

1. If PhysX was as impressive and demanding as NV marketed it, a CPU shouldn't have been able to run it;

A CPU can't run it fully. You keep making this broad, general statement. A CPU can only run the flags and debris at acceptable frame rates (but... see below).

2. If the CPU can run PhysX without an NV GPU (based on the benchmarks linked in this thread), then NV probably asked the developer to block this feature so gamers who want PhysX think that they must go out and buy an NV GPU.

Not all users even had to modify their *.ini files. I am one of them. Before I even reinstalled my second GPU for offloading, I was able to set PhysX to High WITHOUT MODIFICATION. That's how I realized how devastating it is to a non-GeForce system. You can find posters on other forums stating they didn't have to modify the *.ini to get the option to change the setting, but then you'll see them asking why offloading isn't working properly.
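
For anyone who wants to poke at this themselves, the tweak being passed around is a one-line edit (going from memory here, so double-check the exact location on your end - the config should live under Documents\My Games\Borderlands 2\WillowGame\Config\):

; in WillowEngine.ini - search for the PhysXLevel line
; 0 = Low, 1 = Medium, 2 = High
PhysXLevel=2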

Anyways, here are the PhysX Low vs. High screenshots.

PhysX Low is basically no physics.

None of those screenshots are even showing fluid, which is part of the Medium/High settings, and that's the feature that costs the most performance. Gearbox even addressed this and said a patch will fix the fluid performance issue.

And now you can see why a Radeon + modern CPU can run PhysX on Low: there is NOTHING to run. Set it to Medium and fluid/debris/cloth will start to bog the system down.

And you claim you aren't turning this into an anti-GeForce subject? As a Pro-AMD user, you sure aren't convincing me.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
Hey rail, which character did you pick for your first playthrough? I have yet to get my hands on it (I have yet to receive it, actually :( ) and am playing TL2 in the meantime.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Okay, so I went through the trouble of taking some screenshots during ACTUAL gameplay, not staring at objects or just shooting a wall. I also disabled PhysX offloading, so these screenshots are all Radeon 7970 (1125/1575) + Intel i5 2500K @ 4.4 GHz.

PhysX Low:
[image: jrqq1.jpg]

100+ FPS, no blood, no debris, no flags.

PhysX Medium:
[image: uVDL0.jpg]

70+ FPS, visible blood, debris, flags

PhysX High:
[image: NxnbY.jpg]

50+ FPS, even more blood + chunks, debris, flags

Note: this isn't even a heavy action scene. This is just me walking to the exact same spot where I took my other screenshots, killing the first mob(s) I could, and taking a screenshot before I got killed.

Now, imagine more dead bodies, more debris, more little pools of blood, and you can sort of get an idea of how crippling it can get.

The first major boss sits in a tank and fires his cannon at you, which creates a lot of debris. I mean - A LOT OF DEBRIS.

@Jaydip:
I'm playing a Commando. I sort of hate that they got rid of the healing tree, since that's how my GF and I played co-op in part 1. Otherwise, we're both enjoying the game.
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
Seems like crippling a CPU is a really great way to showcase how good your video card is, if it can handle the CPU-crippling load with ease.

Seems like a smart move for NVidia to allow everyone to run PhysX, but those without monster CPUs will truly understand how they need to go buy an NVidia card ASAP.

But is it a specific feature of NVidia cards that lets them do PhysX so well, or is it generally something that is handled well by any GPU, but cripples current CPUs?
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
Seems like crippling a CPU is a really great way to showcase how good your video card is, if it can handle the CPU-crippling load with ease.

Seems like a smart move for NVidia to allow everyone to run PhysX, but those without monster CPUs will truly understand how they need to go buy an NVidia card ASAP.

But is it a specific feature of NVidia cards that lets them do PhysX so well, or is it generally something that is handled well by any GPU, but cripples current CPUs?

I have no idea how well AMD GPUs would handle PhysX, but the HD 7900s have a lot more general compute power than the 670 or 680.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
I have no idea how well AMD GPUs would handle PhysX, but the HD 7900s have a lot more general compute power than the 670 or 680.

That's because the 680/670 chip (GK104) was originally intended to be a midrange part.

The big Kepler chip, GK110, is apparently a real compute monster on paper, but it goes to pro graphics and HPC first and may never reach GeForce at all.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Yeah, the review has people actually thinking that the PhysX level is set to High (it's really locked to the LOW setting, which is the CPU default for BL2) and that the CPU + 7970 is running it just as fast as a GTX 680 doing everything (rendering and PhysX) set to High.

Turns out, and anybody please correct me if I am mistaken, that the GTX 680 with rendering and PhysX set to HIGH was actually equal to or outperforming a 7970 with the CPU running PhysX set to LOW.

Do I have this right? Because those doing the .ini hack Railven suggested are reporting tanking FPS when PhysX is set to High and the CPU has to run it.

And LOL at "The real story".

No... the REAL story here is that even though PhysX has been out since 2006, the people who oppose PhysX cannot tell the difference between PhysX Low (CPU) and PhysX High (GPU)... but they still think they have a valid point.

They have had 6 years... and they still fail... on an EPIC scale.

Why would anyone listen to them?
It's obvious now that their stance is one of ignorance... not facts.

Kinda like the public rep PhysX has gotten due to these ignorant people... makes you wonder if they can actually tell MSAA from FXAA, or whether that too is an example of ignorance (like using a still photo to judge AA)... I am entertained and saddened at the same time.

This thread has delivered! :cool:
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
PhysX was supposed to enhance realism in games. PhysX in this game has not at all improved on previous implementations of this technology on NV's part. This is why this game runs well on a GTX660Ti + PhysX: there is hardly any worthwhile PhysX in it. So again, it didn't live up to the hype graphically, and specifically the PhysX is nothing special vs. the hype that was built up for how amazing BL2 would be due to PhysX. It's like Mafia 2 all over again. Had the massive NV marketing hype not preceded this game's launch, it would have been fine. NV overhyped the PhysX, but where are these revolutionary enhancements? Besides exaggerated pebbles/rocks/flying sparks and cloth physics, there isn't any new revolutionary physics implementation here.

  • Sparks and debris being affected by forces such as explosions, in a physically accurate Newtonian manner.
  • Particles colliding with surfaces and rebounding instead of clipping through.

In which game(s) have you seen anything of the above?
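
(To make it concrete, here's the rebound part as a toy C++ sketch - hypothetical code of mine, nothing from the actual PhysX SDK: Newtonian integration for one debris particle, then a bounce off a surface instead of clipping through it.)

struct Vec3 { float x, y, z; };

// One simulation step for a single debris particle:
// Newtonian integration under gravity, then a rebound off
// the ground plane y = 0 instead of clipping through it.
void stepParticle(Vec3& pos, Vec3& vel, float dt) {
    const float g = -9.81f;          // gravity, m/s^2
    const float restitution = 0.4f;  // energy kept per bounce

    vel.y += g * dt;                 // a = F/m with unit mass
    pos.x += vel.x * dt;
    pos.y += vel.y * dt;
    pos.z += vel.z * dt;

    if (pos.y < 0.0f) {              // hit the surface:
        pos.y = -pos.y * restitution;   // push back out
        vel.y = -vel.y * restitution;   // reflect and lose energy
    }
}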

Doesn't this mean gamers can get away with PhysX Low/Medium without having to spend any $ on an NV GPU for PhysX? As Silverforce11 noted, this is great! NV should just drop the marketing gimmick and make PhysX work on the CPU. It'll still run faster on NV GPUs but I feel a lot more developers would be using PhysX if it ran on a CPU.

PhysX already runs on the CPU.

But guess who the biggest consumers of GPGPU are? That's right - physicists.
Physical models and the calculations involved are by their very nature massively parallel, i.e. CPU physics is inferior to begin with.
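
(Toy sketch again, reusing my hypothetical stepParticle from above: each particle's update touches only its own state, so the loop parallelizes trivially - a GPU runs one thread per particle, while a CPU gets a handful of cores.)

void stepAll(Vec3* pos, Vec3* vel, int n, float dt) {
    // Every iteration is independent - no particle reads
    // another particle's state.
    #pragma omp parallel for   // CPU best case: a few cores
    for (int i = 0; i < n; ++i)
        stepParticle(pos[i], vel[i], dt);
    // On a GPU the same loop body becomes a kernel launched
    // across n threads at once - which is why GPGPU wins here.
}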

Your points make my head spin:
  • PhysX is a marketing gimmick and Nvidia failed in BL2, even though everyone went nuts over BL2 PhysX, myself included, writing walls of text here about how AMD users are just fine.
  • NV purposely hobbles CPU PhysX so it can undercut AMD, even though it runs well enough on the CPU. WOOT?
 
Last edited:

Borealis7

Platinum Member
Oct 19, 2006
2,901
205
106
Now that BL2 runs great on my aging E8400/5850 (both overclocked), I have no reason to upgrade my computer for at least 6 months, until Haswell is released.

I set all details on High, 16x AF, turned off AO and DoF, 1920x1080, MLAA (set in CCC), frames capped at 120, and I usually get around 80.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Now that BL2 runs great on my aging E8400/5850 (both overclocked), I have no reason to upgrade my computer for at least 6 months, until Haswell is released.

I set all details on High, 16x AF, turned off AO and DoF, 1920x1080, MLAA (set in CCC), frames capped at 120, and I usually get around 80.
According to benchmarks, neither your CPU nor your GPU is capable of that at 1920, unless AO and DoF are really that demanding. I still see no possible way you can do that with an E8400 at 3.6 GHz, according to the CPU benchmarks from both sites that tested the game.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I'd like to see an E8400 averaging 80 FPS too. Come on, let's see it :)
I just tested, and there is no way he is getting that kind of performance in most of the game. I tested in a very undemanding part of the game using the settings he claims, and it's not possible with his CPU, because even two cores of my 2500K at 4.4 GHz were only getting around 80-85 FPS doing nothing but walking around. As soon as stuff happened, it was in the 70s. 100 FPS was not even possible except in closed-in places. My CPU is almost 50% faster than his, and my GPU was not the bottleneck, because I even lowered the resolution to test it.
 
Last edited:

SPBHM

Diamond Member
Sep 12, 2012
5,066
418
126
Now that BL2 runs great on my aging E8400/5850 (both overclocked), I have no reason to upgrade my computer for at least 6 months, until Haswell is released.

I set all details on High, 16x AF, turned off AO and DoF, 1920x1080, MLAA (set in CCC), frames capped at 120, and I usually get around 80.

Impressive, a lot better than on my i3 it seems.
Can you take a screenshot in the same place?
[image: jt2hxd.png]

I left everything on High (with view distance on Ultra and foliage on Far), disabled FXAA, AO, and DoF, FOV 90.
Also, I'm using a much lower res (720p) because of the slower video card, but you can see the GPU usage is far from full, meaning it's probably the i3 2100 at its limit?
 

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
This game looks just like, uh, Borderlands 1, except now with even steeper hardware requirements. I wonder if anybody realizes we are being played for fools by the GPU and gaming industry. Pay more $ to get the same eye candy, baby!
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
This game looks just like, uh, Borderlands 1, except now with even steeper hardware requirements. I wonder if anybody realizes we are being played for fools by the GPU and gaming industry. Pay more $ to get the same eye candy, baby!
I think the game looks better; some settings are of higher quality, and some are also on by default this time. But yeah, it must just be a big conspiracy by the GPU and game makers...
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
Your points make my head spin:
  • PhysX is a marketing gimmick and Nvidia failed in BL2, even though everyone went nuts over BL2 PhysX, myself included, writing walls of text here about how AMD users are just fine.
  • NV purposely hobbles CPU PhysX so it can undercut AMD, even though it runs well enough on the CPU. WOOT?

He (RS) has created his own unique hyperbole and keeps evolving and repeating it days after the game went live. Can't take it seriously.

I can't understand the reasoning behind his GPU cost/value assessments in regard to game quality and caliber of play.
 

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
Impressive, a lot better than on my i3 it seems.
Can you take a screenshot in the same place?
[image]

I left everything on High (with view distance on Ultra and foliage on Far), disabled FXAA, AO, and DoF, FOV 90.
Also, I'm using a much lower res (720p) because of the slower video card, but you can see the GPU usage is far from full, meaning it's probably the i3 2100 at its limit?

OMG the jaggies... D: (sure, graphics aren't everything, but the black borders are really killing it IMO... just try FXAA :) ). Anyway, that spot is a killer for me too, 26-28 FPS... GPU usage at 30-35% on my GTX670 (all settings maxed at 1080p). I'm quite sure it's the CPU (Q9450 @ 3.2 GHz).

The rest of the game it's 50-60 FPS :) I upgraded from a Toxic HD5850 @ 850 MHz - I won't be upgrading my CPU until Haswell, as most of the game runs brilliantly and it's only the very occasional vista that slows down.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
You can disable the black outline and use SMAA. No noticeable jaggies then, and even better performance.
 

brandon888

Senior member
Jun 28, 2012
537
0
0
I sometimes get dips to 40 FPS on my i7 3770 and 670 ;/ and sure enough, with low GPU usage, like 40-50% ;/ This game is CPU demanding... sad... In Crysis 2 or BF3 I get 95-99% GPU usage with my CPU...
 
Last edited:

SPBHM

Diamond Member
Sep 12, 2012
5,066
418
126
These are not the settings I'm using to play; I just disabled a few things (to make it more comparable to what the E8400 user described, apart from AA and res).

I'm playing with the in-game settings on "max" (PhysX Low) at 1280x1024, and it looks good on my screen.
Anyway, if you want to see the performance in the area of the screenshots with these settings:
http://www.youtube.com/watch?v=WWjOgWnt5W8

at the 2:00 mark (left video, with PhysX on Low)
 
Last edited:

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
I ran some benchmarks in the Southern Shelf region. I ran to the first outpost on the way to the big cannon fight, started Fraps, and cleared the outpost; then I ran to the second outpost, cleared that, and immediately afterwards engaged the cannon and those two guys. I always shot all the barrels, used grenades, and opened the toilet where the goo comes out.

1920x1080, 4xSGSSAA via C1-bits
GTX580 SLI, i7-2600K@4GHz

[image: VcxKC.png]


The CPU has no chance; it's unplayable in heavy fights. A single 580 is not enough for SGSSAA; with SLI it's okay, but there are FPS dips in the fights when many PhysX effects are present.

Next I will benchmark at 1920x1080 without SGSSAA, as that was a little unfair to the single 580.

1080p + SMAA:

[image: Fpj5S.png]


The CPU gets killed, plain and simple. It doesn't really matter if you're using SLI or a single card with the other dedicated to PhysX; a CPU bottleneck, I presume.
I also benchmarked a single 580 that has to do both rendering and PhysX.
 
Last edited:

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
He (RS) has created his own unique hyperbole and keeps evolving and repeating it days after the game went live. Can't take it seriously.

I can't understand the reasoning behind his GPU cost/value assessments in regard to game quality and caliber of play.

I find this extremely humorous. One year ago RS was constantly attacked by the ATI fans (I'll be completely honest, I had some arguments with him), and now all the nvidia fans have animosity toward him and outright attack him. /popcorn

Me? I'm just glad I realized that everyone (ATI, nvidia) can happily co-exist in the market and you don't need to have a completely polarized view. Video card wars are a big waste of energy o_O
 
Last edited: