Borderlands 2 benchmarks (& Physx)


The_Golden_Man

Senior member
Apr 7, 2012
816
1
0
Let's say, hypothetically, if I were to add a 650 Ti to my three 7970s and run Hybrid PhysX, would the framerate still be a constant 60 FPS with PhysX set to High? Would a 650 Ti keep up as a dedicated PhysX card?

You can be almost certain that the moment you set PhysX to Medium or High in Borderlands 2, FPS will plummet below 60 no matter what your specs are.

I cannot speak for AMD Hybrid PhysX setups, but everyone with a high-end/SLI Nvidia system has frame drops below 60 FPS in this game when PhysX is at Medium or High.

However, I would be very interested in hearing your findings in a setup like that.

The smartest thing is to wait and see if Gearbox will actually release the rumored performance patch.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
Golden Man, have you tried disabling SLI in the NV control panel and dedicating the 2nd card to PhysX? This will lower your overall max FPS in the game, but might steady the heavy PhysX moments, which are a known outcome of high PhysX because of all the added visual fidelity.
 

The_Golden_Man

Senior member
Apr 7, 2012
816
1
0
Golden Man, have you tried disabling SLI in the NV control panel and dedicating the 2nd card to PhysX? This will lower your overall max FPS in the game, but might steady the heavy PhysX moments, which are a known outcome of high PhysX because of all the added visual fidelity.

Already tried it. It got much worse, also when there was lots of PhysX activity going on.
 
Last edited:
Aug 30, 2012
73
0
0
Mine runs better with my 2nd 680 dedicated to PhysX, annoying really. I play with everything at max, 1200p, capped at 60, PhysX on High. No FXAA though; I use the SMAA injector for anti-aliasing.

Normally a solid 60fps with occasional drops to 45-50fps.
 
Last edited:

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I cannot speak for AMD Hybrid PhysX setups, but everyone with a high-end/SLI Nvidia system has frame drops below 60 FPS in this game when PhysX is at Medium or High.

I can confirm similar (if not worse) drops on AMD Hybrid rigs. HD 7970 + GTX 460, and during heavy PhysX scenes performance drops to ~45 FPS, and random spots of the game introduce a stuttering that makes the game unplayable (must be a bug due to the nature of Hybrid PhysX; the same scene has no issue on a GTX 680).

I believe they were supposed to address the PhysX issues through a patch or something.
 

The_Golden_Man

Senior member
Apr 7, 2012
816
1
0
Actually, with my current settings (everything maxed out at 1920x1200 except PhysX on Medium, Ambient Occlusion disabled and View Distance at High), it may be that I got better FPS by disabling SLI and dedicating my second GTX 670 to PhysX.

Pretty strange, as I thought I had tested this.

I tested in the Caustic Caverns with Vsync on. It is pretty common to see between 57 and 60 FPS when I have Vsync on in this game. I fought 4 or 5 Crystalisks plus some of those other badass creatures, also using my turret. I don't think it actually dropped under 55 FPS, if it even got that low. I tried to keep track of the on-screen FPS counter.

Maybe I should try using Fraps and benchmarking this.
 

Rvenger

Elite Member / Super Moderator / Video Cards
Apr 6, 2004
6,283
5
81
It must be some kind of hardware/driver issue, because with my single 670 my FPS stays at 60 FPS 90% of the time. I have everything maxed with AO enabled, view distance at Ultra High, and PhysX set to High.

I might have to test this through the Caustic Caverns. I am not sure if I am there yet.


Is SLI even scaling in this game at all? I am pretty sure it never worked in BL1.
 

The_Golden_Man

Senior member
Apr 7, 2012
816
1
0
It must be some kind of hardware/driver issue, because with my single 670 my FPS stays at 60 FPS 90% of the time. I have everything maxed with AO enabled, view distance at Ultra High, and PhysX set to High.

I might have to test this through the Caustic Caverns. I am not sure if I am there yet.


Is SLI even scaling in this game at all? I am pretty sure it never worked in BL1.

It's scaling a little bit. I get higher max FPS using SLI. But mostly it seems the game just distributes half the GPU usage between the cards; in SLI they are running at 30-60%. If the second card is used for PhysX in addition to SLI, you can add about 20% to that for the second card.

I will do some more testing using the second card dedicated to PhysX. However, I don't see a problem using it for both SLI and PhysX, since they are nowhere near max GPU utilization.

Ironically, the highest GPU load you get is in the game menu, when Vsync is disabled and there is no FPS cap. Both cards can get to 90%+ there.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
As I said in the other thread, you appear CPU limited at your settings/res.

If you were able to go higher res, you'd be back to GPU limited.
 

The_Golden_Man

Senior member
Apr 7, 2012
816
1
0
As I said in the other thread, you appear CPU limited at your settings/res.

If you were able to go higher res, you'd be back to GPU limited.

You're probably right. But it's strange that in the game menu I get almost full GPU utilization on both cards.

Anyway, the only thing I hope is that they fix this game's PhysX performance.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
You're probably right. But it's strange that in the game menu I get almost full GPU utilization on both cards.

Anyway, the only thing I hope is that they fix this game's PhysX performance.


Actually, that is to be expected. That's a very, very CPU-light scene. It's just the menu, so the CPU is napping, and the GPU just starts rendering as fast as it can. When you're playing, the CPU has to work, and the GPU spends time waiting on it.
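The menu-vs-gameplay behavior can be put in a tiny toy model: in a pipelined renderer each frame effectively costs max(CPU time, GPU time), so a near-idle CPU (the menu) leaves the GPU pegged, while heavy CPU-side work (gameplay with CPU-heavy PhysX) leaves the GPU waiting. All the millisecond numbers below are invented purely for illustration.

```python
def frame_stats(cpu_ms, gpu_ms):
    """Toy pipelined frame: the slower stage sets the pace."""
    frame_ms = max(cpu_ms, gpu_ms)
    fps = 1000.0 / frame_ms
    gpu_util = gpu_ms / frame_ms   # fraction of the frame the GPU is busy
    return fps, gpu_util

# Menu: almost no CPU work, GPU renders flat out -> GPU-bound, full utilization
print(frame_stats(cpu_ms=1.0, gpu_ms=8.0))    # high FPS, gpu_util = 1.0

# Gameplay with heavy CPU-side physics: GPU waits on the CPU -> CPU-bound
print(frame_stats(cpu_ms=22.0, gpu_ms=8.0))   # ~45 FPS, gpu_util ~ 0.36
```

This also matches the observation above that both cards hit 90%+ only in the menu: with the CPU out of the way, nothing throttles the GPUs.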
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
As I said in the other thread, you appear CPU limited at your settings/res.

If you were able to go higher res, you'd be back to GPU limited.


But the thing is... EVERYBODY is CPU limited in BL2.
Parts of the PhysX code in certain situations take too much CPU time, leaving the rendering and other threads to wait on PhysX.

Which part of that is Gearbox's fault and which is NV's, your guess is as good as mine.
I would not normally point the finger at the dev when dealing with PhysX, but it's obvious (DOF, AO) that Gearbox hasn't been paying attention to always keeping the (CPU) resources in check.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
Actually, with my current settings (everything maxed out at 1920x1200 except PhysX on Medium, Ambient Occlusion disabled and View Distance at High), it may be that I got better FPS by disabling SLI and dedicating my second GTX 670 to PhysX.

Pretty strange, as I thought I had tested this.

I tested in the Caustic Caverns with Vsync on. It is pretty common to see between 57 and 60 FPS when I have Vsync on in this game. I fought 4 or 5 Crystalisks plus some of those other badass creatures, also using my turret. I don't think it actually dropped under 55 FPS, if it even got that low. I tried to keep track of the on-screen FPS counter.

Maybe I should try using Fraps and benchmarking this.

Do it, Fraps is simple to use. Just install it, and when in-game press F11; then play, and press F11 again to stop. Remember to stop it before you exit the game or it won't save anything.
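Fraps' benchmark mode can also log per-frame timestamps to a CSV (the frametimes option). Here is a rough sketch of turning such a log into average and minimum FPS, assuming a two-column file of frame index and cumulative milliseconds; the exact filename and column layout may vary between Fraps versions.

```python
import csv

def fps_from_frametimes(path):
    """Compute (avg_fps, min_fps) from a Fraps-style frametimes CSV.

    Assumes rows of "frame_index, cumulative_ms"; a header row is skipped.
    """
    with open(path, newline="") as f:
        rows = [row for row in csv.reader(f) if row and row[0].strip().isdigit()]
    times = [float(r[1]) for r in rows]                 # cumulative ms
    deltas = [b - a for a, b in zip(times, times[1:])]  # per-frame ms
    avg_fps = 1000.0 * len(deltas) / (times[-1] - times[0])
    min_fps = 1000.0 / max(deltas)                      # worst single frame
    return avg_fps, min_fps
```

The minimum here is per-frame, so it will look harsher than the per-second minimum Fraps shows on screen; a single long frame drags it down.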

It must be some kind of hardware/driver issue, because with my single 670 my FPS stays at 60 FPS 90% of the time. I have everything maxed with AO enabled, view distance at Ultra High, and PhysX set to High.

I might have to test this through the Caustic Caverns. I am not sure if I am there yet.


Is SLI even scaling in this game at all? I am pretty sure it never worked in BL1.

It scales.
http://physxinfo.com/news/9653/borderlands-2-physx-benchmark-roundup/
[image: gamegpu_border.jpg]



At 2560x1600 with an IB @ 4.4GHz and two 680s, it appears to trade back and forth (assuming the SLI profile is decent).

With a single 680, it's firmly GPU limited.

Try running some benchmarks with Fraps.


But the thing is... EVERYBODY is CPU limited in BL2.
Parts of the PhysX code in certain situations take too much CPU time, leaving the rendering and other threads to wait on PhysX.

Which part of that is Gearbox's fault and which is NV's, your guess is as good as mine.
I would not normally point the finger at the dev when dealing with PhysX, but it's obvious (DOF, AO) that Gearbox hasn't been paying attention to always keeping the (CPU) resources in check.

Do you have a source on this? I've only read that it runs on the CPU if the GPU is not Nvidia. Granted, I don't think I have seen reliable comments from Gearbox or Nvidia on this yet.


Here's a run through the Bloodwing rescue boss area (the wildlife refuge). 3 players.
Lowered the foliage distance to Medium, and Ambient Occlusion off. All other settings maxed.
670 PE @ ~1230 core / 6500 memory
[image: Pqk9R.png]
 
Last edited:

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
Parts of PhysX being executed on the CPU is common knowledge. For example, on Low PhysX, collisions and ragdolls are done on the CPU regardless of NV/AMD.

But no, I don't have a source on any of that. Pretty much just a hunch. Me talking a bunch of stuff :)
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
Parts of PhysX being executed on the CPU is common knowledge. For example, on Low PhysX, collisions and ragdolls are done on the CPU regardless of NV/AMD.

But no, I don't have a source on any of that. Pretty much just a hunch. Me talking a bunch of stuff :)

I think it depends on the game, the game engine and the PhysX version.
Certain games need PhysX installed to run even though they are not using the GPU at all. NFS: Shift and, I think, the first Borderlands are examples of this. The game engine has some implementation needing the PhysX libraries, but the games play the same on Nvidia and AMD GPUs.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
I think it depends on the game, the game engine and the PhysX version.
Certain games need PhysX installed to run even though they are not using the GPU at all. NFS: Shift and, I think, the first Borderlands are examples of this. The game engine has some implementation needing the PhysX libraries, but the games play the same on Nvidia and AMD GPUs.

That is because it runs on the CPU, like in NFS: Shift, and that happens when it's simple physics.
It's when you crank up the fidelity that the GPU shines compared to the CPU.

Although PhysX got the blame for AMD's drivers in NFS: Shift... funny story ^^
 

The_Golden_Man

Senior member
Apr 7, 2012
816
1
0
Looking forward very much to Gearbox releasing a performance patch or Nvidia optimizing their drivers more for this game. Or a combination of both. Whatever keeps us from dipping below 60 FPS with PhysX on High.

After all, Claptrap wrote a love letter to PC gamers before Borderlands 2 was released :p They did something that should be in every PC game released for both consoles and PCs: FOV adjustment. In most games there is some kind of workaround or ini file you can edit, but it's very nice to have an in-game slider for it, like in Borderlands 2.

Playing with console FOV on a desktop PC, where I sit very close to the screen, makes me feel sick and dizzy after a while.

For Borderlands 1 the community made the Borderlands Config Editor, which could change the FOV amongst other things. Without it the game would have been unplayable for me...
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
It's really annoying that PhysX is so unoptimized. How many years ago was Ageia running it on a PPU that is weak by today's standards? It would be better for us if it wasn't single-threaded and didn't use ancient x86 instructions instead of something better suited to the CPU. Then even Nvidia owners could play games on a single GPU. Most computers have CPU power to spare when a program scales properly across a multi-core CPU.
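The complaint above is about single-threaded physics; work made of independent bodies can, in principle, be split across cores. Below is a minimal sketch of that idea, stepping a batch of independent particles in per-core chunks. The particle model and chunking are invented for illustration; real physics has contacts and dependencies between bodies that make parallelizing much harder.

```python
import os
from concurrent.futures import ProcessPoolExecutor

def step_chunk(args):
    """Advance one chunk of independent particles by dt (no interactions)."""
    positions, velocities, dt = args
    return [p + v * dt for p, v in zip(positions, velocities)]

def step_parallel(positions, velocities, dt, workers=os.cpu_count()):
    """Split the particle arrays into per-worker chunks and step them in parallel."""
    n = max(1, len(positions) // workers)
    chunks = [(positions[i:i + n], velocities[i:i + n], dt)
              for i in range(0, len(positions), n)]
    out = []
    with ProcessPoolExecutor(max_workers=workers) as pool:
        for part in pool.map(step_chunk, chunks):  # map preserves chunk order
            out.extend(part)
    return out
```

With truly independent work like this, the step scales roughly with core count; the moment bodies interact (collisions, joints), chunks must exchange data and the scaling degrades, which is part of why engines didn't get this for free.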
 

The_Golden_Man

Senior member
Apr 7, 2012
816
1
0
It's really annoying that PhysX is so unoptimized. How many years ago was Ageia running it on a PPU that is weak by today's standards? It would be better for us if it wasn't single-threaded and didn't use ancient x86 instructions instead of something better suited to the CPU. Then even Nvidia owners could play games on a single GPU. Most computers have CPU power to spare when a program scales properly across a multi-core CPU.

It almost seems like they don't care. As long as people buy hardware and all the unoptimized console ports they just don't care.

Look at Skyrim. It ran terribly even on high-end computers. A regular guy made a little file which optimized the game to take more advantage of modern CPUs, and voilà, the game performed much better! It gave up to 20 FPS more in some places in the game! Just because a file made the game take advantage of our CPUs, like it should have in the first place! Bethesda had not even tried in the slightest to optimize the game for PCs. Fair enough, they did optimize it in the end, but that was long after the community had fixed it.

Also, I don't think game developers can blame piracy these days. All the people I know use Steam. Steam is very nice and easy to use, and games are not that expensive. People who pirate cannot play online in most cases and have to get updates and patches manually, and find a crack for each patch. I don't think most people will bother with all that stuff anymore.

Also, Skyrim used console FOV (it could be fixed via ini tweaks, but why didn't Bethesda even care to adjust the FOV for the PC version, or make it an in-game setting? How long would that have taken them?!).

Game developers don't even bother to make menus with smaller fonts for the PC versions. Again, Skyrim is a fine example. It was also fixed by the community (SkyUI). How long would it have taken Bethesda's developer team to do this for the PC version, when one regular person can do it for free, in his spare time?!

As I've said, it seems they don't care as long as they sell their games.

With all that said, Gearbox has shown they care about us PC gamers in many ways. So I only hope they are working on a performance patch.
 
Last edited:

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
It's really annoying that PhysX is so unoptimized. How many years ago was Ageia running it on a PPU that is weak by today's standards? It would be better for us if it wasn't single-threaded and didn't use ancient x86 instructions instead of something better suited to the CPU. Then even Nvidia owners could play games on a single GPU. Most computers have CPU power to spare when a program scales properly across a multi-core CPU.

Trying to empty the ocean with a bucket, eh?
Why does Intel have the Intel Xeon Phi?

Answer: because some things are much better calculated on a GPU than a CPU.
The PPU was a chip that didn't resemble a CPU very much, but more a GPU.

BTW, you sound like something dug up from the past:
http://physxinfo.com/news/5671/physx-sdk-3-0-has-been-released/

That was over a year ago.

Combined with this:
http://beyond3d.com/showpost.php?p=1451158&postcount=136

More importantly, this response from NVIDIA seems completely reasonable to me and I definitely don't think they are intentionally hurting CPU performance to make the GPU look good. The response of "most people write for console, port it to PC and it runs faster there so we don't look much more at it" is true in my experience. By NVIDIA's admission there's performance left on the floor but I doubt it's due to anything nefarious. Rather, it's just an increasingly dated code-base and game developer apathy. Sounds like they've got a good handle on this for PhysX 3.0 though so that will be good to see.

Their response talked about:
http://arstechnica.com/gaming/2010/...ts-cpu-gaming-physics-library-to-spite-intel/

Could you move into 2012, so the debate is made up of facts, not outdated ignorance?

Oh, and you still haven't provided one shred of proof of this:

It's really annoying that PhysX is so unoptimized.

No arms...no cookie.

I find it annoying that AMD is still playing with its crown jewels when it comes to hardware physics... oh, and people living in the past and thinking that's a viable "argument"...
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Trying to empty the ocean with a bucket, eh?
Why does Intel have the Intel Xeon Phi?

Answer: because some things are much better calculated on a GPU than a CPU.
The PPU was a chip that didn't resemble a CPU very much, but more a GPU.

BTW, you sound like something dug up from the past:
http://physxinfo.com/news/5671/physx-sdk-3-0-has-been-released/

That was over a year ago.

Combined with this:
http://beyond3d.com/showpost.php?p=1451158&postcount=136




Their response talked about:
http://arstechnica.com/gaming/2010/...ts-cpu-gaming-physics-library-to-spite-intel/

Could you move into 2012, so the debate is made up of facts, not outdated ignorance?

Oh, and you still haven't provided one shred of proof of this:



No arms...no cookie.

I find it annoying that AMD is still playing with its crown jewels when it comes to hardware physics... oh, and people living in the past and thinking that's a viable "argument"...

Read both links. Sorry, but what's your point? The part you quoted states that it's optimized for consoles and ported to PC, and that the performance is faster so they leave it alone.
By NVIDIA's admission there's performance left on the floor

That's because it's not optimized for the PC.

BTW, you don't have to be so rude with your response. If you have a point, state it. If I've made a mistake, I want to know. I'm not an engineer nor a developer. I don't claim to be all knowing. I'm just here to interact, discuss, and learn. You should try talking to people like they are sitting in front of you. You are interacting with another human being.