Borderlands 2 benchmarks (& Physx)


Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Have any of you tried overclocking your processors? I see several of you have low-clocked SB systems; PhysX requires CPU cycles even if you're offloading the workload to a GPU.

PhysX uses SSE2 on x86 processors; many consider it intentionally crippled because of the old instruction set it uses on the CPU.

Can you document this?
How many apps today still use SSE2 and not, e.g., AVX?
You don't know anything about programming, do you?
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Yeah, it depends on gaming style and how many players/elements there are. I currently play multiplayer with the assassin, melee a lot, and get right in the midst of the action. There is hardly a map or big fight without a major slowdown, even on a 690. Every time the fragments are flying and PhysX pools of goo are forming and exploding, the FPS tanks (not just from one explosion, but from a bunch of action).

When I sit back and snipe and don't get quite as close or with less players it's pretty smooth sailing.

And I agree, PhysX adds a nice effect. I too use it on High; I've just been disappointed that the fastest GPU doesn't suffice and I get slowdowns.

So you still haven't played Crysis, right?

Because it still doesn't reach 60+ FPS...

Damn, that GTX 690 is a disappointment... can't run a 2008 game at max settings at 60+ FPS... :whistle:

Your expectations need recalibrating to the real world.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Can you document this?
How many apps today still use SSE2 and not, e.g., AVX?
You don't know anything about programming, do you?

No, I can't document anything, only state what others think. It is not my opinion, if that is what you thought. PhysX is old and clunky; that is why they've revamped it and eliminated much of the old code in 3.0, and BL2 is still using 2.x for PhysX. So in a sense you could make a case for it being "unoptimized", since there is a better version; however, you wouldn't get very far with that, imo.

You're lashing out pretty hard; perhaps it's time to take a break. I know what it's like to have people who can't grasp simple concepts come at you in droves. It's not worth the stress. Let them continue to think as they wish; it won't make a bit of difference in the end whether they agree with you or not.
 
Last edited:

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Ok troll, what did I fail to prove?

That there is a problem with anything but your self-invented "bar".

The only point I have made and asserted with proof is that the GTX 690 is incapable of handling PhysX in Borderlands 2. Some suggested their 670/680s handled it fine, so I bought a GTX 670 PE to test whether it was some obscure SLI bug, which it wasn't.

You seem to fail in understanding how each piece of the "puzzle" works and interacts:
SLI
PhysX
CUDA
CPU
GPU
DirectX

This then turns into "I AM DISAPPOINT!!!"

But not because of facts and insight... quite the opposite.

This = fail.
Yeah, perhaps whether it was single-core vs. multi-core or x86 vs. x87, and some other details, have been lost or were incorrect. Just because it uses SSE etc. doesn't mean it's optimized.

Yes, you fail yet again.
Where is the data to support the assertion that PhysX is unoptimized?
All the benchmarks in this thread do nothing in that regard.
No evidence supporting this claim has been presented.

And it wasn't "lost".
It's the same old story... FUD being used as "arguments".

If a 690 can't render PhysX, then how optimized can it be?

There is no correlation in your "comparison".
A GTX 690 can render PhysX.
Where do you get the assertion that a GTX 690 (since you won't answer about the GTX 670?) should run Borderlands 2 at max settings at 60+ FPS?

Who cares if it technically uses certain instructions if it can't even be run on a GTX 690?

People who want the full picture and who don't, like you, miss most of the "pieces" of the "puzzle" and thus fail to understand the "image"... and so make claims based on ignorance, not knowledge or fact.


/end of pointless discussion. People will read this thread, glance at the facts, ignore trolling and make up their own minds.

Why should they ignore your posts?
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
No, I can't document anything, only state what others think. It is not my opinion, if that is what you thought. PhysX is old and clunky; that is why they've revamped it and eliminated much of the old code in 3.0, and BL2 is still using 2.x for PhysX. So in a sense you could make a case for it being "unoptimized", since there is a better version; however, you wouldn't get very far with that, imo.

Paradox detected...

You're lashing out pretty hard, perhaps it's time to take a break I know what it's like to have people who can't grasp simple concepts come at you in droves. It's not worth the stress, let them continue to think as they wish it won't make a bit of difference in the end if they agree with you or not.

FUD needs to be slammed hard... it doesn't belong in a technical forum.
 

The_Golden_Man

Senior member
Apr 7, 2012
816
1
0
The troll wants proof there are problems with PhysX optimization in Borderlands 2? Well, this is from the comments on the Borderlands 2 Tweak Guide.

I believe Andrew Burnes wrote that guide, and he works for Nvidia.

Gutterhulk • 18 days ago
What about the performance issues with PhysX on the 6xx series, mainly the GTX 680? Is there any estimated time on a patch for the game and/or new drivers?

Andrew Burnes MOD • 17 days ago
There is a performance-enhancing patch in the works from the fine folks at Gearbox.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
The troll wants proof there are problems with PhysX optimization in Borderlands 2? Well, this is from the comments on the Borderlands 2 Tweak Guide.

I believe Andrew Burnes wrote that guide, and he works for Nvidia.

Gutterhulk • 18 days ago
What about the performance issues with PhysX on the 6xx series, mainly the GTX 680? Is there any estimated time on a patch for the game and/or new drivers?

Andrew Burnes MOD • 17 days ago
There is a performance-enhancing patch in the works from the fine folks at Gearbox.

That is nothing that hasn't been normal practice since the PPU days:
http://www.anandtech.com/show/2009/5

Or since GPU driver updates.

That link doesn't support your claim the way you think it does.

Lon, you're done in this thread
-ViRGE
 
Last edited by a moderator:

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Paradox detected...



FUD needs to be slammed hard... it doesn't belong in a technical forum.


It's not a paradox, it's a simple fact: 2.x contains a lot of legacy code, which has been removed and improved in 3.0.

The stretch is: does Windows 7 become unoptimized because Windows 8 is faster? Depending on your point of view, the answer can be either yes or no; neither is wrong, imo.

FUD, that's funny.
 

The_Golden_Man

Senior member
Apr 7, 2012
816
1
0
That is nothing that hasn't been normal practice since the PPU days:
http://www.anandtech.com/show/2009/5

Or since GPU driver updates.

That link doesn't support your claim the way you think it does.

Give it a rest. The link I posted proves that an Nvidia employee, the one who wrote the Borderlands 2 Tweak Guide, is aware of the optimization problems with PhysX in Borderlands 2, that he has been in contact with the game developer, and that they have stated they are working on a performance patch to solve them.

What more proof do you want? Honestly, I don't care. Go troll elsewhere.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
Give it a rest. The link I posted proves that an Nvidia employee, the one who wrote the Borderlands 2 Tweak Guide, is aware of the optimization problems with PhysX in Borderlands 2, that he has been in contact with the game developer, and that they have stated they are working on a performance patch to solve them.

What more proof do you want? Honestly, I don't care. Go troll elsewhere.

I second this.

I don't claim to know PhysX internals; I am merely relaying my experiences to the world.
The troll comes in and tries to twist everything as if I care about PhysX's internal workings, claims PhysX is optimized, and argues to the death. It sounds like he's defending PhysX as if he coded it himself and has to convince the world how optimized it is. NVIDIA themselves are working on a patch and admit there is a performance issue.
 
Last edited:

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Honestly, I don't know and I don't care. I'll stick to what was stated in my previous post. Why not ask Nvidia or Gearbox yourself?

Well, I'm not the one connecting the dots between a Gearbox performance update and unoptimized PhysX performance.


Before I went and blamed PhysX, I'd ensure my CPU was actually capable of feeding my GPU and PhysX processor during multiplayer gameplay, which I have yet to see anyone do in this entire thread.


Edit: I'd like to add that it could very well be that a GTX 680 cannot handle High PhysX in an MP game. That does not justify the jump from "the 680 isn't fast enough" to "PhysX is unoptimized". What it means is that there is too much workload for the 680 to handle, nothing more.

After you've made that determination you could go other places, but until you can produce actual proof that the root cause is unoptimized PhysX (good luck), there is no reason to go there. A better conclusion would be that nVidia implemented the feature poorly, without considering the ramifications of MP workloads compared to SP.
 
Last edited:

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
Well, I'm not the one connecting the dots between a Gearbox performance update and unoptimized PhysX performance.


Before I went and blamed PhysX, I'd ensure my CPU was actually capable of feeding my GPU and PhysX processor during multiplayer gameplay, which I have yet to see anyone do in this entire thread.

I have the 670 installed atm. I'll update when I get some new benchmarks with dual GPUs and an overclocked 980 @ 4.2 GHz.
 
Last edited:

The_Golden_Man

Senior member
Apr 7, 2012
816
1
0
Well, I'm not the one connecting the dots between a Gearbox performance update and unoptimized PhysX performance.


Before I went and blamed PhysX, I'd ensure my CPU was actually capable of feeding my GPU and PhysX processor during multiplayer gameplay, which I have yet to see anyone do in this entire thread.

Did the Nvidia guy who wrote the Borderlands 2 Tweak Guide say anything about a 4.4GHz 2600K not being enough?

I quote what I posted earlier; more proof than this is not available at this point:

Gutterhulk • 18 days ago
What about the performance issues with PhysX on the 6xx series, mainly the GTX 680? Is there any estimated time on a patch for the game and/or new drivers?

Andrew Burnes MOD • 17 days ago
There is a performance-enhancing patch in the works from the fine folks at Gearbox.
 
Last edited:

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Did the Nvidia guy who wrote the Borderlands 2 Tweak Guide say anything about a 4.4GHz 2600K not being enough?

I quote what I posted earlier; more proof than this is not available at this point:

Gutterhulk • 18 days ago
What about the performance issues with PhysX on the 6xx series, mainly the GTX 680? Is there any estimated time on a patch for the game and/or new drivers?

Andrew Burnes MOD • 17 days ago
There is a performance-enhancing patch in the works from the fine folks at Gearbox.


No, I said it.

nVidia implemented PhysX in the game; Gearbox didn't put a single line of PhysX code into their entire game.

Your "proof" seems to have some pretty big holes as far as any indication that PhysX performance increases would result directly from Gearbox performance enhancements.


What could happen is that Gearbox has some unoptimized code that is stalling processors in MP, causing GPUs not to get fed as well as they could be. The indirect result, of course, is that freed CPU cycles can be used to produce more FPS by allowing your GPU and PPU to do more work. That is just a guess; I offer no proof, only, as said, a guess.
 

The_Golden_Man

Senior member
Apr 7, 2012
816
1
0
No, I said it.

nVidia implemented PhysX in the game; Gearbox didn't put a single line of PhysX code into their entire game.

Your "proof" seems to have some pretty big holes as far as any indication that PhysX performance increases would result directly from Gearbox performance enhancements.


What could happen is that Gearbox has some unoptimized code that is stalling processors in MP, causing GPUs not to get fed as well as they could be. The indirect result, of course, is that freed CPU cycles can be used to produce more FPS by allowing your GPU and PPU to do more work. That is just a guess; I offer no proof, only, as said, a guess.

I run a 2600K at 4.4GHz. I've seen people with lower and higher clocks have the same frame dips. That is not the problem.

Could be like you're saying here:
What could happen is that Gearbox has some unoptimized code that is stalling processors in MP, causing GPUs not to get fed as well as they could be.

Only, it does happen in singleplayer too.

But who cares? We must stick to what has been said; namely, Gearbox is working to sort out the PhysX performance. I don't care what causes it or who fixes it; I only care that it gets fixed somehow. And again, I stick to what the Nvidia employee said. What else can we do? We can just wait and see...
 
Last edited:

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
They get the exact same FPS with the exact same hardware?

You're missing a key part here: how is Gearbox fixing PhysX performance when they never implemented it in the first place? I never saw anything about PhysX in his reply, only that Gearbox is working on a performance increase, which would indicate it's on their side of the code, meaning not PhysX.

Same FPS dips caused by the CPU or GPU? You still haven't figured that out; this thread will not move forward until you can produce proof as to whether it's CPU- or GPU-limited during those dips.
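The check being demanded here can at least be sketched. A minimal, hypothetical example of how one might label logged dips, assuming per-sample FPS plus GPU and CPU utilization have been recorded with a monitoring overlay; the field names and thresholds below are illustrative assumptions, not from any real tool:

```python
# Hypothetical sketch: label FPS dips as CPU- or GPU-limited from
# utilization samples logged during play. Thresholds are illustrative.

def classify_dip(fps, gpu_util, cpu_util, fps_floor=60):
    """Label one logged sample; returns None when there is no dip."""
    if fps >= fps_floor:
        return None                 # no dip, nothing to explain
    if gpu_util >= 95:
        return "gpu-limited"        # GPU saturated: a faster GPU would help
    if cpu_util >= 90:
        return "cpu-limited"        # GPU starved: the CPU can't feed it
    # Neither is pegged, so suspect the software itself. Note that an
    # average CPU figure can hide one maxed-out core on a multi-core chip.
    return "other (engine stall?)"

samples = [
    {"fps": 42, "gpu": 98, "cpu": 55},
    {"fps": 38, "gpu": 60, "cpu": 95},
    {"fps": 75, "gpu": 80, "cpu": 50},
    {"fps": 35, "gpu": 55, "cpu": 48},
]
for s in samples:
    label = classify_dip(s["fps"], s["gpu"], s["cpu"])
    if label:
        print(f'{s["fps"]} fps -> {label}')
```

The interesting case for this thread is the last sample: low FPS with neither the GPU nor the CPU pegged, which is what would point past the hardware.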
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
I run a 2600K at 4.4GHz. I've seen people with lower and higher clocks have the same frame dips. That is not the problem.

Could be like you're saying here:
What could happen is that Gearbox has some unoptimized code that is stalling processors in MP, causing GPUs not to get fed as well as they could be.

But who cares? We must stick to what has been said; namely, Gearbox is working to sort out the PhysX performance. I don't care what causes it or who fixes it; I only care that it gets fixed somehow. And again, I stick to what the Nvidia employee said. What else can we do? We can just wait and see...

This is exactly the point as far as I'm concerned. Whether it's PhysX itself or Gearbox's implementation of PhysX, idk/c. Hopefully they will resolve the performance issues.
 
Last edited:

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
I hardly ever play single player, so I mostly refer to multiplayer, but others have said the same about single player.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
In SP it dips occasionally, but for the most part it's fine. Also, guys, please take a break; this thread has turned into something other than tech discussion. The only way to test whether PhysX is really optimized is to use a profiler, which I believe is beyond most gamers. That is why FPS data is presented to validate the point. It's not totally accurate, but it's far easier.
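The point about FPS data standing in for a profiler can be made concrete: frametime logs are the usual middle ground, since dips that vanish in an average still show up in a "1% low" figure. A small sketch (the frametime log is invented):

```python
# Sketch: deriving average FPS and a "1% low" figure from a frametime
# log in milliseconds per frame. The sample data is invented.

def summarize(frametimes_ms, pct=1.0):
    """Return (average FPS, FPS over the worst pct% of frames)."""
    n = len(frametimes_ms)
    avg_fps = 1000.0 * n / sum(frametimes_ms)
    k = max(1, int(n * pct / 100))
    worst = sorted(frametimes_ms, reverse=True)[:k]   # slowest frames
    low_fps = 1000.0 * k / sum(worst)
    return avg_fps, low_fps

# 98 smooth frames at ~60 FPS plus two heavy PhysX-style spikes:
log = [16.7] * 98 + [50.0, 60.0]
avg, low = summarize(log)
print(f"avg: {avg:.0f} fps, 1% low: {low:.0f} fps")
```

The two spike frames barely move the average but dominate the 1% low, which is exactly the pattern being described in this thread.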
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
What about the work nVidia did for PhysX 3.0 and above?

http://physxinfo.com/news/5671/physx-sdk-3-0-has-been-released/

SirPauly, is it too hard to post the parts of the link that show your point? I asked Lonbjerg to do this as well. I wasn't surprised when he did it; it's not like you, though.

So, after reading two articles and a lengthy dev post, I did find where it can be coded to run on multiple CPU cores. I don't see where they state that it doesn't use x87 instructions and uses SSE/SSE2, or anything else.

x87:
Results for 3000 fall: 19.562317
Results for 1000 stack: 12.707859
Results for 136 ragdolls: 10.471497
Results for 1000 convex: 18.448212
Results for prim-trimesh: 9.196541
Results for convex-trimesh: 16.437344
Results for raytests: 22.529740

/arch:SSE2 + USE_SSE
Results for 3000 fall: 17.373240
Results for 1000 stack: 11.402561
Results for 136 ragdolls: 9.099781
Results for 1000 convex: 14.487597
Results for prim-trimesh: 8.132652
Results for convex-trimesh: 13.579372
Results for raytests: 18.672155

This is from the link that Lonbjerg provided, I assume to show that SSE instructions are worthless. This is with Bullet Physics, so I'm not sure how applicable it is to PhysX; the author seems to think it is, though. That works out to roughly a 17% speedup on average, which is an appreciable optimization.
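The claimed average can be re-checked directly from the quoted runs; per test, the speedup here is the x87 time divided by the SSE2 time, minus one:

```python
# Re-checking the Bullet x87 vs. SSE2 timings quoted above (lower is
# better). This only restates the quoted data; no new measurements.
x87  = [19.562317, 12.707859, 10.471497, 18.448212,
        9.196541, 16.437344, 22.529740]
sse2 = [17.373240, 11.402561, 9.099781, 14.487597,
        8.132652, 13.579372, 18.672155]

speedups = [a / b - 1 for a, b in zip(x87, sse2)]
mean_speedup = sum(speedups) / len(speedups)
print(f"per-test speedup: {min(speedups):.1%} to {max(speedups):.1%}")
print(f"mean speedup:     {mean_speedup:.1%}")
```

By this tally the mean lands a shade over 17%, with the 1000-convex test the biggest winner at roughly 27%.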

Then this quote the author is attributing to nVidia:
The response of "most people write for console, port it to PC and it runs faster there so we don't look much more at it" is true in my experience. By NVIDIA's admission there's performance left on the floor but I doubt it's due to anything nefarious.

That is a perfect definition of unoptimized.

Then there are posts like THIS ("I have plenty of CPU leeway; the game is just buggy. Most CPU use I see is 50%") where CPU usage is reported as low but frames are still dropping pretty dramatically. This is also a sign of being unoptimized.

Just to be clear: next time either of you posts a link as your proof without referencing the part that proves your point, I will ignore it. I made a comment, one point was wrong, and I end up taking abuse rather than someone simply pointing it out. I'm also forced to defend the entire post when all I can find referenced to disagree with what I posted is multithreading for PC. It's an asinine way to deal with other community members.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
PhysX has had CPU multithreading since v2.

SSE2 has been supported since 2.8.4.

BL2 uses 2.8.4.something
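Taken together, those version claims boil down to a tuple comparison. A hypothetical helper for checking an SDK version string against the 2.8.4 threshold named above (the threshold is this post's claim, not independently verified against SDK release notes):

```python
# Sketch: gate on the SDK version claimed above (SSE2 from 2.8.4 on).
# The 2.8.4 threshold is the post's claim, not something verified here.

def parse(version):
    """First three numeric components of a dotted version string."""
    return tuple(int(p) for p in version.split(".")[:3])

def has_sse2(version):
    return parse(version) >= (2, 8, 4)

print(has_sse2("2.8.3"), has_sse2("2.8.4"), has_sse2("3.0.1"))
```

Note that `parse` deliberately ignores anything past the third component, so a build string like "2.8.4.x" compares the same as "2.8.4".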