Physx - Are you interested in it? Have your say! VOTE!


Physx - rate its importance, whether you care or not

  • Physx - what's that?

  • Physx - no thanks! (Unimpressed)

  • Physx - neutral

  • Physx - nice extra if price / performance lines up.

  • Physx - factors in the decision

  • Physx - must have! (Diehard fan)



NIGELG

Senior member
Nov 4, 2009
Lol....got to trust those Nvidia corporate guys.


They are the epitome of honesty.:p
 

SirPauly

Diamond Member
Apr 28, 2009
Hehe, the product manager of GPU PhysX content and a technical artist may have a clue what is going on!
 

desura

Diamond Member
Mar 22, 2013
For me, PhysX is kind of mandatory for any PC game nowadays.

Otherwise, games are just... textures painted on polygons. It really adds to the experience, and it's one big way that PC games go beyond being just higher-res versions of console games.
 

KingFatty

Diamond Member
Dec 29, 2010
For me, PhysX is kind of mandatory for any PC game nowadays.

Otherwise, games are just... textures painted on polygons. It really adds to the experience, and it's one big way that PC games go beyond being just higher-res versions of console games.

I agree that having a physics engine is good. But I guess the issue I have is are you saying you want a physics engine instead of no physics engine? Or are you saying that of all the physics engines that may be used, I need to have PhysX specifically?

I just wonder, what if the game used, say, Intel's Havok physics engine instead of PhysX. Would you be just as pleased?

My impression is that some people are arguing the pros/cons of whether to have a physics engine at all, and the additional effects. Then other people are arguing whether a physics engine should be proprietary or not. I guess it seems like two separate issues, and the poll doesn't seem to be able to capture that, or maybe it just conflates both answers and muddies the results?
 

Keysplayr

Elite Member
Jan 16, 2003
I agree that having a physics engine is good. But I guess the issue I have is are you saying you want a physics engine instead of no physics engine? Or are you saying that of all the physics engines that may be used, I need to have PhysX specifically?

I just wonder, what if the game used, say, Intel's Havok physics engine instead of PhysX. Would you be just as pleased?

My impression is that some people are arguing the pros/cons of whether to have a physics engine at all, and the additional effects. Then other people are arguing whether a physics engine should be proprietary or not. I guess it seems like two separate issues, and the poll doesn't seem to be able to capture that, or maybe it just conflates both answers and muddies the results?

The real question that should be asked is:

Do people want physics in their games? If the answer is yes, and it is, then let them use what they want.
 

desura

Diamond Member
Mar 22, 2013
I agree that having a physics engine is good. But I guess the issue I have is are you saying you want a physics engine instead of no physics engine? Or are you saying that of all the physics engines that may be used, I need to have PhysX specifically?

I just wonder, what if the game used, say, Intel's Havok physics engine instead of PhysX. Would you be just as pleased?

My impression is that some people are arguing the pros/cons of whether to have a physics engine at all, and the additional effects. Then other people are arguing whether a physics engine should be proprietary or not. I guess it seems like two separate issues, and the poll doesn't seem to be able to capture that, or maybe it just conflates both answers and muddies the results?

PhysX is more impressive than Havok across the board IMO, due to hardware acceleration as opposed to software.
 

Carfax83

Diamond Member
Nov 1, 2010
You are repeating nvidia's false dilemma fallacy... "either nvidia officially supports feature X, or they write special DRM code to prevent its use".
This is a really poor false dilemma for them to set up because there are countless things they do NOT officially support that they haven't bothered DRMing against; by using this false dilemma, they imply that anything they haven't specifically DRMed against is fully supported by them, a very bad position.

Honestly to me it doesn't matter. I have no personal stake in whether NVidia allows or prevents hybrid setups. I was just stating their purported reasoning.

I wish AMD would license CUDA and then none of that would be necessary.

However, I think it's better from a strictly business perspective for NVidia to disallow hybrid PhysX setups, because it can be used as leverage to influence people to abandon AMD and go all the way with NVidia seeing as it's the only physics middleware with GPU support.
 

Carfax83

Diamond Member
Nov 1, 2010
By increased adoption in the context of a game adding it?

Yes.

Also, NV sponsoring games != market acceptance. It's like saying gamers love AMD because the number of GE titles is increasing.

What does it matter? As long as the end result is an increase in the number of game developers using GPU PhysX, it's all good.
 

Carfax83

Diamond Member
Nov 1, 2010
I just found it to be a double standard as I mentioned in the example.
I wouldn't have said anything but reading a few pages back he was telling others to prove subjective statements.

Saying that NVidia purposely removes scripted animations from games and uses PhysX instead isn't exactly subjective.

In fact, I already disproved Bo Fox's assertion that Arkham Asylum had environmental fog in the Xbox 360 version but not in the PC version.
 

Carfax83

Diamond Member
Nov 1, 2010
Realism and immersion are not the same, or else every movie would have been made in the same way that Skyfall was made (crashing tons and tons of cars for the takes). The trend is in the opposite direction: cheating realism to create a different kind of immersion.
Some scenes, like the famed Battlefield 3 "blow up the whole side of a hotel" animation, will probably always be more impressive (and easier to pull off) as a scripted scene compared to a dynamic one, simply because the game relies on a very exact result of your actions to make the reaction of the accompanying NPCs more believable.

Yes, there will always be a need for scripted events in video games. However, that doesn't lessen the importance of real time physics.

Be careful with that statement; The Witcher 3 may still use Havok after all and only use PhysX for effects. Havok isn't just a physics engine; they can also supply AI, a scripting engine and other tools to complement your in-house engine development.

I was obviously referring to game physics when I made that comment :p
 

taltamir

Lifer
Mar 21, 2004
However, I think it's better from a strictly business perspective for NVidia to disallow hybrid PhysX setups, because it can be used as leverage to influence people to abandon AMD and go all the way with NVidia seeing as it's the only physics middleware with GPU support.

All it is doing is driving people away from PhysX. You are describing the result of leveraging a monopoly, but you must have an actual monopoly before you can leverage it to drive out competition. Doing it too early merely sabotages your attempt to form one.

The ideal strategy would have been to allow it for now, and when PhysX is 90% of the market, find an excuse to disallow it and watch AMD crash and burn.
 

Keysplayr

Elite Member
Jan 16, 2003
All it is doing is driving people away from PhysX. You are describing the result of leveraging a monopoly, but you must have an actual monopoly before you can leverage it to drive out competition. Doing it too early merely sabotages your attempt to form one.

The ideal strategy would have been to allow it for now, and when PhysX is 90% of the market, find an excuse to disallow it and watch AMD crash and burn.

And what do you think would be the result of that happening?
Would there be much happiness and merrymaking? Nah. I'm leaning more towards lawsuits.
 

taltamir

Lifer
Mar 21, 2004
And what do you think would be the result of that happening?
Would there be much happiness and merrymaking? Nah. I'm leaning more towards lawsuits.

The lawsuits would happen the day PhysX succeeded despite the handicap of an improperly executed monopoly-leveraging attempt.
Given the choice between an improperly executed monopoly-leveraging attempt, a properly executed one, and fair competition, I would personally prefer fair competition. But if you have to try to leverage a monopoly, at least don't be incompetent about it.
 

Carfax83

Diamond Member
Nov 1, 2010
Ok, it's ironic that people claiming effects were missing from PhysX games and only available via PhysX get called out for proof, but claiming BL2's PhysX problems are from (insert random likely false reason), claiming PhysX is gaining market acceptance, etc. requires no proof.

I'm done with this, but if people want to make stuff up, at least don't use double standards.

DSOgaming did an in-depth performance analysis on BL2:

Click here

For instance, take a look at the following screenshot. In that particular scene, our framerate was at low 50s in both our dual-core and our quad-core systems (max details with high view distance and low PhysX). Our GPU usage was at 60%, suggesting that there was a CPU limitation. We did not witness any significant improvements when we lowered all of our settings (but kept view distance at ‘High’). When we lowered the view distance to ‘medium’, our framerate increased to mid-50s and when we lowered it to ‘low’, our framerate jumped to 80s. This clearly shows that the ‘view distance’ setting is the most stressful one. It also shows that the game does not take advantage of four cores, as there weren’t any significant differences between a dual-core and a quad-core. Moreover, it also proves that there is an optimization issue here that will affect most PC systems, unless of course they can overcome those issues with their additional raw power. Ironically, it seems that the view distance is more of a CPU than a GPU setting, suggesting that Gearbox has offloaded the view distance setting to the CPU

Bolded select sentences. Those comments reflect my experiences with BL2. The game is poorly optimized for modern CPUs, and PhysX highlights that issue because the CPU is taxed even more for the extra amount of draw calls.

On my 4.5GHz 3930K, one core is almost always at 100% utilization, and another bounces between 75% and 100%. The rest of them show very little activity, and I'm sure that's from the OS and not the game.

So PhysX isn't the problem. It's the game :hmm:
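
As an aside, the kind of test DSOgaming ran boils down to a simple rule of thumb, sketched below. This is only an illustrative sketch: the 95% threshold and the function name are my own choices, not anything from their article or the game.

```python
# Rough bottleneck heuristic behind tests like DSOgaming's: if the GPU
# sits well below full utilization while the frame rate is capped, and
# lowering GPU-side settings doesn't help, the CPU is the limiter.
# The 95% threshold is an arbitrary illustrative choice.

def likely_bottleneck(gpu_usage_pct: float,
                      fps_improved_on_lower_gpu_settings: bool) -> str:
    if gpu_usage_pct >= 95.0:
        return "GPU-bound"
    if not fps_improved_on_lower_gpu_settings:
        return "CPU-bound"
    return "mixed / inconclusive"

# The quoted scene: ~60% GPU usage, no gain from lowering settings.
print(likely_bottleneck(60.0, False))  # prints "CPU-bound"
```

By that rule, the 60% GPU usage they measured points straight at the CPU, which matches the per-core load I'm seeing.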
 

taltamir

Lifer
Mar 21, 2004
Is it? There is more content and nVidia has clear discrete leadership!

1. No, it does not have clear discrete leadership. And the vast majority of the titles it supposedly does have are CPU physics, not GPU physics.
2. More content? Did you peer into an alternate dimension where Nvidia did not implement the DRM, or even openly supported PhysX on a secondary card, and observe fewer titles with GPU PhysX implemented in such dimensions than in our own?
 

Jaydip

Diamond Member
Mar 29, 2010
1. No, it does not have clear discrete leadership. And the vast majority of the titles it supposedly does have are CPU physics, not GPU physics.
2. More content? Did you peer into an alternate dimension where Nvidia did not implement the DRM, or even openly supported PhysX on a secondary card, and observe fewer titles with GPU PhysX implemented in such dimensions than in our own?

1. If having 65% market share isn't leadership, I don't know what is.
 

wand3r3r

Diamond Member
May 16, 2008
DSOgaming did an in-depth performance analysis on BL2:

Click here



Bolded select sentences. Those comments reflect my experiences with BL2. The game is poorly optimized for modern CPUs, and PhysX highlights that issue because the CPU is taxed even more for the extra amount of draw calls.

On my 4.5GHz 3930K, one core is almost always at 100% utilization, and another bounces between 75% and 100%. The rest of them show very little activity, and I'm sure that's from the OS and not the game.

So PhysX isn't the problem. It's the game :hmm:

Sure it's not multithreaded.

Talk about avoiding the direct issue: when you enable PhysX in 4-player battles it drops to slideshow FPS at points, yet with PhysX on low it runs fine.

The issue is PhysX, and whether it's single- or multithreaded has nothing to do with it, since it's running on the GPU.

This is precisely what I mean: you are trying to say you have some basis to claim that the PhysX problems in BL2 are down to a bad game engine, but your proof doesn't demonstrate that at all. :colbert:
 

taltamir

Lifer
Mar 21, 2004
1. If having 65% market share isn't leadership, I don't know what is.

1. Show a source that states 65% of video games are shipping with Nvidia PhysX.
2. CPU vs. GPU PhysX, again, which you keep ignoring.
3. You still haven't clarified the alternate-dimension observations.
 

Jaydip

Diamond Member
Mar 29, 2010
1. Show a source that states 65% of video games are shipping with Nvidia PhysX.
2. CPU vs. GPU PhysX, again, which you keep ignoring.
3. You still haven't clarified the alternate-dimension observations.

1. Unimportant, as you said they don't have the leadership. How many GE or TWIMTBP titles come out each year? 25-30 at max. Most of them have some exclusive technology from either vendor.

2. The CPU is not fast enough for PhysX, period. I find it funny that after so many years people still compare CPU vs. GPU computing. There are certain tasks which will always run better on a GPU.
 

taltamir

Lifer
Mar 21, 2004
1. If having 65% market share isn't leadership, I don't know what is.
1. Show a source that states 65% of video games are shipping with Nvidia PhysX.
1. Unimportant, as you said they don't have the leadership. How many GE or TWIMTBP titles come out each year? 25-30 at max. Most of them have some exclusive technology from either vendor.
How is a source for your outlandish claim unimportant?
Also, "as you said" implies you are agreeing with me, but from that confusing statement I am not certain that is what you are doing. Are you agreeing with me or not?

2. The CPU is not fast enough for PhysX, period. I find it funny that after so many years people still compare CPU vs. GPU computing. There are certain tasks which will always run better on a GPU.
CPU PhysX, not CPU physics.
PhysX, Nvidia's brand name, offers libraries, free for any developer, for performing non-intensive calculations EXCLUSIVELY on the CPU. Over 90% of the titles Nvidia lists as using "PhysX" DO NOT actually support GPU-accelerated PhysX at all; instead they run only some simple, non-intensive PhysX on the CPU.
 

Jaydip

Diamond Member
Mar 29, 2010
How is a source for your outlandish claim unimportant?


CPU PhysX, not CPU physics.
PhysX, Nvidia's brand name, offers libraries, free for any developer, for performing non-intensive calculations EXCLUSIVELY on the CPU. Over 90% of the titles Nvidia lists as using "PhysX" DO NOT actually support GPU-accelerated PhysX at all; instead they run only some simple, non-intensive PhysX on the CPU.

1. Which is outlandish, that they don't have 65% discrete share? I find it extremely funny. Google it.

2. You answered your own question.
 

taltamir

Lifer
Mar 21, 2004
1. Which is outlandish, that they don't have 65% discrete share? I find it extremely funny. Google it.

I thought you were claiming Nvidia PhysX has 65% of the market, aka 65% of video GAMES ship with GPU PhysX. An outlandish claim.

Apparently you just posted the percentage of discrete video cards in existence which were made by Nvidia (with some rounding; the first result on Google says it's 62%), and claimed this as the GPU PhysX adoption rate, even though it has absolutely nothing to do with it.

I did not notice that discrepancy, and for some reason, in the four posts we have each made arguing the subject since, you have not seen fit to clarify.
 

Carfax83

Diamond Member
Nov 1, 2010
Sure it's not multithreaded.

Talk about avoiding the direct issue: when you enable PhysX in 4-player battles it drops to slideshow FPS at points, yet with PhysX on low it runs fine.

It also drops to slideshow FPS in single player with PhysX on high in certain locations, so what's your point?

The issue is PhysX, and whether it's single- or multithreaded has nothing to do with it, since it's running on the GPU.

Wrong. You've shown that you don't understand the relationship between CPUs and GPUs in games. Just because PhysX uses the GPU for calculations, doesn't mean the CPU has no part to play.

The CPU issues draw calls for everything that the GPU renders on screen. With PhysX enabled, the number of draw calls increases big time due to all of the added particles and effects.

The engine's lack of multithreading loads all of the draw calls onto only ONE core, which isn't enough for PhysX on medium or high. So the CPU gets taxed to the limit (up to 100% usage on two cores on my 4.5GHz 3930K) and eventually gets bogged down when the draw-call limit is exceeded, which starves the GPUs of data and makes the frame rate plummet.
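
To make the above concrete, here is a toy model of that bottleneck. It is not BL2's actual engine; the per-call cost, GPU frame time, and call counts are made-up numbers chosen purely to show the shape of the problem.

```python
# Toy model: one core submits every draw call serially, so frame time is
# the slower of (CPU submission time, GPU render time). All numbers are
# hypothetical, chosen only to illustrate the bottleneck.

DRAW_CALL_COST_US = 10.0   # CPU cost to submit one draw call (made up)
GPU_FRAME_MS = 8.0         # GPU render time per frame (made up)

def frame_time_ms(draw_calls: int) -> float:
    """Frame time when a single core must submit every draw call."""
    cpu_ms = draw_calls * DRAW_CALL_COST_US / 1000.0
    # The frame can finish no faster than the slower of the two stages.
    return max(cpu_ms, GPU_FRAME_MS)

for calls in (500, 1500, 3000):  # e.g. PhysX off / medium / high (assumed)
    fps = 1000.0 / frame_time_ms(calls)
    print(f"{calls:5d} draw calls -> {fps:5.1f} FPS")
```

Once the submission cost exceeds the GPU's frame time, FPS falls linearly with the draw-call count even though the GPU itself is idling, which is the "CPU taxed, GPUs starved" behavior described above.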

This is precisely what I mean: you are trying to say you have some basis to claim that the PhysX problems in BL2 are down to a bad game engine, but your proof doesn't demonstrate that at all. :colbert:

It does demonstrate it, you just don't understand it :p

Gearbox should never have used DX9 for this game. DX9 uses a single core for rendering, if I'm not mistaken, which destroys performance in a big game like BL2 with massive draw distances and detail.
 

Jaydip

Diamond Member
Mar 29, 2010
I thought you were claiming Nvidia PhysX has 65% of the market, aka 65% of video GAMES ship with GPU PhysX. An outlandish claim.

Apparently you just posted the percentage of discrete video cards in existence which were made by Nvidia (with some rounding; the first result on Google says it's 62%), and claimed this as the GPU PhysX adoption rate, even though it has absolutely nothing to do with it.

I did not notice that discrepancy, and for some reason, in the four posts we have each made arguing the subject since, you have not seen fit to clarify.

Glad we got it clarified; I thought you meant NV didn't have the discrete leadership.