Interesting take on Kanter's article "PhysX87: Software Deficiency"


Genx87

Lifer
Apr 8, 2002
41,091
513
126
4.5 years is a pretty long time in the PC world.
For something to still be in its infancy 4.5 years after it began isn't very good going.

Hasn't tessellation been around on GPUs since 2001-02? It is still in its infancy.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Really? When I enable physx in Mafia 2 on my system, it literally cuts my framerate by 50%. All the benchmarks out there of physx on high in this game show the same.

Doesn't mean it's unplayable; "Mafia II" runs fine on max settings on my rig.
That physics is computationally very heavy is no secret...that's why CPUs cannot match GPUs in physics calculations...*hint-hint*

Considering the small amount of visual quality addition it actually brings to the game, it's a ridiculous performance hit. It's the same sort of performance hit you'd get from going from a resolution of 1680x1050 to 2560x1600 on the same video card setup.

The visual part doesn't do justice to the amount of calculation being performed, so you are looking at the tail whilst missing the entire body.

That is what makes GPU physx a terrible feature and a non-starter in its current state. For such a small change in what you notice on your screen there is no reason for such huge performance costs. Until they fix that, gpu physx will continue to be a feature you see in one new game a year and something the buying market does not care about.

Those "small" changes does impact your brain.
When stuff is scripted (static, repetative) you brain raises a red flag and suspension of disbelief is killed.
When stuff acts like in the real world...no flag is raised...and the brain accepts what it sees a "real".

And again, there is no difference between PPU, CPU, GPU and PPC PhysX.
The only difference is the amount of GFLOPS you have at your disposal...which in turn impacts what types of physics calculations you can do with acceptable performance...it has already been stated enough in this thread, please take notice.
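
To put rough numbers on that GFLOPS point, here is a back-of-envelope sketch; every per-particle constant in it is my own assumption, purely for illustration:

```cpp
// Back-of-envelope: how a raw FLOP budget bounds what physics you can run.
// All constants below are illustrative assumptions, not measured values.
#include <cstdio>

int main() {
    const double particles        = 10000.0; // debris/smoke particles (assumed)
    const double pairsPerParticle = 30.0;    // neighbour tests per particle (assumed)
    const double flopsPerPair     = 50.0;    // cost of one collision test (assumed)
    const double stepsPerSecond   = 60.0;    // one solver step per frame at 60 fps

    const double flops = particles * pairsPerParticle * flopsPerPair * stepsPerSecond;
    std::printf("~%.2f GFLOPS just for particle collision tests\n", flops / 1e9);
    // ~0.90 GFLOPS here; scale the particle count up 10x and a ~10 GFLOPS CPU
    // budget is gone, while a near-TFLOPS-class GPU barely notices.
    return 0;
}
```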
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Not everyone is going to be using $1000 worth of video card hardware, as much as nvidia would like that.

And I generally got about 40fps with it on high, not 60.

Credit to nvidia for riding physx in on the coat-tails of a good game, it was a smart move, still doesn't make the feature any less crap.

You are blaming your hardware and its lack of computational power on PhysX?
Do you complain about a VW Polo not running as fast as a NASCAR too? :hmm:

The same thing was said about Crysis (badly coded, ya-da-ya-da...) but no other game does the same amount of rendering.
Back then people also blamed Crysis for the inadequacies of their midrange hardware.
(BTW, I don't think a GTX285 ever cost $1000...not even here in Denmark, with insane taxes.)

PhysX is doing what software has always done on the PC.
It pushes the hardware to the limit, thus furthering development.

Again, a lot of false presumptions leading to flawed argumentation.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Since when has PC gaming been assessed on the standards of consoles?

Many times it was not playable; feel free to enable physx on high in Mafia 2 and turn up the rest of the graphics settings, then come and share your experience. I'd like to hear your results.

I just completed "Mafia II" today.
My rig is an i7 920 @ 3.5GHz with a BFG GTX285 OC.
I had no issues.

Mafia 2 and the physx in it helped to further my opinion of physx being a resource pig considering the minimal additions it brought to the game. It's supposed to be the best example of it yet, is it not?

I must ask you, because I need to be certain:
You have no idea how much computation physics requires, do you?

I got a good chuckle when I walked around in a room and Awkward Looking Rock Chunk #754 jumped up from the floor and hit the ceiling when I walked over it.

Even more of a chuckle when I saw my framerate tank in a game that looked on par with a console port graphically, but some extra debris on the ground necessitated a 50% performance hit. At that point I turned physx off and enjoyed the rest of the game and didn't notice much difference.

That you somehow are not able to notice interactive smoke, interactive debris, interactive cloth etc. doesn't say anything about PhysX, it only speaks about you.

And how (since you disabled PhysX) didn't you notice any difference?
Are you psychic?
Because without comparing PhysX on/off...you cannot make that claim with any truth.

I have tried several savegames with and without PhysX, here is what I noticed:

  • Clothes stick to characters in an unnatural way and behave way too stiffly.
  • The prerendered smoke looks static and false.
  • The missing debris makes the world look static and dead...matter just disappears, like in a freaky strange dimension.
  • Explosions don't interact with surroundings.


There are only about 10 games that use GPU physx, so there is a very limited base of games to draw examples from. Some of those are not even true games, but free demos from nvidia or one level in a game.

How many DX11 games are there?
And how does DX11 alter the experience?
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
That's the beauty of this entire situation. You had the choice to enable or disable it, and you chose to disable it. At least we're given a choice, rather than someone, somewhere up high saying "50% is too much! Just get rid of it!"

Didn't you get the second memo?
If someone doesn't like PhysX...no one should get it! ;)
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
That's the reason why, for me, physics is the next frontier, really, considering resolution and rendering are slowly hitting walls. Games are really static, and the idea of dynamic movement is compelling as a way to add fidelity, realism and game-play and improve the gaming experience.

It just takes time, but as long as things are moving forward on the GPU and CPU, it's all good to me. Don't expect idealism where idealism doesn't exist.
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
How many DX11 games are there?
And how does DX11 alter the experience?

In short, DX11 can make a game noticeably more realistic looking. PhysX not so much (at least in the current implementations). The fact that you have countless people asking what enabling PhysX even does when they enable it and don't see much of a difference further proves that point.

It doesn't really make PhysX look good when you say it's doing a huge number of calculations on various debris and things and that is why the performance hit is so severe, when the actual visual difference is very minuscule; that's called a waste of resources.

As far as PhysX being deliberately crippled on the CPU, of course it is. You'll never encounter code that's easier to multi-thread than PhysX code (they have no problem running it on hundreds of cores on the GPU), yet they force it to run on just one thread even on 12-thread CPUs. PhysX performance on the CPU could be increased by hundreds of percent if that were Nvidia's goal.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
4.5 years is a pretty long time in the PC world.
For something to still be in its infancy 4.5 years after it began isn't very good going.

I kinda count from when nVidia purchased Ageia as the time-line for their vision, but others may differ.

Am disappointed though in the number of titles, and was hoping for 1 AAA title every month.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,110
1,260
126
I just completed "Mafia II" today.
My rig is an i7 920 @ 3.5GHz with a BFG GTX285 OC.
I had no issues.

What were your framerates? I definitely had playability issues and am running a setup over 200% more powerful.


I must ask you, because I need to be certain:
You have no idea how much computation physics requires, do you?

I'm not denying or claiming knowledge as to the amount of computations required for physx. It definitely has a whole lot going on as it is unplayable at times in its current most implemented form, high settings in Mafia 2.

That is irrelevant though. What does the amount of computation matter to someone playing a game with it enabled, when it brings little to the table and cuts framerates in half? Sounds like it needs a lot of work then.



I have tried several savegames with and without PhysX, here is what I noticed:

  • Clothes stick to characters in an unnatural way and behave way too stiffly.
  • The prerendered smoke looks static and false.
  • The missing debris makes the world look static and dead...matter just disappears, like in a freaky strange dimension.
  • Explosions don't interact with surroundings.
I've seen all those effects implemented to one degree or another in games using physics on the CPU. The reason you don't see any of that in Mafia 2 when you disable physx is because they gutted the physics implementation once it's disabled.


How many DX11 games are there?
And how does DX11 alter the experience?

What does this have to do with anything?

Physx has been around for over 4 years. GPU physx is averaging 2 games a year. Why is it not being adopted more widely?

With Crysis back when it was released, you booted it up that first time, put everything on very high in DX10 with some AA, and got poor framerates even on the best hardware, but what you did see was incredibly impressive, amazing visuals. One could accept the poor performance because the visual immersion was amazing.

When you load up Mafia 2 with physx on high and get poor performance it makes no sense, because there is nothing amazing going on and it is not bringing a level of immersion you have not seen before.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
That is irrelevant though. What does the amount of computation matter to someone playing a game with it enabled, when it brings little to the table and cuts framerates in half? Sounds like it needs a lot of work then.

These kinds of things don't bother me; I expect hits. There are hits with ambient occlusion, tessellation, AA, filtering -- and in their earlier days, AA and AF were much, much bigger hits than they are today.

If one feels the hit is too much and a feature degrades their subjective gaming experience, no one is forcing anyone to use it; one can simply turn it off.

I do agree PhysX as a whole needs more work and time to evolve and mature, and may not be ideal for all. But it is a choice, trying to innovate physics and bring some value to GeForce owners: to try to get the ball rolling on GPU physics at least and get content in there.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Hasn't tessellation been around on GPUs since 2001-02? It is still in its infancy.

It failed.
Then it went away and came back in a new guise, one which gave standardisation and allowed it to be supported by anyone who wanted to make a DX11-compatible graphics card.
When it was in its infancy, it had no standards and wasn't part of a standard API which anyone could make hardware for.

See how it goes?


I kinda count from when nVidia purchased Ageia as the time-line for their vision, but others may differ.

Am disappointed though in the number of titles, and was hoping for 1 AAA title every month.

Any particular reason why that's your timeline?
Big-name support came pre-NV (through Unreal Engine integration).

Granted, the improved visuals aren't the holy grail of game physics, but this is an excellent first step
http://www.anandtech.com/show/2001/5 - 2006, first step. 4.5 years ago.

It's been at the same stage since the beginning.
We're always at the start. It's always extra graphical effects first, with the goal to do "proper" stuff later (stuff that really impacts the gameplay, and not the graphics).
Maybe if PhysX becomes available on any GPU, we will start again with the 3rd beginning of PhysX and a new vision. That should give people a few extra years of thinking pretty graphics are going to be the beginning of good PhysX use in games.

That's not to say graphical improvements are never worthwhile, but we've been at the same stage for all this time (or maybe it's even slipping backwards, since most people seem to focus on the aesthetic features and don't care for real changes through physics).
The original vision of Ageia (as written in an AT article somewhere that I can't find, but have quoted before in a thread somewhere) was that prettying up games would be a start. 4.5 years later, we're still talking about that same start.

Maybe when the 3.0 SDK is released and (on PC at least) PhysX baseline performance is increased through "optimisations" (making it run properly as the standard implementation), developers (for PC-only games at least) will start using PhysX in more impressive ways, and then we will see GPU PhysX offering improvements not through added graphical niceness, but through increased performance, by offloading core physics calculations from the (slower) CPU onto the GPU.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,110
1,260
126
I do agree PhysX as a whole needs more work and time to evolve and mature, and may not be ideal for all. But it is a choice, trying to innovate physics and bring some value to GeForce owners: to try to get the ball rolling on GPU physics at least and get content in there.

I'd have a much better opinion of physx if it was not so taxing. As it stands now it looks as if nvidia needs 2 or 3 years to make hardware that can handle running it in its current form. And that would be dependent on games being no more demanding in any other way graphically than they are today.

If they took physx to a level where it was truly game-changing and brought some real immersion (for example, you walk into a building and blow a wall away with your big weapon and then walk through the opening, or smash a table to pieces, pick up one of the legs and proceed to beat your enemies with it), I think the performance hit would be insane, going by how big it is now.

I also think that there is going to be no forward progress with gpu physics game adoption as long as it is not something available to any video card, regardless of who made it. I think gpu physics will likely become used more widely, but I don't see physx being the standard.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
This is impossible to prove.
If you know anything about the inner workings of physics simulations, you'll probably agree with me. It's not a deterministic process, but an iterative process.
The number of iterations has a huge effect on both the performance and the stability/accuracy of the solution.
So doing an apples-to-apples comparison between two different physics APIs is impossible.
I'll give you "impossible to prove". Such situations will not be great for AltiVec, either. It's not like, if they implemented it in a way optimized for modern CPUs and an updated version of Mafia II came out, I'd be able to play it with CPU PhysX--the non-gameplay effects are too much, even if it could provide double the performance. But games where it is necessary would be able to run much better, and be able to cram more effects in, before reaching some point of unacceptable performance.

Meanwhile, if the GPU is so much better for it, we need to see real uses of that, too. Fancy cloth, right next to doors, chairs, and walls that don't respond in any way breaks its possible immersion benefit within minutes.
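
To illustrate the iteration/accuracy trade-off described in the quote above, here is a toy relaxation solver; it is a generic position-based sketch, not PhysX's actual algorithm, and the under-relaxation factor is an arbitrary assumption:

```cpp
// Toy iterative constraint solver: more iterations give a more accurate
// (stiffer) result at linearly higher cost. Generic sketch, not PhysX code.
#include <cmath>
#include <cstdio>

struct Vec2 { double x, y; };

int main() {
    // Two particles joined by a rod of rest length 1.0, pulled apart to 2.0.
    Vec2 a{0.0, 0.0}, b{2.0, 0.0};
    const double restLen = 1.0;
    const double k = 0.3; // under-relaxation factor, chosen for stability (assumed)

    for (int iter = 1; iter <= 8; ++iter) {
        double dx = b.x - a.x, dy = b.y - a.y;
        double len = std::sqrt(dx * dx + dy * dy);
        double corr = k * (len - restLen) / len; // partial correction per body
        a.x += dx * corr; a.y += dy * corr;
        b.x -= dx * corr; b.y -= dy * corr;
        double err = std::hypot(b.x - a.x, b.y - a.y) - restLen;
        std::printf("iteration %d: constraint error = %.6f\n", iter, err);
    }
    // The error shrinks geometrically but never reaches zero in finite passes;
    // a real engine runs such passes over thousands of coupled contacts, so
    // the iteration count is a direct quality-versus-speed knob.
    return 0;
}
```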
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
It failed.
Then it went away and came back in a new guise, one which gave standardisation and allowed it to be supported by anyone who wanted to make a DX11-compatible graphics card.
When it was in its infancy, it had no standards and wasn't part of a standard API which anyone could make hardware for.

See how it goes?

I don't think Nvidia will have a problem supporting a standard, whether it is OpenCL or a Microsoft standard. AMD on the other hand....................
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
I'd have a much better opinion of physx if it was not so taxing. As it stands now it looks as if nvidia needs 2 or 3 years to make hardware that can handle running it in its current form. And that would be dependent on games being no more demanding in any other way graphically than they are today.

If they took physx to a level where it was truly game-changing and brought some real immersion (for example, you walk into a building and blow a wall away with your big weapon and then walk through the opening, or smash a table to pieces, pick up one of the legs and proceed to beat your enemies with it), I think the performance hit would be insane, going by how big it is now.

I also think that there is going to be no forward progress with gpu physics game adoption as long as it is not something available to any video card, regardless of who made it. I think gpu physics will likely become used more widely, but I don't see physx being the standard.

Were you one of those people who hated anisotropic filtering and AA when they first came out because they taxed the hell out of the system?
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
I don't think Nvidia will have a problem supporting a standard, whether it is OpenCL or a Microsoft standard. AMD on the other hand....................

Uh, OK.
So they won't have a problem with it, they just aren't doing it. While AMD would have a problem with it, based on you saying so.
I'm sold.

That's why we see a port of PhysX to OpenCL/DirectCompute in the works.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I'd have a much better opinion of physx if it was not so taxing. As it stands now it looks as if nvidia needs 2 or 3 years to make hardware that can handle running it in its current form. And that would be dependent on games being no more demanding in any other way graphically than they are today.

If they took physx to a level where it was truly game-changing and brought some real immersion (for example, you walk into a building and blow a wall away with your big weapon and then walk through the opening, or smash a table to pieces, pick up one of the legs and proceed to beat your enemies with it), I think the performance hit would be insane, going by how big it is now.

I also think that there is going to be no forward progress with gpu physics game adoption as long as it is not something available to any video card, regardless of who made it. I think gpu physics will likely become used more widely, but I don't see physx being the standard.

It's your opinion, it's perfectly fine by me, and I respect the view. You don't like the hit and certainly would like to see it improve. PhysX may never be a standard, but that's not the point for me. nVidia has GPU processing abilities in its chips and can offer GPU physics -- why not try to offer content for their customers? It's about being pro-active and trying, and this is what I appreciate.

I see this ripped apart, and all it is is a choice. nVidia is spending their resources and carrying all the risk, accountability and responsibility, so I can understand their view to some degree. I wish things were more ideal for all, but I just have a strong desire to see physics evolve and mature to bring more to gaming. This is another step toward that.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Uh, OK.
So they won't have a problem with it, they just aren't doing it. While AMD would have a problem with it, based on you saying so.
I'm sold.

That's why we see a port of PhysX to OpenCL/DirectCompute in the works.

Download AMD's drivers and try to get OpenCL out of the box without having to install an SDK. Do the same for Nvidia. Nvidia is and has been supporting OpenCL far more than AMD. Scali can certainly chime in on this. That is what I base my opinion on for this matter.
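
For what it's worth, the "out of the box" test is easy to run yourself: the standard OpenCL C API can enumerate whatever runtime the driver installed. A minimal sketch (header path and link flags vary by SDK):

```cpp
// Minimal OpenCL runtime check: counts the platforms exposed by the
// installed driver/ICD. Build with -lOpenCL (or link OpenCL.lib).
#include <CL/cl.h>
#include <cstdio>

int main() {
    cl_uint numPlatforms = 0;
    cl_int err = clGetPlatformIDs(0, nullptr, &numPlatforms);
    if (err != CL_SUCCESS || numPlatforms == 0) {
        std::printf("No OpenCL runtime found (error %d)\n", static_cast<int>(err));
        return 1;
    }
    std::printf("%u OpenCL platform(s) available\n", numPlatforms);
    return 0;
}
```

If the program fails to even load (missing OpenCL.dll / libOpenCL.so), the driver shipped no runtime at all, which is the complaint being made here.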
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Download AMD's drivers and try to get OpenCL out of the box without having to install an SDK. Do the same for Nvidia. Nvidia is and has been supporting OpenCL far more than AMD. Scali can certainly chime in on this. That is what I base my opinion on for this matter.

So nothing to do with PhysX at all?
Just checking because I was confused, since this thread is about PhysX, and NV recently said there was no reason for them to port PhysX to OpenCL.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
In short, DX11 can make a game noticeably more realistic looking.

Show me.
If I put on an "anti-DX11" hat, I would say that if we look at AvP...DX11 looks like a joke.

PhysX not so much (at least in the current implementations). The fact that you have countless people asking what enabling PhysX even does when they enable it and don't see much of a difference further proves that point.

No, it just makes you wonder if some people would even notice a lack of AA or AF...when they can't notice 10,000 added dynamic particles.


It doesn't really make PhysX look good when you say it's doing a huge number of calculations on various debris and things and that is why the performance hit is so severe, when the actual visual difference is very minuscule; that's called a waste of resources.

Again, you are the one (alongside a list of people making false and uninformed statements) claiming the difference is minuscule.
I suspect the claims come more from vendor bias than anything else.
They are easy to dispel anyway:
http://www.youtube.com/watch?v=_iuVdLl5CIM

If you can't see the difference, something is wrong on your end.

This here is something I really like:
http://www.youtube.com/watch?v=qiJogb-k230

Not to mention explosions:
http://www.youtube.com/watch?v=w4FYJh6CCDM

You call the difference "minuscule" again :D

As far as PhysX being deliberately crippled on the CPU, of course it is. You'll never encounter code that's easier to multi-thread than PhysX code (they have no problem running it on hundreds of cores on the GPU), yet they force it to run on just one thread even on 12-thread CPUs. PhysX performance on the CPU could be increased by hundreds of percent if that were Nvidia's goal.

Why do you post false claims that have been debunked more than once in this thread?
Come on...you are running in circles now...
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
What were your framerates? I definitely had playability issues and am running a setup over 200% more powerful.

I'll post FRAPS numbers when I get home.
And by 200%...do you mean multi-GPU?
Because that "solution" has issues of its own that merit a whole new thread.




I'm not denying or claiming knowledge as to the amount of computations required for physx. It definitely has a whole lot going on as it is unplayable at times in its current most implemented form, high settings in Mafia 2.

It takes a LOT.
Like raytracing takes a lot over prebaked lighting/lightmaps.

And that is the end goal.
A perfect simulation of the real world.
And we are getting there...one baby step at a time.

That is irrelevant though. What does the amount of computation matter to someone playing a game with it enabled, when it brings little to the table and cuts framerates in half? Sounds like it needs a lot of work then.

It speaks to the basic understanding of the topic...if you don't get it, you will make false and uninformed claims.
This thread is full of them...go take a look.



I've seen all those effects implemented to one degree or another in games using physics on the CPU. The reason you don't see any of that in Mafia 2 when you disable physx is because they gutted the physics implementation once it's disabled.

You can't disable PhysX; enough with the false claims, okay?!
Even with APEX set to OFF...the physics engine running in the game is still...TA-DA...PhysX.

BTW, why doesn't Intel show off Havok physics running on CPUs?
If Havok is so good and PhysX so bad, it would be so simple, right?

Or is it that even Intel knows you need more than CPUs...and that's why they got hold of Havok: to show off, not their CPUs, but "Larrabee"?
Or why doesn't AMD do more than just point and cry foul?
They make CPUs.
They work with Bullet now.
All it takes to prove these claims...is a single techdemo.

/sings "Enjoy the silence..."



What does this have to do with anything?

Physx has been around for over 4 years. GPU physx is averaging 2 games a year. Why is it not being adopted more widely?

Because 1/3 of the market would whine like there is no tomorrow...it is loud enough already...imagine if a game required a GPU for PhysX...the whine would be deafening.

With Crysis back when it was released, you booted it up that first time, put everything on very high in DX10 with some AA, and got poor framerates even on the best hardware, but what you did see was incredibly impressive, amazing visuals. One could accept the poor performance because the visual immersion was amazing.

And it got broken in a lot of places by very bad physics:
http://www.youtube.com/watch?v=gUJQPBDnrC8
It's not enough that things look real...they must act real too.

When you load up Mafia 2 with physx on high and get poor performance it makes no sense, because there is nothing amazing going on and it is not bringing a level of immersion you have not seen before.

Look at the videos I just posted above and make that claim again.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
I don't think Nvidia will have a problem supporting a standard, whether it is OpenCL or a Microsoft standard. AMD on the other hand....................

NVIDIA's OpenCL support is way ahead of AMD's.
But then again, CUDA is light years ahead of OpenCL.
So why should NVIDIA castrate PhysX by running it on OpenCL?

My guess is that when (or if) AMD ever launches OpenCL GPU physics, NVIDIA will release their OpenCL PhysX and make AMD look way behind the curve...again.

But that is just my personal hunch, I have no evidence or information to back it up...only time will tell.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
So nothing to do with PhysX at all?
Just checking because I was confused, since this thread is about PhysX, and NV recently said there was no reason for them to port PhysX to OpenCL.


If you want to be consistent, you need to post the same thing every time a false (and debunked) claim is made against PhysX...otherwise it makes you look biased, with an agenda...but no facts to support it.

And I am amazed how bias can make people want to stop progress at all costs.
The lowest common denominator doesn't work...communism proved that.
Don't drag it into hardware.

The fact is:
A LOT of false claims are being made against PhysX...they have been debunked, but still they won't die.

It's like arguing with people who think their CPU can do the same level/FPS of graphics as a GPU...it's retarded.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
As far as PhysX being deliberately crippled on the CPU, of course it is. You'll never encounter code that's easier to multi-thread than PhysX code (they have no problem running it on hundreds of cores on the GPU), yet they force it to run on just one thread even on 12-thread CPUs. PhysX performance on the CPU could be increased by hundreds of percent if that were Nvidia's goal.

*Sigh*...
For the umpteenth time: nVidia does NOT force PhysX to run on one thread. The entire library is thread-safe. In other words: nVidia made sure that you can run PhysX on multiple threads.
However, currently the management of threads has to be done by the developer, because there is no automated thread-management in PhysX yet (just like in Direct3D, OpenGL and pretty much every other library that game developers might use). There are various examples of multithreaded PhysX implementations, such as 3DMark Vantage or FluidMark. Why most developers didn't bother to make a multithreaded PhysX implementation, I don't know... But the fact remains that nVidia did not prevent them from doing it. On the contrary.
nVidia has also promised automated multithreading in the upcoming PhysX 3.0 SDK.
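
To illustrate the distinction (a generic sketch with made-up names, not the actual PhysX API): "thread-safe" means the developer can partition the work across worker threads themselves, roughly like this:

```cpp
// Developer-managed threading over a thread-safe physics library.
// simulateIsland() is a hypothetical stand-in for the real API calls.
#include <functional>
#include <thread>
#include <vector>

struct Island { /* a group of bodies that only interact with each other */ };

void simulateIsland(Island& island, float dt) {
    // Hypothetical: step one independent group of bodies. Thread safety
    // guarantees that concurrent calls like this don't corrupt shared state.
    (void)island; (void)dt;
}

void stepPhysics(std::vector<Island>& islands, float dt) {
    std::vector<std::thread> workers;
    workers.reserve(islands.size());
    for (Island& island : islands)   // the developer decides how to split work
        workers.emplace_back(simulateIsland, std::ref(island), dt);
    for (std::thread& t : workers)   // wait for every island to finish the step
        t.join();
}
```

With the automated thread-management promised for the 3.0 SDK, the library would own a worker pool like this internally and the developer would just call a single step function.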
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Any particular reason why that's your timeline?
Big-name support came pre-NV (through Unreal Engine integration).

Because hardware acceleration was going to move from the PPU to the GPU. It wasn't about hardware-accelerated PPUs anymore.

My zest for GPU physics started here, with ATI and nVidia and the humble beginnings of the potential of HavokFX:

http://www.neoseeker.com/Articles/Hardware/Previews/havokfx-nvidia/

http://www.firingsquad.com/news/newsarticle.asp?searchid=10649