Changed opinion on hardware-accelerated physics

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
I've been a very strong proponent of GPU-accelerated physics over the years, and I've gotten into many debates and arguments with tons of people on this forum and others about the subject.

Recently, though, something happened that caused me to change my favorable opinion of GPU-accelerated physics in favor of good old CPU physics. Metro Last Light Redux is what caused this shift. Unlike the original, which used PhysX 2.8x, Metro Last Light Redux uses PhysX 3.3 and runs EXTREMELY well on the CPU. In fact, I am shocked at how optimized PhysX 3.3 is for the CPU. It takes great advantage of not only multithreaded processors, but also SIMD extensions like SSE4 and AVX.

The multithreading and SIMD support speed it up tremendously, and it processes the extra physics effects from enabling the Advanced PhysX option in the menu, like debris, destruction, cloth, and fog, without a hitch or any slowdowns; at least on my rig.
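To give a sense of why the SIMD part matters, here's a toy sketch of my own (not actual PhysX source, just an illustration) of the kind of 4-wide math SSE enables for something like a particle position update:

```cpp
#include <immintrin.h>  // SSE intrinsics
#include <cstddef>

// Scalar version: one position update per iteration, like x87-era code.
void integrate_scalar(float* pos, const float* vel, std::size_t n, float dt) {
    for (std::size_t i = 0; i < n; ++i)
        pos[i] += vel[i] * dt;
}

// SSE version: four position updates per iteration.
void integrate_sse(float* pos, const float* vel, std::size_t n, float dt) {
    const __m128 vdt = _mm_set1_ps(dt);
    std::size_t i = 0;
    for (; i + 4 <= n; i += 4) {
        __m128 p = _mm_loadu_ps(pos + i);
        __m128 v = _mm_loadu_ps(vel + i);
        _mm_storeu_ps(pos + i, _mm_add_ps(p, _mm_mul_ps(v, vdt)));
    }
    for (; i < n; ++i)  // scalar tail for the leftover elements
        pos[i] += vel[i] * dt;
}
```

AVX doubles that to eight floats per instruction, which is part of why the newer SIMD paths pay off so well.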

This is an indication of future trends, as Nvidia and Intel aggressively optimize their physics middleware for the latest SIMD extensions; and as CPUs get fatter cores, wider vectors, and more cores and threads, I don't see CPUs having any problems delivering more immersive and complex physics in future games.

The Witcher 3 and Batman Arkham Knight will be the games to watch next year, as both will use the latest software PhysX heavily. Even from the footage that we've seen so far, the Witcher 3's physics looks very impressive and it's all running on the CPU.

I never used to believe the CPU could even come close to matching a GPU in physics, but now I think otherwise! Of course, there are always going to be some things where the GPU will simply be better, as it has too much of a raw processing-power advantage. Full real-time processing of demanding effects such as fog, smoke, fluid, etcetera will likely remain dependent on the GPU for some time.

On the other hand, both PhysX and Havok are employing some sophisticated partial or approximated simulations for these demanding effects that look just as good as fully processed real-time simulations on the GPU, though they're not as interactive. I was especially impressed with the ambient mist and smoke effects in Metro Last Light Redux, and those were approximations as far as I could tell.

Anyway, I can't wait to see the evolution of both PhysX and Havok. At the moment, it seems PhysX has the advantage, because I have yet to see a game that uses Havok equal the software PhysX found in Metro Last Light Redux. Perhaps AC Unity will be the first?
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
All these games that use PhysX... I mostly end up turning it off! On SLI, where you can only assign the GPU-accelerated effects to one GPU or the other, it is just a cause of microstutter. So you either assign it to the CPU or reduce the quality of it as if you didn't have a PhysX card.

But my experience of PhysX has basically been that it either makes no useful difference or it tanks frame rates. In either case I tend not to use it. So in the last couple of years I simply haven't bothered with it much; I turn it off quite often.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
It wouldn't surprise me if the good CPU performance could have been implemented long ago but wasn't, in order to push GPU PhysX. It's not a feature that has seen much traction, and it has a lot of issues.

SLI is terrible with GPU PhysX. You either disable SLI and dedicate a GPU, losing the benefit of SLI, or you leave SLI enabled and GPU usage tanks on your cards. Borderlands 2 is the worst offender for this. The game is pretty lightweight graphically, but if you turn on GPU PhysX it destroys your framerate. If I try running GPU PhysX on my setup I get massive slowdowns in that game. It doesn't add up, given how easy the game is to run with it disabled and how powerful my main gaming rig is.

I actually think Metro: LL is the best implementation of GPU PhysX I've seen. It's not a lot of overdone effects like in BL, and the effects they do implement are impressive. The fog and its interaction with light and the environment is really cool. It also is not a performance killer in that game.

I don't think Witcher 3 will run super well just using the CPU for the PhysX effects. If you've seen some of the demos they've shown of the game with the fur effects, it becomes clear you're going to have to use a GPU if you want to enable those effects.
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
I noticed that the Redux version of Metro 2033 doesn't switch hardware PhysX on even with Advanced PhysX ticked. Nvidia have confirmed to me that they are looking into this for the next drivers.

When I compared the game to the original, there were definitely elements missing, like the debris that gets caught up in the anomaly during the first cart ride. I'll post a video when it's finished uploading.
 

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
Software is the way of the future. Can't wait until fixed-function hardware-based rendering dies out, even if I won't have the means to own the latest and greatest in the future due to my parents' ignorance and their stubborn stinginess.
 

Reticula

Junior Member
Sep 15, 2010
15
0
0
physxinfo.com
Well, what else would you expect?
Back then, Mafia II had nice GPU cloth physics, but now, let's say, Lords of the Fallen has much more impressive multi-layered cloth running through CPU PhysX, even on consoles.

The basic (software) level of both gameplay and effects physics is shifting, rising.

But there is still a lot of room for GPU accelerated stuff. Volumetric smoke simulation using highly detailed grids (> 512x512), not a few SPH particles and sprite clouds like in Metro. Massive fluid simulation with millions of particles, not thousands like in Borderlands 2. Strand-based hair, fur and grass simulation. And so on.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Hi Reticula/Zogrim

Mafia 2's cloth was done on the CPU on single-GPU platforms, though.
 

Wall Street

Senior member
Mar 28, 2012
691
44
91
http://arstechnica.com/gaming/2010/07/did-nvidia-cripple-its-cpu-gaming-physics-library-to-spite-intel/

PhysX 3.x is so great on the CPU, whereas PhysX 2.x was garbage, because a ton of sites outed Nvidia for using x87 and lacking multithreading in the old CPU code. Now that games are designed around PhysX 3.0, which supports multithreading and SSE/AVX, the difference between CPU and GPU PhysX has shrunk dramatically. Who would have thunk it?

I may be a bit too cynical, but to me it was pretty clear that Nvidia was intentionally not offering any performance improvements for CPU PhysX until they were called out.
 

Teizo

Golden Member
Oct 28, 2010
1,271
31
91
GPU physics is a good idea. The problem is that Nvidia is the only one that offers it, so it doesn't make much sense for game developers to implement it. If Nvidia licensed PhysX to AMD it might be different, but today's CPUs are more than capable of handling it, so there's not much point in pushing GPU physics hard anymore.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
http://arstechnica.com/gaming/2010/07/did-nvidia-cripple-its-cpu-gaming-physics-library-to-spite-intel/

PhysX 3.x is so great on the CPU, whereas PhysX 2.x was garbage, because a ton of sites outed Nvidia for using x87 and lacking multithreading in the old CPU code. Now that games are designed around PhysX 3.0, which supports multithreading and SSE/AVX, the difference between CPU and GPU PhysX has shrunk dramatically. Who would have thunk it?

I may be a bit too cynical, but to me it was pretty clear that Nvidia was intentionally not offering any performance improvements for CPU PhysX until they were called out.

The old code wasn't as good, but it did not prevent multithreading either. It was just left at single-threading by default and required the devs to enable it, which many did not.

While many new games do seem not to have the same penalty when enabling PhysX, I believe that is largely a result of devs not using as many effects. The Metro games have used PhysX very sparingly, for example. Borderlands 2 used it a bit more, and it did need a PhysX card for good FPS. Then you have the Batman games, which need a dedicated card for their highest settings, and even then it doesn't work great.
 

HeXen

Diamond Member
Dec 13, 2009
7,837
38
91
Can you assign PhysX to use the IGPU? If not, then why not? It seems like perfectly good hardware that tends to get little use.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Can you assign PhysX to use the IGPU? If not, then why not? It seems like perfectly good hardware that tends to get little use.

No, you can't. It only has two code paths: CUDA and CPU. Any GPU or IGPU would have to use CUDA, and no IGPU does, as CUDA is an Nvidia-only feature.

Of course, Nvidia could recode it to work; they just don't.
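Roughly, the dispatch works like this (purely illustrative names on my part, not the actual PhysX API); there's simply no third branch that could route work to an Intel IGPU:

```cpp
#include <cstdio>

// Hypothetical sketch: the runtime knows exactly two back ends, and the
// GPU one is only reachable through CUDA, so non-Nvidia hardware always
// falls through to the CPU path.
enum class Backend { Cuda, Cpu };

Backend pickBackend(bool cudaCapableDeviceFound) {
    // No OpenCL or DirectCompute branch exists to route an IGPU through.
    return cudaCapableDeviceFound ? Backend::Cuda : Backend::Cpu;
}

int main() {
    std::printf("Intel IGPU only -> %s path\n",
                pickBackend(false) == Backend::Cuda ? "CUDA" : "CPU");
    std::printf("Nvidia GPU      -> %s path\n",
                pickBackend(true) == Backend::Cuda ? "CUDA" : "CPU");
}
```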
 

Wall Street

Senior member
Mar 28, 2012
691
44
91
The old code wasn't as good, but it did not prevent multithreading either. It was just left at single-threading by default and required the devs to enable it, which many did not.

While many new games do seem not to have the same penalty when enabling PhysX, I believe that is largely a result of devs not using as many effects. The Metro games have used PhysX very sparingly, for example. Borderlands 2 used it a bit more, and it did need a PhysX card for good FPS. Then you have the Batman games, which need a dedicated card for their highest settings, and even then it doesn't work great.

Again, being cynical here: Nvidia offered developers "incentive" for TWIMTBP games to use PhysX and didn't offer "incentive" to implement it multithreaded. My understanding is that engineers from AMD and Nvidia even help game makers implement features by providing code. So I won't let Nvidia off the hook too easily for this one.

Also, x87 completes one floating-point operation per clock where SSE does four, and most people have four cores. I wouldn't underestimate how much this potential 16x speedup impacts performance versus the more sparing use of effects.
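To make the arithmetic concrete, here's a rough sketch (my own illustration, not PhysX code) of the threading half of the estimate; pair the per-chunk kernel with the 4-wide SSE math and you get the theoretical 4 lanes x 4 cores = 16x over single-threaded x87:

```cpp
#include <algorithm>
#include <cstddef>
#include <thread>
#include <vector>

// Per-chunk kernel; imagine a 4-wide SSE loop in here, which is where
// the "x4 lanes" factor comes from.
void integrate_chunk(float* pos, const float* vel, std::size_t n, float dt) {
    for (std::size_t i = 0; i < n; ++i)
        pos[i] += vel[i] * dt;
}

// Split the bodies across one worker per hardware thread ("x4 cores").
void integrate_parallel(float* pos, const float* vel, std::size_t n, float dt) {
    const unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    const std::size_t chunk = (n + cores - 1) / cores;
    std::vector<std::thread> workers;
    for (unsigned t = 0; t < cores; ++t) {
        const std::size_t begin = t * chunk;
        if (begin >= n) break;
        workers.emplace_back(integrate_chunk, pos + begin, vel + begin,
                             std::min(chunk, n - begin), dt);
    }
    for (auto& w : workers) w.join();
}
```

In practice memory bandwidth and solver dependencies eat into that peak, but the headroom is real.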
 

mindbomb

Senior member
May 30, 2013
363
0
0
All these games that use PhysX... I mostly end up turning it off! On SLI, where you can only assign the GPU-accelerated effects to one GPU or the other, it is just a cause of microstutter. So you either assign it to the CPU or reduce the quality of it as if you didn't have a PhysX card.

But my experience of PhysX has basically been that it either makes no useful difference or it tanks frame rates. In either case I tend not to use it. So in the last couple of years I simply haven't bothered with it much; I turn it off quite often.

This is specifically because of a non-aggressive implementation of PhysX in games. In most games, turning PhysX on doesn't disable the CPU PhysX; it just enables additional GPU PhysX, and the results of both physics engines have to be synced up.

I think someone could make an impressive game that actually only used GPU PhysX, but then the game would require an Nvidia card to play.
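Conceptually the frame ends up looking something like this (illustrative names only, not a real engine API); the renderer can't start until the slower of the two simulations finishes, which is where the sync cost shows up:

```cpp
#include <cstdio>

// Stand-in for a physics scene; in the real setup one simulation runs the
// gameplay physics on the CPU and a second, optional one runs the extra
// GPU effects.
struct Scene {
    const char* name;
    void simulate(float dt) { std::printf("%s: stepping %.4f s\n", name, dt); }
    void fetchResults()     { std::printf("%s: results ready\n", name); }
};

void frame(Scene& gameplayCpu, Scene* effectsGpu, float dt) {
    gameplayCpu.simulate(dt);                    // always runs
    if (effectsGpu) effectsGpu->simulate(dt);    // only with GPU PhysX enabled

    gameplayCpu.fetchResults();                  // the frame has to wait for
    if (effectsGpu) effectsGpu->fetchResults();  // BOTH result sets before drawing
}

int main() {
    Scene gameplay{"cpu-gameplay"}, effects{"gpu-effects"};
    frame(gameplay, &effects, 1.0f / 60.0f);
}
```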
 

Reticula

Junior Member
Sep 15, 2010
15
0
0
physxinfo.com
Hi Reticula/Zogrim

Mafia 2's cloth was done on the CPU on single-GPU platforms, though.
Hi, SirPauly)

Yeah, I remember that - and CPU execution was painfully slow.

Also, x87 completes one floating-point operation per clock where SSE does four, and most people have four cores. I wouldn't underestimate how much this potential 16x speedup impacts performance versus the more sparing use of effects.
And you seem to overestimate it. When we did some rough benchmarks of PhysX 2.8.3 vs PhysX 2.8.4 (compiled with SSE2) back in 2010, the performance gain was about 15%.

PhysX 3.x is so great on the CPU, whereas PhysX 2.x was garbage, because a ton of sites outed Nvidia for using x87 and lacking multithreading in the old CPU code.
I don't think "tons of sites" are a merit here, usually customers (as PhysX is commercial engine) are the rule here. In particular, majority of PhysX 3 CPU optimizations were implemented to make it competitive on consoles.
Also, to date, PhysX 2.4 - 2.8.3 was used in 200+ games running purely on CPU, don't fit into this "omg garbage" fuss.

As for GPU PhysX effects, look at them as an a priori NV-only bonus, similar to platform exclusives on consoles. There are no technical difficulties with releasing, let's say, The Last of Us Remastered on Xbox One and PC, but you need a PS4 to play it.
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
This is specifically because of a non-aggressive implementation of PhysX in games. In most games, turning PhysX on doesn't disable the CPU PhysX; it just enables additional GPU PhysX, and the results of both physics engines have to be synced up.

I think it's actually because GPU PhysX only gets calculated on one card; you can see the difference in GPU-Z. This means that one card will finish all its calculations quicker than the other, so there is a slight difference in latency when the cards send the image to the screen. That's why microstutter doesn't occur when using a single GPU.

I never used to have a problem with just disabling SLI and running PhysX on one card with the rest of the game on the other, but I was playing at 1680x1050, so not as many pixels as 2560x1440.
 

Wall Street

Senior member
Mar 28, 2012
691
44
91
And you seem to overestimate it. When we did some rough benchmarks of PhysX 2.8.3 vs PhysX 2.8.4 (compiled with SSE2) back in 2010, the performance gain was about 15%.

There is a difference between compiling x87-era code with the compiler optimizing for SSE at compile time and writing native SSE code. Also, I don't think that 2.8.4 enabled multicore for a lot of games. When they went back and rewrote the code to use vector instructions by default, they ended up with PhysX 3.0, which is much, much faster. I will admit, though, that the 16x speedup only applies to a few edge cases, but I'm sticking by the fact that the PhysX 2.0 code base was pretty bad on the CPU.
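To illustrate the distinction with a toy example of my own (not PhysX code): building scalar source with SSE compiler flags mostly gets you scalar SSE, one lane per operation, while hand-written intrinsics are packed by construction:

```cpp
#include <immintrin.h>

// With /arch:SSE2 or -mfpmath=sse this typically compiles to scalar SSE
// (one lane, e.g. mulss/addss) unless the auto-vectorizer kicks in.
float dot_scalar(const float* a, const float* b, int n) {
    float s = 0.0f;
    for (int i = 0; i < n; ++i) s += a[i] * b[i];
    return s;
}

// Explicit intrinsics: packed 4-wide multiplies and adds (mulps/addps)
// by construction, no compiler heuristics involved.
float dot_packed(const float* a, const float* b, int n) {
    __m128 acc = _mm_setzero_ps();
    int i = 0;
    for (; i + 4 <= n; i += 4)
        acc = _mm_add_ps(acc, _mm_mul_ps(_mm_loadu_ps(a + i),
                                         _mm_loadu_ps(b + i)));
    float tmp[4];
    _mm_storeu_ps(tmp, acc);
    float s = tmp[0] + tmp[1] + tmp[2] + tmp[3];
    for (; i < n; ++i) s += a[i] * b[i];  // scalar tail
    return s;
}
```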
 

Reticula

Junior Member
Sep 15, 2010
15
0
0
physxinfo.com
but I'm sticking by the fact that the PhysX 2.0 code base was pretty bad on the CPU.
PhysX 2.0, probably (it was called NovodeX back then), but the 2.8.x branches are not that bad.
Last year, Pierre Terdiman did very detailed CPU performance research on various engine features (rigid bodies, joints, raycasts, etc.) with a number of engines - PhysX 2.8.4, PhysX 3.x and Bullet 2.81:
http://www.codercorner.com/blog/?p=914 (index post)

As you may see, PhysX 2.8.4 is even faster than Bullet in many cases.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
In Metro Last Light (not Redux) I had enormous performance problems with PhysX, and it turned out to be the PhysX setting recommended by the game. Switching that down immediately fixed the FPS and stutter the game had. I ended up turning it off in Borderlands 2 as well, but I think I left it on in Batman. It's one of those technologies that sometimes gives really cool effects and looks, but as everyone has likely established, I really hate stutter, and PhysX, for me at least, seems to tank minimum performance in the games it's installed in. Nvidia seems to have a lower standard for FPS than I do.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
GPU PhysX is good for situations where there is excess GPU power. Optimized particle, smoke, clothing, and hair effects will always run better on GPUs than on CPUs. In games that do not fully utilize all the CPU cores, it's best to offload some of the physics to the CPU cores first. But what's best and what happens are often two different things.
 

HeXen

Diamond Member
Dec 13, 2009
7,837
38
91
No, you can't. It only has two code paths: CUDA and CPU. Any GPU or IGPU would have to use CUDA, and no IGPU does, as CUDA is an Nvidia-only feature.

Of course, Nvidia could recode it to work; they just don't.

Well, wouldn't that be the superior performance option, to use the IGPU for PhysX?
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
In Metro Last Light (not Redux) I had enormous performance problems with PhysX, and it turned out to be the PhysX setting recommended by the game. Switching that down immediately fixed the FPS and stutter the game had. I ended up turning it off in Borderlands 2 as well, but I think I left it on in Batman. It's one of those technologies that sometimes gives really cool effects and looks, but as everyone has likely established, I really hate stutter, and PhysX, for me at least, seems to tank minimum performance in the games it's installed in. Nvidia seems to have a lower standard for FPS than I do.

I think you would be better off disabling SLI if you wanted to run heavy GPU PhysX.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
SLI is terrible with GPU PhysX. You either disable SLI and dedicate a GPU, losing the benefit of SLI, or you leave SLI enabled and GPU usage tanks on your cards. Borderlands 2 is the worst offender for this. The game is pretty lightweight graphically, but if you turn on GPU PhysX it destroys your framerate. If I try running GPU PhysX on my setup I get massive slowdowns in that game. It doesn't add up, given how easy the game is to run with it disabled and how powerful my main gaming rig is.

From my own research, I concluded that the issue with PhysX in BL2 is game-related and not because of PhysX itself. The massive amounts of particles and objects generated as a consequence of having PhysX turned on overwhelm the renderer, which is single-threaded and based on DX9, so the overhead is very high.
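As a back-of-the-envelope illustration of that bottleneck (the per-draw cost below is an assumed ballpark figure of my own, not a measurement), per-draw CPU overhead alone can blow the frame budget once PhysX debris multiplies the draw count:

```cpp
#include <cstdio>

int main() {
    const double perDrawMs   = 0.01;  // assumed CPU cost per DX9 draw call
    const double frameBudget = 16.7;  // ms available per frame at 60 FPS
    // Base scene vs. scenes where PhysX debris piles on extra draws.
    for (int draws : {500, 2000, 8000}) {
        const double cpuMs = draws * perDrawMs;
        std::printf("%5d draws -> %5.1f ms of submission (%s budget)\n",
                    draws, cpuMs, cpuMs > frameBudget ? "over" : "within");
    }
}
```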

For a DX9 Unreal Engine 3 game, BL2 is surprisingly detailed.

I don't think Witcher 3 will run super well just using the CPU for the PhysX effects. If you've seen some of the demos they've shown of the game with the fur effects, it becomes clear you're going to have to use a GPU if you want to enable those effects.

The fur and hair effects use DirectCompute, so those will definitely run on the GPU. Everything else, though, including cloth, destruction, fluid, etcetera, will likely run on the CPU.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
I noticed that the Redux version of Metro 2033 doesn't switch hardware PhysX on even with Advanced PhysX ticked. Nvidia have confirmed to me that they are looking into this for the next drivers.

I'll have to check that out later myself and watch your video. I played some of Metro 2033 Redux and didn't notice anything out of the ordinary.