Changed opinion on hardware-accelerated physics


Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Well, what else would you expect?
Back then, Mafia II had nice GPU cloth physics, but now, say, Lords of the Fallen has much more impressive multi-layered cloth running through CPU PhysX, even on consoles.

I guess that after years and years of PhysX 2.xx running so well on the GPU but poorly on the CPU, I sort of accepted that PhysX would never be able to deliver advanced physics effects on the CPU.

But of course, that's not the case, as PhysX 3.3 admirably demonstrates.
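
To make the "advanced cloth on the CPU" point concrete, here is a minimal sketch of the kind of per-frame work a CPU cloth solver does: Verlet integration plus a distance-constraint relaxation pass. This is purely illustrative, not PhysX or Lords of the Fallen code, and all the names are made up.

```cpp
// Minimal mass-spring cloth step: Verlet integration + one constraint pass.
// Illustrative only; not PhysX code. A real solver would run several
// relaxation iterations per frame and add damping and collision handling.
#include <cmath>
#include <vector>

struct Particle { float x, y, z; float px, py, pz; };  // current + previous position
struct Spring   { int a, b; float rest; };             // distance constraint

void step_cloth(std::vector<Particle>& p, const std::vector<Spring>& springs,
                float dt, float gravity = -9.81f)
{
    // Verlet integration: velocity is implicit in the previous position.
    for (auto& q : p) {
        float nx = 2.0f * q.x - q.px;
        float ny = 2.0f * q.y - q.py + gravity * dt * dt;
        float nz = 2.0f * q.z - q.pz;
        q.px = q.x; q.py = q.y; q.pz = q.z;
        q.x = nx;   q.y = ny;   q.z = nz;
    }
    // Relax each distance constraint, splitting the correction between both ends.
    for (const auto& s : springs) {
        Particle& a = p[s.a];
        Particle& b = p[s.b];
        float dx = b.x - a.x, dy = b.y - a.y, dz = b.z - a.z;
        float len = std::sqrt(dx * dx + dy * dy + dz * dz);
        if (len < 1e-6f) continue;
        float corr = 0.5f * (len - s.rest) / len;
        a.x += dx * corr; a.y += dy * corr; a.z += dz * corr;
        b.x -= dx * corr; b.y -= dy * corr; b.z -= dz * corr;
    }
}
```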

But there is still a lot of room for GPU-accelerated stuff: volumetric smoke simulation using highly detailed grids (> 512x512), not a handful of SPH particles and sprite clouds like in Metro; massive fluid simulation with millions of particles, not thousands like in Borderlands 2; strand-based hair, fur and grass simulation; and so on.

Yep, I said as much in my OP. But with 512-bit vectors coming down the pipeline, I wonder how long it will be before the CPU is able to handle these as well.
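
For a rough sense of what 512-bit vectors buy: with AVX-512, one instruction operates on 16 packed floats, so a structure-of-arrays particle update like the sketch below advances 16 particles per loop iteration. This is just an illustration under stated assumptions (an AVX-512F capable CPU, a particle count that is a multiple of 16), not anything from the PhysX SDK.

```cpp
// Semi-implicit Euler step over particles stored structure-of-arrays.
// With 512-bit registers each iteration integrates 16 particles at once.
// Assumes AVX-512F support (compile with -mavx512f) and n % 16 == 0.
#include <immintrin.h>

void integrate_avx512(float* pos, float* vel, const float* acc,
                      int n, float dt)
{
    const __m512 vdt = _mm512_set1_ps(dt);
    for (int i = 0; i < n; i += 16) {
        __m512 a = _mm512_loadu_ps(acc + i);
        __m512 v = _mm512_loadu_ps(vel + i);
        __m512 p = _mm512_loadu_ps(pos + i);
        v = _mm512_fmadd_ps(a, vdt, v);   // v += a * dt
        p = _mm512_fmadd_ps(v, vdt, p);   // p += v * dt
        _mm512_storeu_ps(vel + i, v);
        _mm512_storeu_ps(pos + i, p);
    }
}
```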
 

BD2003

Lifer
Oct 9, 1999
16,815
1
76
I'm pro-GPU PhysX simply for the fact that it gives us a way to scale performance across multiple GPUs without SLI and all that it entails - microstutter, matched cards, etc.

I personally keep a 750 Ti as a PhysX card alongside my 970, and while it doesn't completely cure the framerate issues, it's always faster than a single card can be on its own. In fact, in my testing it's rarely the PhysX calculations themselves that are the limiting factor, it's the rendering - all the particle interactions might be handled by the dedicated card, but the main card still needs to render them. And even then it's not GPU limited, it's CPU limited...because too many games just aren't threaded well enough.

When I enable high PhysX in AC IV, I can watch my first GPU drop to 80% usage, the PhysX card hover around 50%...and the load on a single core of the CPU lock to 100% while the rest barely break 20%...and the frame rate drops to 50ish. It's that one damn CPU thread, and this is on a 4.6GHz 4790K...so there is literally no modern CPU in the world that can run AC IV locked to 60fps with PhysX on high. Either way, without the PhysX card I'm in the mid 30s with PhysX on high, so there's still a clear benefit, and honestly I doubt that even if the other three cores of the CPU were used to their maximum, they'd be able to match the GPU-accelerated effects.

If the new SDK is better multithreaded, I hope that applies to any CPU assistance of the GPU effects. It's just something I really don't want to see go away - SLI is fraught with so many issues, and load balancing across GPUs without alternate frame rendering is a better idea in just about every way.
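
For what "better multithreaded" could look like in practice: if the effects update is split into independent chunks, it can be fanned out across all the cores instead of pegging one thread. A generic sketch below, assuming a trivially parallel per-particle update; nothing here is PhysX API, and the names are made up.

```cpp
// Fan an independent per-particle update out across all hardware threads,
// instead of running the whole effects pass on a single core.
// Generic sketch; the update callback is a stand-in for the real work.
#include <algorithm>
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

void parallel_update(std::vector<float>& particles,
                     const std::function<void(float&)>& update)
{
    unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    std::size_t chunk = (particles.size() + workers - 1) / workers;
    std::vector<std::thread> pool;

    for (unsigned w = 0; w < workers; ++w) {
        std::size_t begin = w * chunk;
        std::size_t end   = std::min(particles.size(), begin + chunk);
        if (begin >= end) break;
        pool.emplace_back([&particles, &update, begin, end] {
            for (std::size_t i = begin; i < end; ++i)
                update(particles[i]);   // chunks touch disjoint elements, no locking needed
        });
    }
    for (auto& t : pool) t.join();
}
```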
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,971
126
The legacy software PhysX is fine and produces pretty good effects for very little performance hit.

Hardware PhysX OTOH is utter crap and it's the first thing I disable in games and in the control panel. The performance hits are absolutely ridiculous, and in many cases the IQ gain is minuscule to non-existent.

The worst part is the vendor lock where paying nVidia customers endure reduced functionality when competitors are detected in the system. But we're told this is somehow okay because of the "extra testing" required.

So I guess those same people would be okay with Intel and AMD disabling the PCIe slots on their chipsets when nVidia cards are detected? After all, that requires "extra testing" too.
 

Essence_of_War

Platinum Member
Feb 21, 2013
2,650
4
81
Well, wouldn't that be the superior performance option, to use the IGPU for PhysX?
Perhaps from a strict resource-management standpoint. But remember, both AMD and Intel are direct competitors in the graphics market. PhysX is part of nvidia's special sauce that is supposed to make you either choose an nvidia dGPU over an AMD/ATI one, or shell out extra for that gaming notebook with discrete graphics over an IGPU-only model. If nvidia made CPUs, I'd bet dollars to donuts that PhysX would only run on CUDA or on CPUs that returned "Veritable nVidia" or whatever :p
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
Just my GPU Physx experiences:

I run a 750ti for dedicated physx.

The Batman games run great with GPU PhysX, and the effects actually add quite a lot to the experience - more than in any other games I've played.

The original Metro 2033 does work, but with slowdowns. It's a huge hassle to get it working without totally crashing framerates, so turning it off was just the easiest thing to do. I also didn't notice what effects it added, and it seemed to crash framerates even in scenes where it wasn't doing anything.

AC4 was buggy like crazy with PhysX, but a recent patch, which came out almost 9 months after the game's release, lets me run it at the "normal" PhysX setting with minimal drops in performance. In fact, the game overall has seen improvements in performance and framerate stability, only it's taken them months to get there. The problem is that the only effect I noticed was thicker smoke. It looks cool and all, but frankly, big deal - the feature was an afterthought at best. And during ship battles the smoke with PhysX on is a bit excessive and makes the game harder to play.

Overall the Batman games use it best from what I've seen and I'm still waiting for this technology to make bigger differences in immersion.
 

NTMBK

Lifer
Nov 14, 2011
10,240
5,026
136
Someone seriously needs to get physics running on the IGP. Almost every modern gaming build has an OpenCL capable GPU integrated into the processor, sitting idle when it could be accelerating physics calculations.
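
Finding that idle integrated GPU is the easy part; the standard OpenCL host API can enumerate it alongside any discrete cards. The minimal sketch below just lists GPU devices and flags the ones reporting host-unified memory (a rough proxy for "integrated"); the actual physics kernels are left out.

```cpp
// List every OpenCL-capable GPU in the system (discrete and integrated) -
// the kind of device an IGP physics backend would have to discover and pick.
// Error handling trimmed; link against the OpenCL ICD loader.
#define CL_TARGET_OPENCL_VERSION 120
#include <CL/cl.h>
#include <cstdio>
#include <vector>

int main()
{
    cl_uint num_platforms = 0;
    clGetPlatformIDs(0, nullptr, &num_platforms);
    std::vector<cl_platform_id> platforms(num_platforms);
    clGetPlatformIDs(num_platforms, platforms.data(), nullptr);

    for (cl_platform_id plat : platforms) {
        cl_uint num_devices = 0;
        if (clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 0, nullptr, &num_devices) != CL_SUCCESS)
            continue;  // this platform exposes no GPU devices
        std::vector<cl_device_id> devices(num_devices);
        clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, num_devices, devices.data(), nullptr);

        for (cl_device_id dev : devices) {
            char name[256] = {0};
            cl_bool unified = CL_FALSE;  // shared host memory is a rough proxy for "integrated"
            clGetDeviceInfo(dev, CL_DEVICE_NAME, sizeof(name), name, nullptr);
            clGetDeviceInfo(dev, CL_DEVICE_HOST_UNIFIED_MEMORY, sizeof(unified), &unified, nullptr);
            std::printf("GPU: %s (%s)\n", name,
                        unified ? "integrated / shared memory" : "discrete");
        }
    }
    return 0;
}
```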
 

BD2003

Lifer
Oct 9, 1999
16,815
1
76
Just my GPU Physx experiences:

I run a 750ti for dedicated physx.

The Batman games run great with GPU PhysX, and the effects actually add quite a lot to the experience - more than in any other games I've played.

The original Metro 2033 does work, but with slowdowns. It's a huge hassle to get it working without totally crashing framerates, so turning it off was just the easiest thing to do. I also didn't notice what effects it added, and it seemed to crash framerates even in scenes where it wasn't doing anything.

AC4 was buggy like crazy with PhysX, but a recent patch, which came out almost 9 months after the game's release, lets me run it at the "normal" PhysX setting with minimal drops in performance. In fact, the game overall has seen improvements in performance and framerate stability, only it's taken them months to get there. The problem is that the only effect I noticed was thicker smoke. It looks cool and all, but frankly, big deal - the feature was an afterthought at best. And during ship battles the smoke with PhysX on is a bit excessive and makes the game harder to play.

Overall the Batman games use it best from what I've seen and I'm still waiting for this technology to make bigger differences in immersion.

Yeah, the problem has never been the tech, it's been the implementation. Only Borderlands, Metro and Batman have really used it well. Other than that, it's been a tacked-on afterthought, used to such extremes or with so little optimization that it does more harm than good.

In AC IV, literally the only thing that changes is the smoke - except this is smoke you can run through and that casts shadows. As cool as that may be, the engine simply can't handle the increased CPU burden, and they never bothered to take the time to tune the number of particles down to something it could handle. Setting it to medium just reduces the places where they replace the smoke - on medium it's just smoke bombs and guns etc., on high it's like every single chimney...so you can't take a walk through town without some chimney in the far-off distance spiking the CPU and crashing performance. It's absolutely insane, but what can you expect? The only reason it seems to be in there is some co-marketing deal with nvidia - it looks meh and performs terribly.
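
The tuning being complained about here isn't exotic either; even a simple distance-and-budget cap on which emitters get the simulated smoke would avoid the far-off-chimney case. A hypothetical sketch of that idea follows (not anything the game actually does; all names are invented).

```cpp
// Cap how many emitters get the expensive simulated smoke: nearest ones win,
// everything beyond the cutoff or over budget falls back to cheap sprites.
// Hypothetical sketch of the kind of tuning the game could have shipped with.
#include <algorithm>
#include <cstddef>
#include <vector>

struct Emitter { float x, y, z; bool use_simulated_smoke; };

void budget_smoke(std::vector<Emitter>& emitters,
                  float cam_x, float cam_y, float cam_z,
                  float max_distance, std::size_t max_simulated)
{
    auto dist2 = [&](const Emitter& e) {
        float dx = e.x - cam_x, dy = e.y - cam_y, dz = e.z - cam_z;
        return dx * dx + dy * dy + dz * dz;
    };

    // Closest emitters get first claim on the particle budget.
    std::sort(emitters.begin(), emitters.end(),
              [&](const Emitter& a, const Emitter& b) { return dist2(a) < dist2(b); });

    std::size_t granted = 0;
    for (auto& e : emitters) {
        bool close_enough = dist2(e) <= max_distance * max_distance;
        e.use_simulated_smoke = close_enough && granted < max_simulated;
        if (e.use_simulated_smoke) ++granted;
    }
}
```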

The only way most devs will ever use this with care is if the consoles support it, and since they're both AMD....I guess we'll see? GPU accelerated physics is still a good idea, but there are unfortunate market forces preventing it from being as big of a deal as it should be.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Just my GPU Physx experiences:

I run a 750ti for dedicated physx.

The Batman games run great with GPU PhysX, and the effects actually add quite a lot to the experience - more than in any other games I've played.

The original Metro 2033 does work, but with slowdowns. It's a huge hassle to get it working without totally crashing framerates, so turning it off was just the easiest thing to do. I also didn't notice what effects it added, and it seemed to crash framerates even in scenes where it wasn't doing anything.

Batman games do look good with it.

Metro 2033 pretty much used it for real-time physics rather than scripted physics. Since real-time vs. scripted hardly looked different to me, I leave it off.

Sacred 2 wasn't bad, though I wouldn't call it important to use either.
 

HeXen

Diamond Member
Dec 13, 2009
7,831
37
91
Didn't CryPhysics, the physics engine in CryEngine 2, also use the GPU? Shame most games don't have destructible environments like Crysis 1 did; it looked really good. I liked how the trees moved when you tossed a hand grenade.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
In Crysis 3, I know a lot of the physics was done on the CPU. One way we could tell was a particular scene where really long grass maxed out all the cores; performance in that area was tied to CPU performance and scaled with the number of cores you had. I don't know about the earlier Crysis games, but the last one used a lot of CPU physics.