Are PhysX games the only PC games using GPU-driven physics?

futurefields

Diamond Member
Jun 2, 2012
6,470
32
91
It seems like anytime a PC game has GPU-driven physics, it's always the proprietary Nvidia tech. Is that mostly because of the hardware landscape on PC, i.e. all the possible different configurations? It would be too time-consuming to code GPU physics that can run on AMD's stream processors and then also create a version that runs on PhysX. I can see why developers wouldn't have the budget to spend time on both.

Are there any games that use PhysX-style physics but run on AMD's stream processors?

Maybe the new wave of consoles will usher this in?
 
 

Anteaus

Platinum Member
Oct 28, 2010
2,448
4
81
Back when PhysX was independent and selling dedicated cards, Nvidia realized they could corner the physics market by buying PhysX and integrating it into their hardware, and that's exactly what they did. The fact that you are curious about this only reinforces that it was a good business move by Nvidia. Unless AMD licenses PhysX from Nvidia, it's unlikely you will see hardware physics on any AMD GPU unless they do their own version.

I think AMD initially saw PhysX as a fad and didn't think it would become relevant in mainstream gaming.
 

Anteaus

Platinum Member
Oct 28, 2010
2,448
4
81
And they were mostly right.

Mostly right from a technical standpoint, that is. From a marketing perspective, it is highly relevant. I personally would never buy a GPU based on whether it has PhysX, but it's yet another thing they can put on the box to lure in less knowledgeable customers.
 

futurefields

Diamond Member
Jun 2, 2012
6,470
32
91
Unless AMD licenses PhysX from Nvidia, it's unlikely you will see hardware physics on any AMD GPU unless they do their own version.

Won't they have to, with the new consoles? How else are they going to do the physics? I don't think the CPUs are up to the task of pushing advanced physics engines. Unless they are planning on not actually advancing physics much and sticking to just "eye candy" (that would be a shame, as IMO physics provides a better sense of immersion in a game world than eye candy does, but I digress...)
 

Anteaus

Platinum Member
Oct 28, 2010
2,448
4
81
Won't they have to, with the new consoles? How else are they going to do the physics? I don't think the CPUs are up to the task of pushing advanced physics engines. Unless they are planning on not actually advancing physics much and sticking to just "eye candy" (that would be a shame, as IMO physics provides a better sense of immersion in a game world than eye candy does, but I digress...)

No idea. I do know that the XB1 and PS4 use Havok for physics processing, which is middleware that I believe uses the GPU, but it isn't the same thing as a driver-level dedicated solution such as PhysX.

http://en.wikipedia.org/wiki/Havok_(software)

http://www.havok.com/customer-projects/games
 

Elixer

Lifer
May 7, 2002
10,371
762
126
PhysX has been butchered by Nvidia for their own needs. Unless the developer in question is getting paid by Nvidia, they won't use it.

There are much better, platform-neutral physics engines out there, like Havok, Bullet, Vortex (and more), and they run on pretty much anything.
Some even detect the GPU and run OpenCL so the physics can be GPU-driven.

None of this plays well with Nvidia, since they only want to support PhysX.
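
For what it's worth, the reason a physics step maps so well to OpenCL is that each body's update is independent. Here's a rough, runnable sketch (plain Python standing in for what a GPU kernel would compute per work-item; the function name and layout are made up for illustration, not any engine's real API):

```python
# Illustrative only: the kind of data-parallel update a GPU physics
# kernel performs. On the GPU, each index i would be one work-item.

def integrate_step(positions, velocities, gravity, dt):
    """Semi-implicit Euler step; every body updates independently."""
    for i in range(len(positions)):  # this loop is what the GPU parallelizes
        vx, vy = velocities[i]
        vy += gravity * dt                       # apply gravity to velocity
        velocities[i] = (vx, vy)
        px, py = positions[i]
        positions[i] = (px + vx * dt, py + vy * dt)  # advance position
    return positions, velocities

# One body at (0, 10) moving right at 1 unit/s, one 0.1s step:
pos, vel = [(0.0, 10.0)], [(1.0, 0.0)]
pos, vel = integrate_step(pos, vel, gravity=-9.8, dt=0.1)
```

Since no body's result depends on another's in this step, an OpenCL runtime can hand each body to a separate work-item, which is what makes it "GPU driven".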
 

futurefields

Diamond Member
Jun 2, 2012
6,470
32
91
PhysX has been butchered by Nvidia for their own needs. Unless the developer in question is getting paid by Nvidia, they won't use it.

There are much better, platform-neutral physics engines out there, like Havok, Bullet, Vortex (and more), and they run on pretty much anything.
Some even detect the GPU and run OpenCL so the physics can be GPU-driven.

None of this plays well with Nvidia, since they only want to support PhysX.


What games do this?
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
It's not always about getting paid. There's a good amount of support you get from the partnerships like TWIMTBP.

I remember way back when Half-Life 2 was being hyped by ATI, and they claimed it could do hardware-accelerated physics. It was even printed on the box for their video card. The card never did hardware-accelerated physics.
 
Last edited:

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
I'm not sure if any games use hardware-accelerated physics unless they're using Nvidia PhysX.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,786
789
136
Didn't Intel kill Havok's GPU-accelerated physics when they bought Havok? If it's running on the AMD GPU inside the consoles, then it shouldn't be too hard for them to do the same on PCs.
 

Elixer

Lifer
May 7, 2002
10,371
762
126
Didn't Intel kill Havok's GPU-accelerated physics when they bought Havok? If it's running on the AMD GPU inside the consoles, then it shouldn't be too hard for them to do the same on PCs.

Intel fully supports OpenCL these days...
 

mmntech

Lifer
Sep 20, 2007
17,501
12
0
I wonder how well those old PPUs still work with modern games.

I'm pretty sure PhysX is the only hardware-accelerated engine, unless Havok is using OpenCL. Though some games do support a CPU-based PhysX engine, which is what the consoles use. The problem with it being a proprietary technology is that most gaming systems (I'm including consoles) don't use Nvidia GPUs.
 

Anteaus

Platinum Member
Oct 28, 2010
2,448
4
81
I wonder how well those old PPUs still work with modern games.

I'm pretty sure PhysX is the only hardware-accelerated engine, unless Havok is using OpenCL. Though some games do support a CPU-based PhysX engine, which is what the consoles use. The problem with it being a proprietary technology is that most gaming systems (I'm including consoles) don't use Nvidia GPUs.

Yep. Solutions like Havok are openly described as middleware. They do use the GPU for processing physics, but it's not low-level like PhysX. It's basically the same software physics processing they would do on the CPU, except the software executes it on the GPU, thus freeing up the CPU for other things. The downside is that it isn't as efficient as the PhysX implementation, and it robs the GPU of power that it could be using for graphics. Whether the performance difference is noticeable is beyond my knowledge.

Havok's solution divides the physics simulation into effect and gameplay physics, with effect physics being offloaded (if possible) to the GPU as Shader Model 3.0 instructions and gameplay physics being processed on the CPU as normal. The important distinction between the two is that effect physics do not affect gameplay (dust or small debris from an explosion, for example); the vast majority of physics operations are still performed in software. This approach differs significantly from the PhysX SDK, which moves all calculations to the PhysX card if it is present.

http://en.wikipedia.org/wiki/Physics_processing_unit
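
The effect-vs-gameplay split described above can be sketched roughly like this (hypothetical names for illustration, not Havok's actual API): anything that can influence gameplay stays on the CPU path, while cosmetic bodies go into a pool that can be offloaded to the GPU, or dropped entirely on weak hardware:

```python
# Hypothetical sketch of splitting a scene into "effect" physics
# (cosmetic, offloadable to GPU, safe to drop) and "gameplay"
# physics (must run on the CPU because results affect the game).

def partition_bodies(bodies):
    effect, gameplay = [], []
    for body in bodies:
        # cosmetic bodies never feed back into game state
        (effect if body["cosmetic"] else gameplay).append(body)
    return effect, gameplay

bodies = [
    {"name": "crate",  "cosmetic": False},  # player can stand on it
    {"name": "debris", "cosmetic": True},   # explosion eye candy
]
effect, gameplay = partition_bodies(bodies)
```

The key design point is the one the quote makes: because the effect pool can't change game state, the engine is free to simulate it anywhere (GPU, lower tick rate, or not at all) without desyncing the game.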
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
PhysX has been butchered by Nvidia for their own needs. Unless the developer in question is getting paid by Nvidia, they won't use it.

What a load of nonsense :rolleyes: PhysX is far better now than it ever was when it was owned by NovodeX or Ageia.

And provide a source to back up your assertion that Nvidia pays developers to use PhysX. That's the most ridiculous thing I've ever heard!

There are much better, platform-neutral physics engines out there, like Havok, Bullet, Vortex (and more), and they run on pretty much anything.

Funny you should say this, because PhysX pretty much runs on anything as well. And if PhysX was so crappy, why did CDPR (the developer of the Witcher games) drop Havok in favor of PhysX for The Witcher 3?

Some even detect the GPU and run OpenCL so the physics can be GPU-driven.

This is news to me. As far as I know, PhysX is the only physics middleware solution that uses the GPU. Havok and Bullet only use the CPU and do not run on OpenCL.

None of this plays well with Nvidia, since they only want to support PhysX.

More nonsense. Parts of PhysX use DirectCompute to run, and Nvidia is in the process of porting the entire PhysX library over to DirectCompute.
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
Didn't Intel kill Havok GPU Accelerated Physics when they bought Havok? If it's running on the AMD GPU inside the consoles then it shouldn't be too hard for them to do so on PC's.

Yeah, that's the other side to it. Intel may not have completely killed GPU physics with Havok, but they certainly took the focus away from it. The one-two punch of Nvidia making GPU-accelerated PhysX hardware-exclusive and Intel making Havok prioritize CPU physics way more than GPU physics ensured that GPU-accelerated physics would not become a workable open standard. Developers won't use it to improve actual gameplay; it's just a fancy effect that one can take or leave with no consequence. It's a shame, because GPU-driven physics in game environments could really be used to make interesting mechanics.
 

BD2003

Lifer
Oct 9, 1999
16,815
1
81
I spoke to a bunch of game devs about this, and it was pretty much unanimous - there isn't a critical mass of OpenCL-capable cards/CPUs powerful enough for them to bother moving physics to the GPU (or to do much of anything worthwhile with GPGPU yet). And OpenCL still isn't where they want it to be anyway.

So on the one hand, the proprietary nature of PhysX is slowing it down, but on the other hand... if not for PhysX, it probably wouldn't exist at all.
 

mmntech

Lifer
Sep 20, 2007
17,501
12
0
I spoke to a bunch of game devs about this, and it was pretty much unanimous - there isn't a critical mass of OpenCL-capable cards/CPUs powerful enough for them to bother moving physics to the GPU (or to do much of anything worthwhile with GPGPU yet). And OpenCL still isn't where they want it to be anyway.

So on the one hand, the proprietary nature of PhysX is slowing it down, but on the other hand... if not for PhysX, it probably wouldn't exist at all.

The nature of OpenCL is very apparent to anybody who's tried using it for actual work. I have yet to come across a GPGPU video encoder that doesn't suck. Doesn't matter if it's paid or free. That's despite the fact that companies like Apple were really pushing it. I remember OpenCL support being a big feature touted in OS X Snow Leopard. It just never went anywhere.
 

BD2003

Lifer
Oct 9, 1999
16,815
1
81
So physics will probably not be advancing too much with the next console generation?


They already are (Resogun on PS4), but it's not going to be widespread in multiplatform/PC games until the 360/PS3 are no longer relevant and the tools/APIs have matured. It'll be a few years.
 

BD2003

Lifer
Oct 9, 1999
16,815
1
81
The nature of OpenCL is very apparent to anybody who's tried using it for actual work. I have yet to come across a GPGPU video encoder that doesn't suck. Doesn't matter if it's paid or free. That's despite the fact that companies like Apple were really pushing it. I remember OpenCL support being a big feature touted in OS X Snow Leopard. It just never went anywhere.


Yeah, but like a lot of these things, it takes someone to push it in its early days, or it'll never go anywhere. It's a huge change that requires hardware support and a whole new way of doing things. It took years for devs to really take advantage of GPU shaders - it wasn't until DX9 and unified shaders, combined with a critical mass of Xbox 360s, that we saw games require them.