How much of a role will PhysX play?


Dark4ng3l

Diamond Member
Sep 17, 2000
5,061
1
0
Plenty of games are still coming out with PhysX like Dark Void and Metro 2033.

I think it will be a very prominent feature in games from now on.

Not in any games that actually use physics as part of the gameplay experience, like BFBC2.

Keep up the good work. You NVIDIA drones love to make my job easier.
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,552
136
Plenty of games are still coming out with PhysX like Dark Void and Metro 2033.

I think it will be a very prominent feature in games from now on.

I'm pretty sure all the developers are weeping that they can't implement cool fluttering paper, smoke, and other particle effects in games like they can with PhysX, which works only with nVidia cards. They'll just have to make do with similar particle effects that work on all systems by using Havok.

Oh... and isn't it awesome how Havok is used in game-changing ways, such as destructible environments? I'm totally excited at the prospect of how Havok will be used in the future to help enhance gameplay rather than add some visual fluff that some don't even notice.


EDIT:
The forum seems not to support strikethrough text. I originally put PhysX in strikethrough before Havok in the second paragraph.
 

Phil1977

Senior member
Dec 8, 2009
228
0
0
Ever since I saw Battlefield: Bad Company 2, I have no hope for PhysX.

PhysX is candy sprinkled over the top. The physics in Bad Company 2 are part of the engine and part of the gameplay.

I LOLed when I saw PhysX in Mirror's Edge. Glass breaking? Paper and dust blown away from a chopper? That's it?

Man, Bad Company 2 is full of physics effects. In the desert level there are dust, smoke, paper and other bits flying around, plus fully destructible buildings, fences, walls and more.

PhysX will remain a marketing tool, but the real physics breakthrough will come through game engines and run on the CPU.

Even Crysis (several years old now) has better physics effects than any PhysX game.
 

crazylegs

Senior member
Sep 30, 2005
779
0
71
Ever since I saw Battlefield: Bad Company 2, I have no hope for PhysX.

PhysX is candy sprinkled over the top. The physics in Bad Company 2 are part of the engine and part of the gameplay.

I LOLed when I saw PhysX in Mirror's Edge. Glass breaking? Paper and dust blown away from a chopper? That's it?

Man, Bad Company 2 is full of physics effects. In the desert level there are dust, smoke, paper and other bits flying around, plus fully destructible buildings, fences, walls and more.

PhysX will remain a marketing tool, but the real physics breakthrough will come through game engines and run on the CPU.

Even Crysis (several years old now) has better physics effects than any PhysX game.

QFT.

BFBC2 is an awesome multiplayer game that uses physics effects to (this is a really important point for all you NV fanbois) ACTUALLY ADD TO GAMEPLAY :O :O :O

and... get this... IT'S NOT VENDOR-SPECIFIC :O :O :O

Who would have thought of doing something so utterly crazy in a computer game?

I hope PhysX dies a quick, painless death, along with any other closed system. I'm much happier with an open standard that actually works :)
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
The funny thing about PhysX that many overlook is that it's actually quite a suitable middleware for physics calculations on all games and platforms. A good example is Borderlands for PC on ATI hardware (looks great); another is PhysX titles on consoles without Nvidia GPU compute capability. Both of these scenarios are possible and work perfectly fine. Therefore PhysX is still a good competitor to Havok, Bullet, and any game creator's proprietary in-game physics system, in the sense of being a universal software physics platform.

PhysX is Nvidia property and will always give a slant towards its own IHV. The real problem comes with games like Cryostasis, Batman: AA, Dark Void, and Metro 2033, when the non-Nvidia user wants GPU-calculated physics. In that case PhysX automatically commands the CPU to perform these calculations on a single core in an Nvidia-designed compute path. Think about that for a second.
The end result could be fog covering a floor in the Batman: AA intro, and average frame rates plummeting from 170 fps to a maximum of 21 fps (ref. the ComputerBase.de GTX 480 review).

The controversy lies in Nvidia PhysX middleware instructions executed on the GPU, in which case Nvidia/Ageia is the only proper solution. Luckily for the masses, this is only possible in a handful of 5 or 6 popular PC titles. Nvidia GPUs will take a slight performance hit enabling GPU-rendered PhysX eye candy, and the end result is a playable game with awesome effects. However, any other platform attempting this without the required 8800-series or greater Nvidia GPU will underperform greatly.

That's the good and the bad. The ugly is when Nvidia or a developer decides exactly which effects in a game will fall into which category: software PhysX, advanced PhysX, or GPU-only PhysX. Think TWIMTBP titles here. You can rest assured that Nvidia wants the most beautiful and obvious effects in the GPU-only category (regardless of whether or not they could be rendered any other way). Just use your imagination to determine where things go from there.

Personally, I'm all for PhysX, Havok, Bullet, or any other middleware standard that brings awesome physics calculations to all PC games. In the end, Nvidia wants PhysX to be this champion platform. On the downside, having a proprietary system emerge as the clear winner is going to be a drawback for every one of us in the long run.
 

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
PhysX has been a mess since the start, and when Nvidia acquired Ageia things got worse.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Ever since I saw Battlefield: Bad Company 2, I have no hope for PhysX.

PhysX is candy sprinkled over the top. The physics in Bad Company 2 are part of the engine and part of the gameplay.

I LOLed when I saw PhysX in Mirror's Edge. Glass breaking? Paper and dust blown away from a chopper? That's it?

Man, Bad Company 2 is full of physics effects. In the desert level there are dust, smoke, paper and other bits flying around, plus fully destructible buildings, fences, walls and more.

PhysX will remain a marketing tool, but the real physics breakthrough will come through game engines and run on the CPU.

Even Crysis (several years old now) has better physics effects than any PhysX game.

Bad Company 2 is the pinnacle of scripted physics, but realistic it ain't. Buildings break in pre-defined ways, and smoke and dust aren't affected by what's going on around them. It's a bit like one of those Lego games, only with less obvious blocks.

The real hardware physics breakthrough will be when the PS4 and Xbox 720 support it via OpenCL/DirectCompute. Will that hardware physics engine be PhysX? Who knows; I am sure they will port it to those platforms pretty quickly when they arrive.

Anyway, until then I agree hardware PhysX is just a nice extra for Nvidia users (hence, yes, it is a marketing tool). Obviously you don't want to get that mixed up with software PhysX, which is the physics engine in a lot of games.
 

tweakboy

Diamond Member
Jan 3, 2010
9,517
2
81
www.hammiestudios.com
Not a big role at all if your CPU is adequate. If your CPU is bottlenecking then you need a PhysX card. If your CPU is super fast there's no need for one; the CPU will handle the physics no problem... thx :eek:
 

waffleironhead

Diamond Member
Aug 10, 2005
7,122
622
136
Not a big role at all if your CPU is adequate. If your CPU is bottlenecking then you need a PhysX card. If your CPU is super fast there's no need for one; the CPU will handle the physics no problem... thx :eek:

Who is going to be running a weak CPU on a dual-GPU motherboard? At this point, AFAIK you need a second GPU to really show off PhysX eye candy. So it comes down to a new CPU, or an SLI board and a new GPU.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Not a big role at all if your CPU is adequate. If your CPU is bottlenecking then you need a PhysX card. If your CPU is super fast there's no need for one; the CPU will handle the physics no problem... thx :eek:
Do you know anything about PhysX, though? If a game is using full hardware-accelerated PhysX that is designed to run on the GPU, then NO CPU can handle that smoothly. The only PhysX that can be run on the CPU is the low-level effects that were made to run on the CPU.

Now, I am not saying that those effects could not be done with physics in general on a CPU, but the way Nvidia does its full hardware-accelerated PhysX can only be done on the GPU. It's clear that Nvidia just gimps it where it can't be done on a CPU, because most of the hardware-accelerated PhysX effects are nothing special.
 
Last edited:

ScorcherDarkly

Senior member
Aug 7, 2009
450
0
0
The low(ish) number of games with PhysX effects (especially good ones) and the low impact of PhysX on gameplay are what keep it from being a must-buy feature for me. At least they were before Nvidia started pulling monkey business with it; that pissed me off to the point that I won't buy it until they clean up their act. Blocking PhysX from working when ATi hardware is detected in the system is dirty, and frankly stupid. If I could buy a dedicated PhysX card to run in concert with my main ATi-branded GPU, I certainly would. We know this kind of setup is possible, as people have hacked it to work. It just requires nVidia to get rid of their vendor block.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
PhysX is mainly a marketing bullet point for Nvidia right now. It gives them something to talk about when they are 7 months late to market and getting their a$$ pounded by AMD. When the next gen releases and Nvidia is dominant again, they'll forget all about PhysX and/or at least open it up.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
First, PhysX does run on the CPU, meaning that ATI users can also enjoy things written in PhysX.

Second, no games use Bullet; they use either PhysX and/or Havok. The fact that ATI video cards aren't designed to run PhysX, but Bullet, isn't Nvidia's fault; ATI decided not to implement PhysX support on their hardware.

Third, people tend to believe that the CPU handles physics fine, but that depends on how much physics is involved. PhysX, if implemented properly, can utilize all available cores, i.e. 8 threads on an i7, but it won't necessarily execute effectively on 4 cores. It is best to assume there are 2 cores, or 4 for future games. Of course, code that detects the available cores and dynamically adjusts the number of threads is ideal (see the sketch below), but physics really isn't the only thing happening and such code is extremely hard to write.
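For what it's worth, the core-detection part isn't exotic. Here's a minimal sketch of the idea in C++, assuming the PhysX 3.x-style SDK names (PxDefaultCpuDispatcherCreate and friends) and C++11's std::thread::hardware_concurrency; the 2.x SDK that today's games ship with exposes the same concept through different calls, so treat this as an illustration rather than production code:

Code:
// Sketch only: size the PhysX CPU worker pool to the machine instead of
// hard-coding 2 or 4 threads. Names follow the PhysX 3.x-style API.
#include <algorithm>
#include <thread>
#include <PxPhysicsAPI.h>   // NVIDIA PhysX SDK umbrella header

using namespace physx;

PxDefaultCpuDispatcher* createSizedDispatcher()
{
    // Ask the OS how many hardware threads exist (0 means "unknown").
    unsigned hw = std::thread::hardware_concurrency();
    if (hw == 0) hw = 2;                      // conservative fallback

    // Leave one core free for the rest of the game loop.
    unsigned workers = std::max(1u, hw - 1);

    return PxDefaultCpuDispatcherCreate(workers);
}

You would then hand that dispatcher to the scene description (the cpuDispatcher field in the 3.x scene descriptor), so the same game scales from a dual-core up to an 8-thread i7 without recompiling.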

PhysX will grow, as it has, simply because its performance can be increased with a better video card while it still works without an Nvidia card present. The part of PhysX that affects gameplay is open to any video card, since it runs off the CPU, and people somehow don't realize it. The only part that Nvidia users get and others don't is the extra flying paper and smoke.

So Nvidia users have flying paper and ATI users don't; so what? Nothing, it doesn't affect gameplay. That doesn't mean PhysX is useless. In fact, PhysX is more powerful than Havok because it can utilize not only the CPU but Nvidia GPUs too. The question is: with DirectCompute, will MS favor Havok, allowing it to benefit from DirectCompute and thus also utilize the GPU? Maybe, but not yet.

Look at it this way. We now have DirectCompute; will someone be smart enough to use it to recreate a set of APIs that does what PhysX does? It is possible. MS will probably do it and call it DirectPhysics. However, the downside is that it would only work on Windows, proprietary to MS. Will someone be smart enough to build a set of APIs under OpenCL that does what PhysX does? Yes, but not anytime soon, as there are too many unknown bottlenecks in that case.
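To be clear about why DirectCompute or OpenCL is even a candidate here: the effects-style physics being argued about (debris, paper, smoke particles) is embarrassingly parallel, because every particle updates independently. Below is a deliberately tiny, made-up CPU-only sketch of that kind of update; a DirectCompute or OpenCL version would simply run each loop iteration as its own GPU thread. Nothing here comes from any real physics SDK:

Code:
// Toy particle update: the sort of data-parallel kernel a GPU compute
// physics layer would run with one thread per particle.
#include <cstdio>
#include <vector>

struct Particle { float x, y, z, vx, vy, vz; };

void step(std::vector<Particle>& ps, float dt)
{
    const float g = -9.81f;                  // gravity on the y axis
    for (auto& p : ps) {                     // GPU version: one thread per particle
        p.vy += g * dt;
        p.x  += p.vx * dt;
        p.y  += p.vy * dt;
        p.z  += p.vz * dt;
        if (p.y < 0.0f) { p.y = 0.0f; p.vy *= -0.5f; }   // cheap bounce off the floor
    }
}

int main()
{
    std::vector<Particle> debris(10000, {0.0f, 5.0f, 0.0f, 1.0f, 0.0f, 0.0f});
    for (int frame = 0; frame < 60; ++frame)
        step(debris, 1.0f / 60.0f);          // simulate one second at 60 fps
    std::printf("sample particle: x=%.2f y=%.2f\n", debris[0].x, debris[0].y);
    return 0;
}

None of the gameplay-affecting logic has to change for that; it's only this kind of bulk per-particle math that benefits from being moved onto the GPU.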

People tend to say that PhysX will not grow because it is proprietary to Nvidia, yet those people seem to forget that the "hardware tessellator" was proprietary to ATI, and yet tessellation became a standard in DirectX 11 as if MS created it. They don't have a problem with DirectX 11 being proprietary to MS, and they never question why it doesn't work on Unix-based OSes or the Mac. Some of these people believe that the Mac can't play games because the Mac sucks, without having any sense of MS's proprietary trick. However, MS is not to be blamed, as Apple could build their own APIs that do whatever DX does, but they didn't. Instead, Mac users have to rely on OpenGL.

So, is DirectX dying to OpenGL/CL? Will proprietary stuff survive?
 
Last edited:

T2k

Golden Member
Feb 24, 2004
1,665
5
81
Seero,

Bullet isn't the one to watch: the STALKER games use their proprietary X-Ray engine, with physics based on a heavily modified ODE physics engine. These games are awesome and sport a very advanced, highly detailed environment, including bullet and hit calculations, weapon degradation, day/night and weather cycles, anomalies, radiation, etc. Did I mention ODE is free?

PhysX vs Havok: it's safe to say that if you are a smaller or lamer studio you will go with PhysX. Why? Because NV will spare no expense helping you for free as long as you sign up for a TWIMTBP logo... this also means PhysX's list has a higher rate of shitty games than Havok's, pure and simple.

DX11/proprietary/tessellation etc.: the way things make their way into DirectX is not MS-only, but rather a spec list MS proposes to the IHVs based on their (IHV) input. FYI, tessellation has never been an ATI proprietary feature; that's nonsense. ATI introduced its own proprietary implementation of tessellation under the moniker Truform back with the R8500, and it was great, but for some reason they never pushed it hard enough.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
Seero,

Bullet isn't the one to watch: the STALKER games use their proprietary X-Ray engine, with physics based on a heavily modified ODE physics engine. These games are awesome and sport a very advanced, highly detailed environment, including bullet and hit calculations, weapon degradation, day/night and weather cycles, anomalies, radiation, etc. Did I mention ODE is free?

PhysX vs Havok: it's safe to say that if you are a smaller or lamer studio you will go with PhysX. Why? Because NV will spare no expense helping you for free as long as you sign up for a TWIMTBP logo... this also means PhysX's list has a higher rate of shitty games than Havok's, pure and simple.
Yes, you got the idea. TWIMTBP helps smaller and lamer studios compete with the big boys. What is the problem?

DX11/proprietary/tessellation etc.: the way things make their way into DirectX is not MS-only, but rather a spec list MS proposes to the IHVs based on their (IHV) input. FYI, tessellation has never been an ATI proprietary feature; that's nonsense. ATI introduced its own proprietary implementation of tessellation under the moniker Truform back with the R8500, and it was great, but for some reason they never pushed it hard enough.
Simple question: can DirectX 11 run on a Mac and/or a Unix-based OS?

Please reread my post on the tessellation part.
 

T2k

Golden Member
Feb 24, 2004
1,665
5
81
Yes, you got the idea. TWIMTBP helps smaller and lamer studios compete with the big boys. What is the problem?

Nothing with that part. My point was that games with Havok tend to have higher quality, that's all.

Simple question: can DirectX 11 run on a Mac and/or a Unix-based OS?

Simple answer: who cares?

Please reread my post on the tessellation part.

I did, and I can't see any changes; it's still wrong about ATI and tessellation, about MS-forced DirectX, etc.
 

konakona

Diamond Member
May 6, 2004
6,285
1
0
Let's be realistic here: there is a very significant number of ATi GPUs out there compared to NV's. Unfortunately, the same can't be said about the comparison between Windows gamers and all other OSes, which are a truly trivial minority when it comes to gaming. I agree you can't hold the majority responsible for having something that works, but it's not like MS is trying to shove DirectX down developers' throats. They just have to develop games in DirectX, since that's where the overwhelming majority games and where the money is. I am very doubtful the PhysX situation is remotely analogous to that.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
Nothing with that part. My point was that games with Havok tend to have higher quality, that's all.
I fail to see any examples where there is a direct comparison between PhysX and Havok in any game. I don't know how you came to the conclusion that Havok tends to have higher quality.

Simple answer: who cares?
If that is the answer, then who cares if PhysX is proprietary to Nvidia?

I did, and I can't see any changes; it's still wrong about ATI and tessellation, about MS-forced DirectX, etc.
ATI has a set of predefined APIs for its tessellation (Truform). Should Nvidia implement Truform, it would have to support all the APIs ATI defines. DirectX has a new set of APIs for tessellation, and in this case both vendors can find a way to support those APIs. The bottom line is, when we say tessellation in a game, it really means DX11 tessellation, not its ancestor, Truform.

The trick here is that ATI uses the "hardware tessellator" to support these APIs, but Nvidia can't do the same thing, as the hardware tessellator is patented by ATI. However, the new GPGPU architecture makes it possible to compute tessellation without having an actual tessellation unit.

Did Truform disappear? I don't know. ATI may have supported DX11's tessellation through Truform's APIs. The fact that there is still a tessellator unit on board indicates that parts of Truform remain.

Now, would Nvidia support tessellation without DX11? Probably not any time soon, because they would either have to use Truform's APIs or create a new set. In short, there would have been two sets of APIs doing the same thing depending on the vendor, which was what DX9 was about.

Go back to PhysX. It is just a name for a set of APIs that do physics calculations. Unlike Truform, PhysX is much more widely used. MS could do what it did with tessellation: simply define a new set of APIs that do physics and have the video card vendors support them. There is a simple reason why MS hasn't done so, but the claim remains true.

Creating a standard API for all vendors is hard work. It costs a lot of resources. Besides, it would be a waste of resources if no one used the new APIs. If you won't spend 100 bucks to support PhysX, an existing set of APIs, why would MS spend millions of dollars to do so, knowing that it would hurt CPU sales?
 

JACKDRUID

Senior member
Nov 28, 2007
729
0
0
If Nvidia was smart, they'd quit blocking ATI users from using an Nvidia card for PhysX-only operation. They'd sell more cards and control the physics market in no time. Instead they got greedy and made a play for total video/physics domination, but that play has been stagnant, and alternative open physics standards are catching up and will become the physics standard.

Fail, Nvidia.

You are completely wrong... it was ATI who was blocking people from using PhysX hacks on ATI cards.
 

JACKDRUID

Senior member
Nov 28, 2007
729
0
0
Nothing with that part. My point was that games with Havok tend to have higher quality, that's all.

Absolutely wrong and lame response... typical lame fanboy mentality/response...

Havok is software-based; it's not going to beat PhysX, which is hardware-based. Not in a million years.

The truth is, it's absolutely a shame that ATI doesn't adopt PhysX... standardization takes time and support from the industry as a whole (ATI + Nvidia). It's this kind of fanboi mentality you just displayed, the "ATI is ALWAYS right" crap, that made it "ok" for ATI not to adopt it, and to go as far as releasing drivers to "block" people's efforts to hack the technology into working.

Had they adopted PhysX and standardized it, the industry would be much better off today. We would have games with much better physics and more interactive graphics, on the SAME hardware we are using now.

But instead of pushing a great technology, people such as you blindly agreed with lame ATI not adopting PhysX, and here we are buying more and more CPU cores to run physics in software, getting physics that isn't half as good.
 

zerocool84

Lifer
Nov 11, 2004
36,041
472
126
You are completely wrong... it was ATI who was blocking people from using PhysX hacks on ATI cards.

So it was ATI who made the Nvidia drivers that block use of an Nvidia GPU when an ATI card is present??? Thanks for clearing that up.

It's Nvidia's fault that ATI people have to do some crazy workarounds just to have it.

As has been said many times before, PhysX is good on paper but fails in operation. They need to open it up to get more companies to adopt it. If they don't, it will stay exactly where it is now.
 

lifeblood

Senior member
Oct 17, 2001
999
88
91
I find that hard to believe. Nvidia has on several occasions publicly asked ATI to sign on.

Search for the bit-tech.net interview with Richard Huddy from AMD Developer Relations. Maybe he is lying, but he claims Nvidia refused to license it to them (ATI).