Physx - Are you interested in it? Have your say! VOTE!


Physx - rate the importance if you care or not

  • Physx - what's that?

  • Physx - no thanks! (Unimpressed)

  • Physx - neutral

  • Physx - nice extra if price / performance lines up.

  • Physx - factors in the decision

  • Physx - must have! (Diehard fan)



Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
AMD is free to come up with their own GPU physics. They advertised that feature on old cards, the 2900 XT I believe. It never materialized. Someone else, a third party, could also come up with their own GPU physics API for developers using OpenCL or something. That hasn't happened.

Until then we have Nvidia trying to bring extra value to their customers and no real competition for it.

AMD has TressFX, which is not artificially locked. Does that mean it is not AMD's?
On top of that, they help develop all these Gaming Evolved titles.

How is artificially disabling features for users of competitors' hardware adding extra value?
 

Tohtori

Member
Aug 27, 2013
51
2
36
I am all for physics that impacts the game, like Red Faction, but that can't happen with GPU-accelerated PhysX, and therefore I don't care.
Also, I find most of the effects I see with PhysX too flamboyant, and they reduce my immersion in games.

But I think PhysX has potential if things change and PhysX can be accelerated by all GPUs.

This! Nvidia releasing PhysX would do all kinds of good for the gaming industry.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
I think AMD is betting that other physics software (Havok) will make GPU physics work on both brands and largely solve the problem.

I like PhysX when it's there; it's a factor, but a minimal one, when choosing a GPU. But I do wish for this evolutionary stage to turn into a proper standard and a competitive landscape soon.
 

Jaskalas

Lifer
Jun 23, 2004
35,847
10,155
136
Some kind of proprietary physics engine for NVIDIA hardware? I don't know a thing about it. I mean, if you want to poll whether I like physics in game engines, then sure, love it, fine with it. It's a little icing on the cake.

Half Life 2 was a good demonstration of physics, right?

I've no idea why I should be concerned with some hardware vendor's idea of how to achieve similar results.
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
Not interested in PhysX as long as it only works on Nvidia hardware. I'm not going to buy an Nvidia graphics card if AMD has one that is faster for the same price or cheaper for the same performance.

There's also all of the sneaky underhanded stuff Nvidia does such as blocking PhysX from working if I have an AMD video card for primary graphics and buy an Nvidia card just for PhysX. The fact that Nvidia even does that makes me less eager to support them.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
AMD has TressFX, which is not artificially locked. Does that mean it is not AMD's?
On top of that, they help develop all these Gaming Evolved titles.

How is artificially disabling features for users of competitors' hardware adding extra value?

That doesn't matter. You ignored my entire post. AMD can make their own GPU Physics API. They don't or won't. It's not Nvidia's problem.

Not interested in PhysX as long as it only works on Nvidia hardware. I'm not going to buy an Nvidia graphics card if AMD has one that is faster for the same price or cheaper for the same performance.

There's also all of the sneaky underhanded stuff Nvidia does such as blocking PhysX from working if I have an AMD video card for primary graphics and buy an Nvidia card just for PhysX. The fact that Nvidia even does that makes me less eager to support them.

Nvidia owns PhysX; they can do whatever the hell they want with it. Everyone hates success these days; it's the cool thing to do.

You guys act like you wouldn't make the same business decision in their position.

It entices their customers with unique features you can't get elsewhere. It also entices developers; that's why The Witcher 3 and other titles keep coming out with PhysX usage in their games. Stop blaming Nvidia and blame AMD for not competing in this area.

We have had numerous threads about this, and everyone says PhysX is dead or it sucks, but nobody can show an alternative, and games keep being released that make use of it. Hate it if you want, but I don't see why we need a new thread every week with the same tired arguments.
 

Tohtori

Member
Aug 27, 2013
51
2
36
That doesn't matter. You ignored my entire post. AMD can make their own GPU Physics API. They don't or won't. It's not Nvidia's problem.



Nvidia owns PhysX; they can do whatever the hell they want with it. Everyone hates success these days; it's the cool thing to do.

You guys act like you wouldn't make the same business decision in their position.

It entices their customers with unique features you can't get elsewhere. It also entices developers; that's why The Witcher 3 and other titles keep coming out with PhysX usage in their games. Stop blaming Nvidia and blame AMD for not competing in this area.

We have had numerous threads about this, and everyone says PhysX is dead or it sucks, but nobody can show an alternative, and games keep being released that make use of it. Hate it if you want, but I don't see why we need a new thread every week with the same tired arguments.

Yeah right, except Nvidia does nothing with PhysX; DEVELOPERS do. And what do they do with it? Eye candy.

If PhysX weren't limited to Nvidia only, imagine all the cool stuff it could be used for instead of being just another mindless graphic effect. As it is, no developer will code any meaningful use for PhysX unless paid truckloads by Nvidia, which is a LOSS for everyone involved.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
When done right, it's really good. But most often, to me, the visuals look ridiculous and fake, and, worse, are pumped up to scream "look at me! I am Nvidia, I can do this!!!!", which ruins the immersion factor in a big way.

Factor in the performance hit and the proprietary nature, and I'm not much interested in it at all. I went through the trouble of hacking in the hybrid fix; meh, I won't bother doing it again, not worth it. Bottom line: the good old CPU is often better at this kind of thing. The ideal PhysX model would properly use whatever resources are available, not limit itself to NV hardware just as a selling feature.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Yeah right, except Nvidia does nothing with PhysX; DEVELOPERS do. And what do they do with it? Eye candy.

If PhysX weren't limited to Nvidia only, imagine all the cool stuff it could be used for instead of being just another mindless graphic effect. As it is, no developer will code any meaningful use for PhysX unless paid truckloads by Nvidia, which is a LOSS for everyone involved.

Physics is just a graphical effect...that won't change ever.

For the record, no CPU physics engine can do real-time water particle effects that are interactive and dynamic. Real-time physics requires a lot of horsepower. Havok has never been used the way some PhysX demonstrations have been. In the future, who knows, but until someone shows a real alternative, Nvidia will keep doing what they have done with PhysX since day one.
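As a rough illustration of why real-time interactive particle effects are so expensive (a minimal toy sketch, not anything from PhysX or Havok — every name here is made up): each frame, every particle must be integrated, and any naive particle-to-particle interaction scales with the square of the particle count.

```python
# Toy sketch: cost of one simulation step for n interacting 2D particles.
# Per-particle integration is O(n); naive pairwise interaction is O(n^2),
# which is why large interactive fluid/debris effects beg for GPU parallelism.

def step(positions, velocities, dt=1.0 / 60.0, gravity=-9.81):
    n = len(positions)
    forces = [[0.0, gravity] for _ in range(n)]
    # Naive pairwise repulsion: n*(n-1)/2 force evaluations per frame.
    for i in range(n):
        for j in range(i + 1, n):
            dx = positions[j][0] - positions[i][0]
            dy = positions[j][1] - positions[i][1]
            d2 = dx * dx + dy * dy + 1e-6   # avoid divide-by-zero
            f = 0.01 / d2                   # toy repulsion strength
            forces[i][0] -= f * dx; forces[i][1] -= f * dy
            forces[j][0] += f * dx; forces[j][1] += f * dy
    # Explicit Euler integration, O(n).
    for i in range(n):
        velocities[i][0] += forces[i][0] * dt
        velocities[i][1] += forces[i][1] * dt
        positions[i][0] += velocities[i][0] * dt
        positions[i][1] += velocities[i][1] * dt
    return positions, velocities
```

At 60 fps, 10,000 particles done this naive way means roughly 50 million pairwise evaluations per frame; real engines use spatial partitioning to cut that down, but the workload is still massively parallel, which is exactly what a GPU is good at.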
 

Spjut

Senior member
Apr 9, 2011
933
163
106
It's a bonus that's nice to have if the horsepower is there, but I don't really care about it.

I used my old 8800GT for dedicated PhysX until Nvidia blocked it in newer drivers (a real low blow, I'd say), but I didn't care enough about it to bother hacking the drivers.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
I didn't buy an Nvidia card because of it, but I don't mind having it.

I still think it would be better for everyone if they made it free for anybody to use.
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
CPU physics (using Havok or software PhysX or any other library) is "good enough" for me 99% of the time. I voted neutral.

I haven't bothered turning on hardware PhysX in Borderlands 2 to see the extra cloth and corrosive goop effects.
 

Rvenger

Elite Member / Super Moderator / Video Cards
Apr 6, 2004
6,283
5
81
I voted neutral because I end up disabling it. I don't need debris flying across the screen distracting me, or the frame rate drops. I had a Titan and a 780; frames tank when PhysX is enabled.


To whoever said PhysX was a success: it's not, it's a preference, and a rare one at that. I know someone who raves about PhysX and says he loves the PhysX effects in BF3. I had to break the news to him that BF3 doesn't have PhysX, but he won't back down from insisting that his GTX 670 does something special with the game that AMD cards do not. Nvidia marketing at its finest, I guess.
 

hyrule4927

Senior member
Feb 9, 2012
359
1
76
I went with "Physx - nice extra if price / performance lines up."

I enjoyed PhysX in Borderlands 2 enough that I picked up a super cheap 550 Ti from a friend so I could run hybrid PhysX. But I can't see PhysX ever being a determining factor when choosing the primary GPU for my system. In the long term, I would certainly prefer that game developers not use any sort of proprietary graphical features.
 

crashtech

Lifer
Jan 4, 2013
10,695
2,294
146
Proprietary standards are generally bad for consumers, but the advances Nvidia makes while profiting on their proprietary standard may help pave the way for widespread adoption of an open physics API.
 
Aug 11, 2008
10,451
642
126
I have an AMD GPU and turned on CPU PhysX in Borderlands 2. My computer ran it OK, with a few slowdowns now and then. I eventually turned it off, though. I actually found the effects kind of distracting and didn't really feel they added much to the game.

Obviously I don't have that much experience with PhysX in other games, but I voted "unimpressed". By that I basically mean that whether a game or GPU supports PhysX would not influence my purchase.
 

Piroko

Senior member
Jan 10, 2013
905
79
91
The only game I own with PhysX is The Bureau: XCOM Declassified, and I only noticed that after finishing the game. I turned it on, played through a mission, and voted "Unimpressed". The effects didn't blend in well at all and had quite a hefty performance hit on the system I tested it on.

I do love physics in games when they are done right, like in Guild Wars 2 (loved the little details like the swaying grass), Portal 1/2, and the Battlefield series, but all of them apparently use Havok.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
AMD has TressFX, which is not artificially locked. Does that mean it is not AMD's?
On top of that, they help develop all these Gaming Evolved titles.

TressFX uses DirectCompute, a proprietary technology from Microsoft.

So being proprietary doesn't mean it's automatically closed to other companies. AMD could easily license CUDA if they wanted to, for dirt cheap, but they won't.

And they seemingly don't want to spend the resources necessary to bring OpenCL-based Bullet physics up to par with PhysX.

So AMD users are basically screwed.

How is artificially disabling features for users of competitors' hardware adding extra value?

You seem to be under the illusion that companies should not be able to gain financially from their own IP.

Nvidia has developed and refined CUDA and PhysX for years, spending tens of millions of dollars. Now they're just supposed to hand it over to AMD on a silver platter without any financial compensation at all?

What planet do you live on?
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Physics is just a graphical effect...that won't change ever.

I have to disagree with this. Game physics is actual physics if it is being calculated in real time.
An animation is just a graphical effect, which uses no real-time physics computations.
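The distinction can be made concrete with a toy sketch (hypothetical code, not from any actual engine — all names here are made up): a baked animation just plays back pre-recorded frames no matter what happens at runtime, while a real-time simulation integrates the current state each frame, so a force applied mid-flight actually changes the outcome.

```python
# Toy sketch: baked animation vs. real-time simulation of a falling crate.

BAKED_FRAMES = [100.0, 95.0, 85.0, 70.0, 50.0]  # pre-recorded heights

def baked_height(frame):
    """Canned animation: pure lookup; ignores anything happening in the game."""
    return BAKED_FRAMES[min(frame, len(BAKED_FRAMES) - 1)]

def simulate_height(frames, extra_force=0.0, dt=0.1, gravity=-9.81):
    """Real-time physics: integrates state each frame, so runtime forces matter."""
    height, velocity = 100.0, 0.0
    for _ in range(frames):
        velocity += (gravity + extra_force) * dt
        height += velocity * dt
    return height

# An explosion (an extra upward force) changes the simulated result,
# but the baked playback is identical no matter what happens around it.
```

That is the point being made about rubble that vanishes before it lands: a lookup table can't react, only a simulation can.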
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
I do love physics in games when they are done right, like in Guild Wars 2 (loved the little details like the swaying grass), Portal 1/2, and the Battlefield series, but all of them apparently use Havok.

A lot of those effects that you think are physics are really just canned animations, which are graphical effects with baked-in physics. So basically, the CPU isn't doing any real-time computations to enable these effects.

The Battlefield games use a lot of that, with perhaps a small bit of real-time computation here and there to add variety. You can tell that explosions and weapon impacts in BF3 use canned animations because the rubble disappears as soon as it hits the ground, or even before it hits the ground.
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,552
136
Chose the "nice extra if price/perf lines up." Assuming the price/performance of the card you are looking at is on par with the competition, then you start having to look at the other bullet-point features. If the video card with PhysX costs more, then PhysX gets zero consideration in the value equation. That's how relevant it is.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
TressFX uses DirectCompute, a proprietary technology from Microsoft.

So being proprietary doesn't mean it's automatically closed to other companies. AMD could easily license CUDA if they wanted to, for dirt cheap, but they won't.

And they seemingly don't want to spend the resources necessary to bring OpenCL-based Bullet physics up to par with PhysX.

So AMD users are basically screwed.



You seem to be under the illusion that companies should not be able to gain financially from their own IP.

Nvidia has developed and refined CUDA and PhysX for years, spending tens of millions of dollars. Now they're just supposed to hand it over to AMD on a silver platter without any financial compensation at all?

What planet do you live on?

DirectCompute is developed by Microsoft, yes; however, it does not require outlandish licensing fees to use. It is included as part of the DirectX API.

PhysX, on the other hand, is only available for nVidia hardware. And after nVidia bought PhysX, they purposely decreased the performance of PhysX on CPUs (which was excellent beforehand) to force people into using their hardware.

The fact is, proprietary APIs that are not licensable at reasonable prices are *BAD* for consumers, period. And don't forget that nVidia did not create PhysX; they bought it, just like they bought SLI.

If nVidia really wanted PhysX to be successful, they would license it to Intel and AMD for a reasonable cost, and consumers everywhere would be better off for it. And I would like to stress the "reasonable" part, as nVidia did offer it to AMD, but at a purposely, obscenely high price so that they would be the only ones supporting it.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
GPU PhysX effects cost consumers no extra money in the games they're in, and they can easily be turned off. Other than Nvidia not supporting AMD GPUs, what is there to complain about if it can be turned off as easily as it can be turned on? It isn't being forced on consumers by any stretch of the imagination, and the games that have GPU PhysX would otherwise be near-identical ports of their console counterparts without the effects.

I chose "nice extra if price / performance lines up," but really that makes no sense as an option, since games with PhysX don't cost more than games without PhysX. That option should read "nice if available and performs well with the GPU in use."
 