
Interesting take on Kanter's article "PhysX87: Software Deficiency"


That list tells nothing of what was claimed.
Can you list the developers that have always used Havok (and have the expertise and tools in house) and the ones not using PhysX because they think it's worthless?
No?
Then the list is worthless.



You participated in that thread. I don't care if you ignore me. I'm not trying to please anyone. It's not my fault you can't remember what you talked about earlier this month.

Lying won't help you.
Here is another one:
"PhysX is proprietary and that is bad!!!"
So is Havok.
So is X86.
So what?!

But I find it strange that all those who "despise" PhysX can't present arguments based on facts, but can only regurgitate the same false, uninformed statements/lies.

Makes you wonder whether the dislike of PhysX is based not on facts, but on brand preference.
Even DKanter didn't bash PhysX with facts, but with false (and now disproven) assumptions.

One way we will find out is how certain posters will talk about Bullet...if AMD ever gets something out the door...ah well.
 
That list tells nothing of what was claimed.
Can you list the developers that have always used Havok (and have the expertise and tools in house) and the ones not using PhysX because they think it's worthless?
No?
Then the list is worthless.

So since you can't answer it, you change the question altogether. Sounds like a politician. You asked what games, and I gave you a link that has said games, in a thread you participated in yet seem to have forgotten.
 
And AMD simply doesn't have the resources to develop something like this.

They don't need to. PhysX GPU acceleration would be possible with an AMD system and a dedicated PhysX card if NV didn't disable it in their drivers.

They could enable it and they'd still get plenty of people buying dedicated PhysX cards, but the way it is now you're forced to NOT buy anything AMD. I guess NV crunched the numbers and decided this would equal more sales. And their reason for this? Complete BS. Which you can read at the bottom of this page: http://www.nvidia.com/object/physx_faq.html#q8

It's already known that PhysX can work perfectly fine in an AMD/NV setup.

And this reason alone is why this whole discussion is irrelevant, because it's obvious what motivates NV. The accusation that they intentionally keep PhysX unoptimized is not surprising in the least. They make you believe PhysX is so advanced and demanding that you need their latest and most expensive cards, or a dedicated card, to get good performance.

I don't have all the "facts". So please correct me if you know different.
 
They don't need to. PhysX GPU acceleration would be possible with an AMD system and a dedicated PhysX card if NV didn't disable it in their drivers.

They could enable it and they'd still get plenty of people buying dedicated PhysX cards, but the way it is now you're forced to NOT buy anything AMD. I guess NV crunched the numbers and decided this would equal more sales. And their reason for this? Complete BS. Which you can read at the bottom of this page: http://www.nvidia.com/object/physx_faq.html#q8

It's already known that PhysX can work perfectly fine in an AMD/NV setup.

And this reason alone is why this whole discussion is irrelevant, because it's obvious what motivates NV. The accusation that they intentionally keep PhysX unoptimized is not surprising in the least. They make you believe PhysX is so advanced and demanding that you need their latest and most expensive cards, or a dedicated card, to get good performance.

I don't have all the "facts". So please correct me if you know different.

Actually, there's really nothing stopping you from going hybrid PhysX to your heart's content right now except the current wave of tinfoil-hattery. Whether it's officially supported or not has never really stopped much of the PC userbase from getting what they want.
Get your compatible drivers here, install your cards and game away!
Or alternatively, if you want to insist on more recent nvidia drivers for some reason but don't mind a couple more steps of automated gui installs, do this.

Now, as for their motivations, which I take from your insinuation to be that they are unwilling to do AMD's hardware-physics homework for them so as to keep an incentive for people to buy their own stuff: those aren't too out of line with the actions of any other for-profit tech company in the current market, and really don't come as a big surprise to me.
 
Now, as for their motivations, which I take from your insinuation to be that they are unwilling to do AMD's hardware-physics homework for them so as to keep an incentive for people to buy their own stuff: those aren't too out of line with the actions of any other for-profit tech company in the current market, and really don't come as a big surprise to me.

This is a circular argument that ends at a standstill (and I'm not just talking about your post but overall).

NVIDIA wants PhysX to be a big hit to sell more cards.

AMD has other priorities and places to spend the money (and wants to sell both GPUs and CPUs).

Developers don't want to risk losing a substantial portion of their player base, especially since many (or even most) develop first for the console anyway.

In the end PhysX has potential, but the timing doesn't seem to be right.

Now, if people want to blame this company or that company for whatever, they can keep doing it, but the reality is the one described above.
 
This is a circular argument that ends at a standstill (and I'm not just talking about your post but overall).

NVIDIA wants PhysX to be a big hit to sell more cards.

AMD has other priorities and places to spend the money (and wants to sell both GPUs and CPUs).

Developers don't want to risk losing a substantial portion of their player base, especially since many (or even most) develop first for the console anyway.

In the end PhysX has potential, but the timing doesn't seem to be right.

Now, if people want to blame this company or that company for whatever, they can keep doing it, but the reality is the one described above.

Right. I wasn't really trying to raise an argument about which particular cog is responsible for jamming hardware physics' widespread adoption nowadays; rather I was merely noting that I really don't see how nvidia's stance is so surprising and/or odious in response to the post I quoted.
Though I agree that a lot of people's gripes with particular companies are largely misguided.
 
They don't need to. PhysX GPU acceleration would be possible with an AMD system and a dedicated PhysX card if NV didn't disable it in their drivers.

So AMD GPUs run CUDA now?
You could port PhysX to run on OpenCL...but why would you?
OpenCL is years behind CUDA as we speak.
And even in OpenCL, NVIDIA is ahead of AMD in tools, support and framework.


They could enable it and they'd still get plenty of people buying dedicated PhysX cards, but the way it is now you're forced to NOT buy anything AMD. I guess NV crunched the numbers and decided this would equal more sales. And their reason for this? Complete BS. Which you can read at the bottom of this page: http://www.nvidia.com/object/physx_faq.html#q8

Guessing and conjecture, based on a lack of understanding.
This is what I am talking about when I say arguments against PhysX are based on ignorance, not facts.

It's already known that PhysX can work perfectly fine in an AMD/NV setup.

Really? I haven't seen such extensive tests.
Care to share with the rest of the world?

Again, just stating that something is so...doesn't make it so.


And this reason alone is why this whole discussion is irrelevant, because it's obvious what motivates NV. The accusation that they intentionally keep PhysX unoptimized is not surprising in the least. They make you believe PhysX is so advanced and demanding that you need their latest and most expensive cards, or a dedicated card, to get good performance.

*sigh*
This has already been debunked in this thread. Would you mind reading the thread before posting again, as you are now posting FUD (literally) that has been debunked?

I don't have all the "facts". So please correct me if you know different.

I just did.
But why post about something you are clearly ignorant about?
The only thing you do is muddy the water with false assumptions, repeat false debunked claims and contribute nothing constructive to the thread.
You, sir, are sadly a prime example of my statement that the dislike of PhysX is not based on facts.
 
So AMD GPUs run CUDA now?
You could port PhysX to run on OpenCL...but why would you?
OpenCL is years behind CUDA as we speak.
And even in OpenCL, NVIDIA is ahead of AMD in tools, support and framework.

Take a second and re-read that, as you and others seem to misconstrue this all the time.

He didn't say anything about Nvidia's software running "on" the AMD GPU; he said on a dedicated PhysX card, meaning an Nvidia card.

Guessing and conjecture, based on a lack of understanding.
This is what I am talking about when I say arguments against PhysX are based on ignorance, not facts.

If it worked, then why disable it? Not only disable it, but if someone has an ATI/AMD IGP on the mobo, that triggers Nvidia's block as well.

I just did.
But why post about something you are clearly ignorant about?
The only thing you do is muddy the water with false assumptions, repeat false debunked claims and contribute nothing constructive to the thread.
You, sir, are sadly a prime example of my statement that the dislike of PhysX is not based on facts.

You're not actually correcting people. You're just arguing with people's points/opinions.

Sites have done tests, with nice fancy graphs, running AMD cards as the main card with dedicated Nvidia cards for PhysX. It's late, so I'm not going to bother looking for them now; I'll find & post them later.
 
No, this is what was claimed:


Now please try again, this time sticking to the facts.

Umm, you asked what games Havok had game-changing effects in. Like I said, you're trying to change topics because you have no answer. I'm done, because I've posted 3 times about this and you have no idea what you're asking.
 
They don't need to. PhysX GPU acceleration would be possible with an AMD system and a dedicated PhysX card if NV didn't disable it in their drivers.

I'm talking about running PhysX (or any kind of physics acceleration) on an AMD GPU obviously.
It's not going to look good for AMD if you can only run PhysX by adding a separate nVidia GPU to your system, while with nVidia you only need one card. Most people would go for the single-card solution. Multi-GPU markets have always been a niche.

It's already known that physx can work perfectly fine in an AMD/NV setup.

It's not, actually... Just a bunch of laymen on a forum who did some hack and said "Hmm, it looks like it works".
You'd need a more thorough analysis to prove that the results are indeed correct in all cases.
nVidia didn't bother to do that analysis, and simply says "We haven't tested it, so we're not going to support it either". Testing and supporting the competition is just not interesting for nVidia.
Yes, it *could* work, it *might* even work. But what if it doesn't? If nVidia says it will work, they also need to support it (even if the problem is caused by AMD's drivers, and it just appears to be a PhysX problem... or people blame it on PhysX...). It's just simple business. Too many things already go wrong in games on simpler setups, with video cards of just a single brand.

The accusation that they intentionally keep PhysX unoptimized is not surprising in the least.

Sadly the accusation is complete nonsense though.
 
I'm talking about running PhysX (or any kind of physics acceleration) on an AMD GPU obviously.
It's not going to look good for AMD if you can only run PhysX by adding a separate nVidia GPU to your system, while with nVidia you only need one card. Most people would go for the single-card solution. Multi-GPU markets have always been a niche.

Being able to run PhysX =/= a good gaming experience. It's been shown over and over again that you need a dedicated GPU in order to keep the framerate somewhat stable with PhysX enabled. The argument is flawed.
 
Take a second and re-read that, as you and others seem to misconstrue this all the time.

He didn't say anything about Nvidia's software running "on" the AMD GPU; he said on a dedicated PhysX card, meaning an Nvidia card.

Read Scali's post, the part about AMD's drivers.



If it worked, then why disable it? Not only disable it, but if someone has an ATI/AMD IGP on the mobo, that triggers Nvidia's block as well.


Read Scali's post, the part about AMD's drivers.

You're not actually correcting people. You're just arguing with people's points/opinions.

False.
People in this thread have claimed PhysX isn't using SSE...it is now.
That is just one example.
This is the "gist" of this thread:
Debunked/false information being used as "argumentation" against PhysX.
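For context, the SSE point is exactly what the Kanter article ("PhysX87: Software Deficiency") was about: the PhysX CPU path being compiled to scalar x87 code instead of vectorized SSE. A minimal sketch of the difference in plain C, with hypothetical function names (this is an illustration of scalar vs SSE code, not actual PhysX source):

```c
#include <xmmintrin.h>  /* SSE1 intrinsics */

/* Scalar integration step: one float at a time. Compiled for x87,
   each multiply and add is a separate serial FPU operation. */
void integrate_scalar(float *pos, const float *vel, float dt, int n)
{
    for (int i = 0; i < n; ++i)
        pos[i] += vel[i] * dt;
}

/* SSE integration step: four floats per instruction.
   Uses unaligned loads for simplicity; assumes n is a multiple of 4. */
void integrate_sse(float *pos, const float *vel, float dt, int n)
{
    __m128 vdt = _mm_set1_ps(dt);
    for (int i = 0; i < n; i += 4) {
        __m128 p = _mm_loadu_ps(pos + i);
        __m128 v = _mm_loadu_ps(vel + i);
        _mm_storeu_ps(pos + i, _mm_add_ps(p, _mm_mul_ps(v, vdt)));
    }
}
```

Same math, same results; the SSE version simply does four lanes per instruction, which is why the x87-vs-SSE question matters for CPU PhysX performance.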

Sites have done tests, with nice fancy graphs, running AMD cards as the main card with dedicated Nvidia cards for PhysX. It's late, so I'm not going to bother looking for them now; I'll find & post them later.

I think that is called making an undocumented statement...the thread is full of them *sigh*
Did you need to add one more?
 
Umm, you asked what games Havok had game-changing effects in.

No, I didn't...would you stop with the false statements...it's tiresome to look at. Here is the poster who made that claim:

Well since we have proven this to be either an outright lie or an uninformed false statement... I call your bluff and raise you this: "There is not ONE not ONE game where Havok affects graphics and immersion at all. We've already discussed the MANY games that use PHYSX and where it affects graphics."

What is it with you and false statements?

Like I said, you're trying to change topics because you have no answer.

*cough*
You "accuse" me of what others wrote and then point the finger at me when your false claims get debunked? :hmm:
I'm done...*snip*

Thanks, false statements are useless; if you stop posting them the thread will benefit.

*snip*...because I've posted 3 times about this and you have no idea what you're asking.

I wrote too soon?
Yet another false statement...your last, I hope?
 
Being able to run PhysX =/= a good gaming experience. It's been shown over and over again that you need a dedicated GPU in order to keep the framerate somewhat stable with PhysX enabled. The argument is flawed.

What is it with people and false statements?
Is that because you cannot make valid arguments against PhysX?
So you need to lie/make false statements?

I have a GTX285 that runs games from "Mirror's Edge" to "Mafia II" just fine...with PhysX enabled:

http://www.extremetech.com/article2/0,2845,2340018,00.asp

http://www.techspot.com/review/312-mafia2-performance/page7.html

Care to document your "claim"?
 
Being able to run PhysX =/= a good gaming experience. It's been shown over and over again that you need a dedicated GPU in order to keep the framerate somewhat stable with PhysX enabled. The argument is flawed.

That is purely subjective, and also beside the point.
I am talking about the CAPABILITY.
AMD GPUs simply CANNOT run physics in any way.
nVidia's GPUs can run it, and you are free to add a second dedicated GPU for PhysX, or turn down the detail a bit to make the gaming experience better, if you so desire. But as we all know, GPUs get faster every year, so it's only a matter of time until even the most demanding users are happy with PhysX performance on a single card.

Your argument is flawed, not mine.
 
What is it with people and false statements?
Is that because you cannot make valid arguments against PhysX?
So you need to lie/make false statements?

I have a GTX285 that runs games from "Mirror's Edge" to "Mafia II" just fine...with PhysX enabled:

http://www.extremetech.com/article2/0,2845,2340018,00.asp

http://www.techspot.com/review/312-mafia2-performance/page7.html

Care to document your "claim"?


Really? When I enable PhysX in Mafia 2 on my system, it literally cuts my framerate by 50%. All the benchmarks out there of PhysX on high in this game show the same.

Considering the small amount of visual quality it actually adds to the game, it's a ridiculous performance hit. It's the same sort of performance hit you'd get from going from a resolution of 1680x1050 to 2560x1600 on the same video card setup.
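The resolution comparison above can be sanity-checked with back-of-the-envelope arithmetic. Assuming the workload is purely fill-rate bound, so frame rate scales inversely with pixel count (a simplification), going from 1680x1050 to 2560x1600 leaves you roughly 43% of your original frame rate, which is indeed in the ballpark of the 50% PhysX hit being described:

```c
/* Fraction of the original frame rate retained after a resolution change,
   under the simplifying assumption that frame rate scales inversely with
   the number of pixels rendered (purely fill-rate-bound workload). */
double fps_fraction(double w_old, double h_old, double w_new, double h_new)
{
    return (w_old * h_old) / (w_new * h_new);
}
```

1680*1050 is about 1.76 million pixels and 2560*1600 about 4.10 million, so the ratio comes out near 0.43.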

That is what makes GPU PhysX a terrible feature and a non-starter in its current state. For such a small change in what you notice on your screen, there is no reason for such huge performance costs. Until they fix that, GPU PhysX will continue to be a feature you see in one new game a year, and something the buying market does not care about.
 
Really? When I enable PhysX in Mafia 2 on my system, it literally cuts my framerate by 50%. All the benchmarks out there of PhysX on high in this game show the same.

50% of 100, 110, or 120 fps is still playable from my part of the world. Sign me up for the performance hit.
 
50% of 120 fps is still playable from my part of the world.

Not everyone is going to be using $1000 worth of video card hardware, as much as nvidia would like that.

And I generally got about 40fps with it on high, not 60.

Credit to nvidia for riding PhysX in on the coat-tails of a good game; it was a smart move, but it still doesn't make the feature any less crap.
 
Not everyone is going to be using $1000 worth of video card hardware, as much as nvidia would like that.

And I generally got about 40fps with it on high, not 60.

Sorry, 40 fps. I realize that even though it's 10 fps greater than what 90% of console games run at, the level of playability is so completely low that Nvidia should patch GPU PhysX out of every existing game so we don't get to decide for ourselves. How dare they stress their high-end video cards!!

Credit to nvidia for riding PhysX in on the coat-tails of a good game; it was a smart move, but it still doesn't make the feature any less crap.

Credit to you for taking 1 game/example and using it as the basis for your entire argument involving all PhysX discussion. You know how to make your thoughts go the extra mile!
 
Sorry, 40 fps. I realize that even though it's 10 fps greater than what 90% of console games run at, the level of playability is so completely low that Nvidia should patch GPU PhysX out of every existing game so we don't get to decide for ourselves. How dare they stress their high-end video cards!!

Since when has PC gaming been assessed by the standards of consoles?

Many times it was not playable. Feel free to enable PhysX on high in Mafia 2 and turn up the rest of the graphics settings, then come and share your experience. I'd like to hear your results.

Mafia 2 and the PhysX in it helped to further my opinion of PhysX being a resource pig, considering the minimal additions it brought to the game. It's supposed to be the best example of it yet, is it not?

I got a good chuckle when I walked around in a room and Awkward Looking Rock Chunk #754 jumped up from the floor and hit the ceiling when I walked over it.

Even more of a chuckle when I saw my framerate tank in a game that looked on par with a console port graphically, but some extra debris on the ground necessitated a 50% performance hit. At that point I turned physx off and enjoyed the rest of the game and didn't notice much difference.



Credit to you for taking 1 game/example and using it as the basis for your entire argument involving all PhysX discussion. You know how to make your thoughts go the extra mile!

There are only about 10 games that use GPU PhysX, so there's a very limited base of games to draw examples from. Some of those are not even true games, but free demos from nvidia, or one level in a game.
 
Looks like most of you guys need to get out and breathe more often. All this serious, angry mudslinging discussion about a feature that is possibly still in its infancy or on the brink of going open source... Really? At the moment it's a non-issue, because it isn't even a mandatory feature in almost all games.

I almost get the urge to quote every hostile member and put up a "You Mad" caption pic at the end.

You know the reason why this forum ONLY consists of regulars anymore? Because the bickering and whining is off-putting. Nobody wants to read that. How about a little nicer discussion, rather than people twisting other people's statements to fit their own agenda? Doesn't that sound better?
 
Looks like most of you guys need to get out and breathe more often. All this serious, angry mudslinging discussion about a feature that is possibly still in its infancy or on the brink of going open source... Really? At the moment it's a non-issue, because it isn't even a mandatory feature in almost all games.

I almost get the urge to quote every hostile member and put up a "You Mad" caption pic at the end.

You know the reason why this forum ONLY consists of regulars anymore? Because the bickering and whining is off-putting. Nobody wants to read that. How about a little nicer discussion, rather than people twisting other people's statements to fit their own agenda? Doesn't that sound better?

4.5 years is a pretty long time in the PC world.
For something to still be in its infancy 4.5 years after it began isn't very good going.
 
Since when has PC gaming been assessed by the standards of consoles?

Since you keep arbitrarily saying 50% is too much without saying what the 50% is in relation to. I made a point of reference, something you have failed to do so many times in your "50% is too much" argument.

At that point I turned physx off and enjoyed the rest of the game and didn't notice much difference.

That's the beauty of this entire situation. You had the choice to enable or disable it, and you chose to disable it. At least we're given a choice, rather than someone, somewhere up high saying "50% is too much! Just get rid of it!"
 