AMD's Roy Taylor: PhysX/Cuda doomed?


Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
What matters is what the experience is like for me. When GPU PhysX has for the most part been huge amounts of debris calculated on the fly with no gameplay impact, it's just not worth it, calculated on the fly or not. It looks ridiculous and has a negative impact on gameplay: performance degradation.

You can adjust the amount of debris by lowering the setting, or even by turning it off..

CPU physics does not do the same, while still delivering an acceptable level of realism, allowing better, more robust and game-changing physics in games.

CPU physics does not do the same because the CPU would get bogged down. As for realism, how can you complain about the debris generated by PhysX, while giving the "debris free" software physics a pass?

Shooting a stone column in real life with a machine gun is going to generate millions of bits of debris, visible and invisible to the naked eye. PhysX at least tries to replicate this, while software physics does not.

Disk and memory usage is pretty much irrelevant these days. Both are cheap and plentiful.

That's true, I suppose.. Heck, the reason the Frostbite 3 engine is 64-bit is probably that they have so many destruction animations which have to be loaded into memory..

It's mostly relegated to fluff eye-candy because GPU PhysX is just far, far too inefficient to do much more than that. Even playing the minor role it does in games, it still comes at a whopping performance cost.

I disagree. Look at Batman Arkham City or Mafia 2. Those games have a ton of cloth physics running at the same time. What software physics engine do you know of that can run that many instances of cloth physics on the screen at the same time?

There are none, believe me..
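
To make concrete why cloth scales so well on a GPU, here's a toy Verlet integration step (my own illustration, not any engine's code). Every particle update in the loop is independent of the others, which is exactly the shape of workload a GPU's thousands of threads eat up and a CPU's handful of cores doesn't:

```cpp
// Toy mass-spring cloth: one Verlet integration step. Each particle is
// updated independently, so N cloth instances are just N more parallel
// loops - ideal for a GPU, painful for a CPU.
#include <vector>

struct Particle {
    float x, y, z;      // current position
    float px, py, pz;   // previous position (Verlet state)
};

void verletStep(std::vector<Particle>& cloth, float dt, float gravity) {
    const float dt2 = dt * dt;
    for (Particle& p : cloth) {  // the embarrassingly parallel part
        float nx = 2.0f * p.x - p.px;
        float ny = 2.0f * p.y - p.py + gravity * dt2;
        float nz = 2.0f * p.z - p.pz;
        p.px = p.x; p.py = p.y; p.pz = p.z;  // save state for next step
        p.x = nx; p.y = ny; p.z = nz;
    }
    // A real solver would follow with distance-constraint relaxation
    // between neighboring particles plus collision handling - also
    // parallel-friendly work.
}
```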

Just fire up Borderlands 2 on even the best hardware and you get noticeable slowdowns. I do on my rig. GPU Physx simply performs too poorly to deliver the sort of large scale game-changing physics needed in a game like Battlefield.

PhysX on high in BL2 is buggy on SLI setups. You'd get better performance by getting a dedicated card, or by removing one of your Titans and playing the game on a single card.

Merely disabling SLI won't do the job properly, as SLI will still be active at the driver level..

Those are really the perfect examples of what nvidia does with gpu physx. Talk a lot, show pretty tech demos, and the reality is actually this:

Like I said before, the authenticity of the simulations is proportional to the amount of compute power available. 2 years from now when Volta is available, GPUs will be able to run games with that kind of visual fidelity because they'll be so powerful..

http://www.youtube.com/watch?v=fK0Lwtz6eAs

Unrealistic gelatinous water, nothing special, and a frame rate hit that makes it worthless. It seems like PhysX pulls from the same assets for every game, as every GPU PhysX game has that same ridiculous water. Similar to every one of them having those same-looking pebbles spraying everywhere. I think they need to work on diversifying the art before they even start to worry about adding it to more games.

Come on, do you honestly expect BL2 to have realistic-looking water when water isn't even a central or even secondary theme in the game?

Also, considering the animated and over-the-top art style of BL2, I would never have expected them to put a great amount of effort into making sure water behaved and looked realistic..

A game like Cryostasis, which was released in 2009, had better water effects, because water was a major theme in the game, so the developers put more effort into it.

Developer adoption really says it all about GPU PhysX. If it was the compelling game-changer they claim it is, the people actually making games would be making use of it. As it stands, you only see it in the sequels to two running franchises and a couple of free-to-play games. Six years have gone by to work on the technology and still developers are overwhelmingly disinterested, and that says it all.

The fact that hardware accelerated PhysX only runs on NVidia GPUs is definitely hindering adoption, no doubt, but considering that, PhysX has scored some big titles:

1) Metro 2033

2) Metro Last Light

3) Mafia 2

4) Alice: Madness Returns

5) Batman Arkham Asylum

6) Batman Arkham City

7) Batman Arkham Origins

8) The Witcher 3

9) Borderlands 2

10) Planetside 2

11) Everquest Next
 

Mr Expert

Banned
Aug 8, 2013
175
0
0
The fact that hardware accelerated PhysX only runs on NVidia GPUs is definitely hindering adoption, no doubt, but considering that, PhysX has scored some big titles:

All the newer games with PhysX:

1) Metro Last Light

2) Batman Arkham Origins

3) The Witcher 3

That's a very weak lineup. Nobody will adopt PhysX in the future, even more so because all the new consoles will be AMD powered. This means nvidia's polished turd (GPU accelerated PhysX) will finally be gone forever in due time. I say good riddance, as PhysX is all marketing and show and no go.
 

Mr Expert

Banned
Aug 8, 2013
175
0
0
The PS4 is using PhysX.
LOL, you do understand that there is a big difference between hardware accelerated PhysX and CPU PhysX, right? If there is not a big difference between hardware accelerated and CPU PhysX, then nvidia are truly scumbag marketers looking to rip off the uninformed and unwilling.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
That's a very weak lineup. Nobody will adopt PhysX in the future, even more so because all the new consoles will be AMD powered. This means nvidia's polished turd (GPU accelerated PhysX) will finally be gone forever in due time. I say good riddance, as PhysX is all marketing and show and no go.

Both the PS4 and Xbox One will be capable of using PhysX :awe:

GPU PhysX will never go away, because it's merely an evolution of software PhysX, and as long as software PhysX is around (Unreal Engine 4 has it integrated into the engine), hardware accelerated PhysX will be here to stay :hmm:
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
LOL, you do understand that there is a big difference between hardware accelerated PhysX and CPU PhysX, right? If there is not a big difference between hardware accelerated and CPU PhysX, then nvidia are truly scumbag marketers looking to rip off the uninformed and unwilling.

Hardware accelerated PhysX and software PhysX converged a long time ago. You can run "hardware accelerated" physics on a CPU; it will just run much slower..

So they're basically the same thing, except they run different code paths. It will be interesting to see how PhysX 3.0 (which has extensive multithreading and SIMD optimizations) performs in heavy PhysX titles..
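
For the curious, here's roughly what "same SDK, different code paths" looks like when you set up a PhysX 3.x scene. This is a sketch based on NVIDIA's public SDK docs; exact signatures vary between SDK versions, so treat it as illustrative rather than copy-paste ready:

```cpp
// Sketch of PhysX 3.x scene setup choosing its code path (illustrative;
// exact signatures differ between SDK versions).
#include <PxPhysicsAPI.h>
using namespace physx;

PxScene* createScene(PxPhysics* physics, PxCudaContextManager* cudaCtx) {
    PxSceneDesc desc(physics->getTolerancesScale());
    desc.gravity = PxVec3(0.0f, -9.81f, 0.0f);
    desc.filterShader = PxDefaultSimulationFilterShader;

    // CPU path: always present. Worker threads run the multithreaded,
    // SSE-optimized solver introduced in PhysX 3.x.
    desc.cpuDispatcher = PxDefaultCpuDispatcherCreate(4);

    // GPU path: if a CUDA context is available, particle and cloth work
    // can be offloaded to the GPU; rigid bodies still run on the CPU.
    if (cudaCtx)
        desc.gpuDispatcher = cudaCtx->getGpuDispatcher();

    return physics->createScene(desc);
}
```

Same scene description either way; the dispatcher you plug in decides where the work runs.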
 

Mr Expert

Banned
Aug 8, 2013
175
0
0
Hardware accelerated PhysX and software PhysX converged a long time ago. You can run "hardware accelerated" physics on a CPU; it will just run much slower..

So they're basically the same thing, except they run different code paths. It will be interesting to see how PhysX 3.0 (which has extensive multithreading and SIMD optimizations) performs in heavy PhysX titles..
You are sadly misinformed about the point that is being made here.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
There's no difference. "Hardware accelerated physics"? A CPU is considered hardware too. They made their own physics API and made it run only on their GPGPU. "Physics" has been around since before Nvidia & Ageia were companies. They used GPGPU to harness the GPU's higher capabilities to relieve the CPU from traditionally calculating game physics, except it's poorly done, usually overdone, and results in a lower framerate when the whole goal was to relieve the CPU for better performance. The effect potential is there and is like 100 times greater than what's possible on a CPU, but they waste it.... Like making 1 billion particles go on screen for a silly overdone effect and remain there.

They failed by crippling it for AMD and not allowing it on AMD cards. It's there as an API just like Havok, but it's purposefully optimized for NV hardware, and purposefully crippled for AMD hardware. They took it a step further to create on-die logic dedicated to it. The reason it isn't getting huge success is because it isn't a standard... and it's not like it isn't free; they pay you out the ass to use it. Then they want you to make it an "Nvidia only exclusive feature" of your game. AAA titles like Batman & Borderlands remove 50% of their special effects & features and make them Nvidia only.

The other 95% of the industry recognizes this as splitting the community and crippling potential players and customers, and refuses to add their proprietary tech to their products. AMD having all the console hardware for the next decade should make it obvious to you. It's been a blunder from the start. It has awesome potential, but they're going about it all wrong, developing a physics API that uses GPGPU and won't work on any console or 50% of PCs...
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
You are sadly misinformed about the point that is being made here.

How am I misinformed? You can run the highest PhysX setting on your CPU if you want, so obviously, there isn't a big difference between hardware and software physics other than the performance..

[Attached benchmark image: border_bench5.png]
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Unless they plan to coax developers into adding "PC ONLY Nvidia exclusive GPU PhysX features" until 2020.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
It's there as an API just like Havok, but it's purposefully optimized for NV hardware, and purposefully crippled for AMD hardware.

Imho,

Havok and PhysX are middleware -- CUDA and OpenCL are APIs, and this is where the battle is for AMD!
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Unless they plan to coax developers into adding "PC ONLY Nvidia exclusive GPU PhysX features" until 2020.

That may depend on whether their competitors can do something about it instead of talking, attacking and losing share!
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,552
136
What other games [with the particle effects only available with PhysX] are those?

Added bracketed comment to put your quote in the correct context for those reading.

The effects I'm mentioning are from this Youtube video.

There was a previous thread, I forget which part of the Anandtech forums, that pointed out different games that had similar/comparable particle effects WITHOUT needing PhysX. And yes, that was a thread discussing Hawken. Too lazy to search for it right now.

nVidia said:
No. There are multiple technical connections between PhysX processing and graphics that require tight collaboration between the two technologies. To deliver a good experience for users, NVIDIA PhysX technology has been fully verified and enabled using only NVIDIA GPUs for graphics.

I guess they don't want to have to do any kind of verification process for AMD cards, which takes additional time and expense.
Now, it's been a while since I've seen that quote from nVidia. But it gets funny when you consider that nVidia is telling people that it can't be bothered to support its own paying customers. Yeah. Really. nVidia can't be bothered to support nVidia video cards as PhysX co-processors.

nVidia also took the time and resources to disable nVidia PhysX cards from working with AMD GPUs. Apparently nVidia cards as PhysX processors alongside AMD GPUs work pretty well (once you hack it to work), and that's with nVidia actively sabotaging support. nVidia could have easily just left in the ability to use an nVidia card with AMD for PhysX. This would have greatly expanded PhysX support, with nVidia selling many more low to mid range cards for use as PhysX processors.

Neither of us, nor anyone on this forum, knows the terms that were offered to AMD or why AMD never took NVidia up on its offer, so it's pointless to debate..
True, but it doesn't change the fact that from a business standpoint, it would be stupid to think AMD refused to support PhysX if it was free with no strings attached.

When GPU accelerated physics first became a possibility, it was very unpopular. No one but NVidia wanted to support or develop it.
You might want to look up what the original vision for PhysX was before Ageia was gobbled up by nVidia. A shame considering that PhysX now is for the most part used for nothing more than a few more shiny particles or flapping cloth. PhysX under nVidia's stewardship has been a mess.

Intel of course wanted physics to remain on the CPU (which was why they bought Havok), and AMD scoffed at running physics on the GPU.
While no one will argue with Intel wanting physics to remain on the CPU, Havok is not and never was moving towards becoming a CPU-only physics solution. Havok can be accelerated by the GPU using OpenCL. Havok hasn't exactly been at the forefront of physics processing, but look up Havok and GPU acceleration on the PlayStation 4 as well, which of course uses an AMD CPU & GPU.

But now, after all these years and tons of investment by NVidia in time, effort and resources, you honestly think NVidia is just going to give control of PhysX over to its competitors, especially when it's going so well for them? That would be a foolish move on their part.

Besides, it's not as though AMD doesn't have their own plans. They have Bullet, which is based on OpenCL. It's not nearly as polished and refined as PhysX, but who knows, it may get there one day..
From a business standpoint, it made sense for nVidia to loosen control in order to make more money in the long run. nVidia has made puzzling decisions over the years like half-assing CPU PhysX that seem to limit adoption of PhysX.

As it now stands, I think PhysX support will start to wane. Most PhysX support was due to the fact that nVidia was practically footing the bill for developers to implement PhysX. What happens when it makes more financial sense to build game engines using something like Havok, which will run on AMD GPUs and nVidia GPUs? Not to mention the same game engine will run on game consoles.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
The point is that nvidia has been using shady marketing all these years, even though PhysX on the CPU is what everyone wants and is in fact better.

http://semiaccurate.com/2010/07/07/nvidia-purposefully-hobbles-physx-cpu/

Good article there. This is something everyone should check out, and his source is D. Kanter - a guy who is well informed and unbiased, and one of the few folks with a web presence who is more technically inclined than Anand. Reciting his article makes Charlie look smart. x87. Multithreaded for NV GPUs & gaming console CPUs, but not desktop CPUs... Prohibited on AMD GPUs.... Jokes?

Just like the Shield is a vehicle to move unsold Tegra 4 chips, PhysX is an edge to corner the market. It has great potential (NV says 4x more power than a CPU, which I believe, but why make it proprietary?). But since they're up against Intel, MS, AMD, x86, x64, multicore CPUs, OpenCL, 360, PS3, Wii, WiiU, Xbone, & PS4, how in the heck do they think they can win? Lol, get real. They're taking a sky's-the-limit approach with "GPU PhysX" & "Hardware Accelerated Physics" and any other cliched hollow marketing terminology they can drum up for a proprietary tech. What a mess. That AMD guy's right when he says CUDA & PhysX are 'doomed'. On the bright side, NV has a lot of success with Quadro & Tesla & HPC.
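
To make the x87 point concrete, here's a toy example (my own, not actual PhysX code) of the same update loop written scalar vs. with SSE intrinsics. Scalar/x87-style code moves one float per instruction; SSE moves four, which is roughly the free speedup Kanter says the CPU path leaves on the table:

```cpp
// Toy illustration of scalar vs. SSE throughput (not PhysX code).
#include <xmmintrin.h>  // SSE intrinsics

// Scalar version: roughly what x87-style codegen gives you.
void integrateScalar(float* pos, const float* vel, float dt, int n) {
    for (int i = 0; i < n; ++i)
        pos[i] += vel[i] * dt;
}

// SSE version: four floats per instruction.
void integrateSSE(float* pos, const float* vel, float dt, int n) {
    __m128 vdt = _mm_set1_ps(dt);
    int i = 0;
    for (; i + 4 <= n; i += 4) {
        __m128 p = _mm_loadu_ps(pos + i);
        __m128 v = _mm_loadu_ps(vel + i);
        _mm_storeu_ps(pos + i, _mm_add_ps(p, _mm_mul_ps(v, vdt)));
    }
    for (; i < n; ++i)  // leftover elements
        pos[i] += vel[i] * dt;
}
```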
 

Mr Expert

Banned
Aug 8, 2013
175
0
0
Good article there. This is something everyone should check out, and his source is D. Kanter - a guy who is well informed and unbiased, and one of the few folks with a web presence who is more technically inclined than Anand. Reciting his article makes Charlie look smart. x87. Multithreaded for NV GPUs & gaming console CPUs, but not desktop CPUs... Prohibited on AMD GPUs.... Jokes?

Just like the Shield is a vehicle to move unsold Tegra 4 chips, PhysX is an edge to corner the market. It has great potential (NV says 4x more power than a CPU, which I believe, but why make it proprietary?). But since they're up against Intel, MS, AMD, x86, x64, multicore CPUs, OpenCL, 360, PS3, Wii, WiiU, Xbone, & PS4, how in the heck do they think they can win? Lol, get real. They're taking a sky's-the-limit approach with "GPU PhysX" & "Hardware Accelerated Physics" and any other cliched hollow marketing terminology they can drum up for a proprietary tech. What a mess. That AMD guy's right when he says CUDA & PhysX are 'doomed'. On the bright side, NV has a lot of success with Quadro & Tesla & HPC.
nvidia took a big risk when they bought the PhysX tech from Ageia, but the fact is they don't need it to prosper, because as we can see it's not even being used in any tangible way in recent times. I just can't understand why all the fanboys and nvidia themselves keep trying to prop up a sinking ship.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
For the argument that in BF all buildings fall down into the same pieces, this is partly because it's a multiplayer game.

When there is a pile of debris that the player can hide behind, that debris has to be identical FOR EVERYBODY. If the big chunk you are hiding behind is not there for other players, suddenly you are in the open. So things have to be a certain way so all players see them the same. Everybody having random/dynamic debris would make the game feel very buggy, as suddenly it would seem like you just got shot through a chunk of concrete, but only because the chunk is not there for somebody else.

And that's why PhysX does NOT have any impact on actual gameplay. It is for client-side eye candy only. Not to mention only half the gamers could even play with PhysX (the other half are on AMD), so you can't have the game rely on something that is not available across the board.

I think that they are simulating the ocean in BF4 server side so everyone sees the same waves. Pretty cool if you ask me.
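
For what it's worth, the standard trick for keeping "random" debris identical on every client (a sketch of the general technique, not DICE's actual code) is to have the server send one seed per destruction event, so every machine derives the same fragments:

```cpp
// Deterministic debris from a server-supplied seed (illustrative sketch).
#include <cstdint>
#include <random>
#include <vector>

struct Fragment { float vx, vy, vz; };

std::vector<Fragment> spawnDebris(uint32_t serverSeed, int count) {
    // Same seed on every client => same fragment velocities everywhere.
    // (A shipping engine would use its own portable RNG, since
    // std::uniform_real_distribution isn't bit-identical across stdlibs.)
    std::mt19937 rng(serverSeed);
    std::uniform_real_distribution<float> dir(-1.0f, 1.0f);
    std::vector<Fragment> frags;
    frags.reserve(count);
    for (int i = 0; i < count; ++i)
        frags.push_back({dir(rng), dir(rng), dir(rng)});
    return frags;
}
```

Client-side PhysX fluff never has to agree across machines, which is exactly why it can't be allowed to affect gameplay.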
 

seitur

Senior member
Jul 12, 2013
383
1
81
Whatever you say, the current situation, with the lack of an open standard for a good-enough GPU physics engine, does hurt games in general.

Imagine what could happen if GPU PhysX were an open standard, working on GPUs from all vendors.

I do understand Nvidia wanting to use its competitive advantage to its own benefit; after all, Nvidia's function is to bring money to its shareholders and not to please people, but no one can deny that the current situation is bad for us as consumers.
 