AMD's Roy Taylor: PhysX/Cuda doomed?


SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
As for what AMD would do, look at the history of the company. Have they ever done anything on the level of Nvidia when it comes to exclusionary tactics? From a business perspective, maybe they should have played dirty pool like their competition. But from a consumer perspective, it hurts the market in general. So that begs the question for the people vehemently defending Nvidia's practices: why?

imho,

Hurts the market?

Where were you when native HD3D support was offered only to Radeon card owners? No biggie, apparently, because AMD was trying to create awareness for Radeons, and it personally asked nVidia to do more and work with the developers!

Neither practice hurts the market; both actually help it by offering customers more choice in less-than-ideal situations.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Guys, the PhysX in Arma 3 is really something. True realism.

Indeed!

Arma Developer said:
You can't see any PhysX particle effects in the game right now. This technology is only for our internal use at this moment. We are working on implementation and optimization of all this. We would like to have advanced (PhysX) particles in the game, but we can't promise anything right now.

http://forums.bistudio.com/showthre...h-discussion&p=2416707&viewfull=1#post2416707
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
You have to give credit for managing to find the few redeeming qualities of GPU PhysX. It's got to be tough to dig deep enough to find the few slivers of gold in such a large pile of turd.
 

omeds

Senior member
Dec 14, 2011
646
13
81
So what I've gathered from this thread is PhysX is crap. TressFX is awesome. amirite?
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
So what I've gathered from this thread is PhysX is crap. TressFX is awesome. amirite?

Of course you are. You're never wrong, after all it is your opinion. :)

I'd personally take PhysX over TressFX. But that's just my opinion. ;)
 

omeds

Senior member
Dec 14, 2011
646
13
81
Stop putting words in my mouth. I never said anything about TressFX being good.

PhysX is crap... just to be clear to you.

I think they're both good, but then again, [redacted].

Warning issued for inflammatory language.
-- stahlhart
 
Last edited by a moderator:

NIGELG

Senior member
Nov 4, 2009
852
31
91
I think they're both good, but then again, [redacted].
No they are not both good.

And yes you [redacted].

Warning issued for inflammatory language.
-- stahlhart
 
Last edited by a moderator:

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
The effects I'm mentioning are from this Youtube video.

There was a previous thread, forgot which part of the Anandtech forums, that pointed out different games that had similar/comparable particle effects WITHOUT needing PhysX. And yes, that was a thread discussing Hawken. Too lazy to search for it right now.

Well, I'd honestly like to know, because I've never seen any non-PhysX title use particle effects on such a scale, or use turbulence.
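(For readers unfamiliar with the term: "turbulence" here means particles being advected through a procedural velocity field each frame. A toy sketch of the idea, with invented names, nothing to do with Nvidia's actual APEX Turbulence code:)

```cpp
// Toy "turbulence": particles advected through a procedural velocity field.
// All names here are invented for illustration; this is not Nvidia's code.
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };

// A swirling field around the y-axis plus a small sinusoidal updraft.
// Real turbulence fields are typically noise-based (e.g. curl noise).
Vec3 velocityAt(const Vec3& p, float t) {
    return { -p.z + 0.3f * std::sin(p.y * 4.0f + t),
              0.5f + 0.2f * std::cos(p.x * 3.0f),
              p.x + 0.3f * std::sin(p.y * 5.0f - t) };
}

int main() {
    std::vector<Vec3> particles(100000);
    for (size_t i = 0; i < particles.size(); ++i)      // scatter start positions
        particles[i] = { std::sin(i * 0.1f), 0.0f, std::cos(i * 0.1f) };

    const float dt = 1.0f / 60.0f;
    for (int frame = 0; frame < 600; ++frame)          // 10 seconds at 60 fps
        for (auto& p : particles) {                    // independent per particle
            Vec3 v = velocityAt(p, frame * dt);
            p.x += v.x * dt; p.y += v.y * dt; p.z += v.z * dt;
        }

    std::printf("particle 0 ended at (%.3f, %.3f, %.3f)\n",
                particles[0].x, particles[0].y, particles[0].z);
}
```

Every particle samples the field independently, so the work is embarrassingly parallel; that's why pushing counts into the hundreds of thousands is a natural fit for the GPU.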

Now, it's been a while since I've seen that quote from nVidia. But it gets funny when you consider that nVidia is telling people that it can't be bothered to support its own paying customers. Yeah. Really. nVidia can't be bothered to support nVidia video cards as PhysX co-processors.

It's not just a matter of supporting their own hardware, but AMD hardware as well. If there is such a tight collaboration between the rendering and PhysX hardware as NVidia claims (and there's no reason to disbelieve them, as nobody here programs drivers), then they would have to validate AMD drivers as well.

Now why should they do that?

nVidia also took the time and resources to disable nVidia PhysX cards from working with AMD GPUs. Apparently nVidia cards work pretty well as PhysX processors alongside AMD GPUs (once you hack it to work), and that's with nVidia actively sabotaging support. nVidia could easily have left in the ability to use an nVidia card with AMD for PhysX. This would have greatly expanded PhysX support, with nVidia selling many more low-to-mid-range cards for use as PhysX processors.

Yes, NVidia did disable PhysX cards from working with AMD GPUs, and for technical reasons if what they say is true.

But even if it isn't, it's their right to do so. If they are comfortable with any sales losses caused by doing that, then that's on them.

You might want to look up what the original vision for PhysX was before Ageia was gobbled up by nVidia. A shame considering that PhysX now is for the most part used for nothing more than a few more shiny particles or flapping cloth. PhysX under nVidia's stewardship has been a mess.

I don't need to look it up. I remember it fairly well. I even played the CellFactor demo on my CPU at the time. The fact that it could run on my CPU (can't even remember what I was running back then, but I think it was a Conroe processor) goes to show that it wasn't very compute intensive.

And as has been mentioned several times before, the "eye candy physics" you scoff at is STILL physics, and is in fact much more difficult to run than the kind of physics you are glorifying.

While no one will argue with Intel wanting physics to remain on the CPU, Havok is not, and never was, moving towards becoming a CPU-only physics solution. Havok can be accelerated by the GPU using OpenCL. Havok hasn't exactly been on the forefront of physics processing, but look up Havok and GPU acceleration on the PlayStation 4 as well, which of course uses an AMD CPU and GPU.

Very well, I'll concede this point. But still, Havok is basically dead in the water right now. The Witcher 2 used Havok for physics, but now the Witcher 3 will be using PhysX.

That's pretty indicative.

From a business standpoint, it made sense for nVidia to loosen control in order to make more money in the long run. nVidia has made puzzling decisions over the years, like half-assing CPU PhysX, that seem to limit adoption of PhysX.

You're postulating now. One thing we can be certain of is that NVidia is out to make money. Whether it made more business sense for them to loosen control or tighten control is unknowable to us.

And CPU PhysX and hardware-accelerated PhysX have now merged, for all intents and purposes. The only difference between them now is the code path they run, and the performance.
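(To illustrate what "same API, different code path" means in practice, here's a minimal sketch of the pattern. The names below are invented for illustration and are NOT the real PhysX SDK classes:)

```cpp
// Hypothetical "one API, two code paths" pattern: the caller is identical
// whether the simulation runs on the CPU or is offloaded to the GPU.
#include <cstdio>
#include <memory>

struct SceneBackend {                          // interface both paths implement
    virtual void simulate(float dt) = 0;
    virtual ~SceneBackend() = default;
};

struct CpuBackend : SceneBackend {             // multithreaded CPU solver
    void simulate(float dt) override { std::printf("CPU step, dt=%.4f\n", dt); }
};

struct GpuBackend : SceneBackend {             // CUDA-style offloaded solver
    void simulate(float dt) override { std::printf("GPU step, dt=%.4f\n", dt); }
};

// The game calls the same factory and the same simulate() either way;
// only the backend selected at startup differs.
std::unique_ptr<SceneBackend> createScene(bool gpuAvailable) {
    return gpuAvailable ? std::unique_ptr<SceneBackend>(new GpuBackend)
                        : std::unique_ptr<SceneBackend>(new CpuBackend);
}

int main() {
    auto scene = createScene(false);           // e.g. no capable GPU detected
    for (int i = 0; i < 3; ++i) scene->simulate(1.0f / 60.0f);
}
```

The calling code never changes; the observable difference is performance, exactly as described above.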

As it now stands, I think PhysX support will start to wane. Most PhysX support was due to the fact that nVidia was practically footing the bill for developers to implement PhysX.

This simply isn't true, and is an oft-quoted lie perpetuated by the anti-PhysX crowd. NVidia does not pay developers to use PhysX; that's ridiculous. What NVidia does do is send developers software engineers to help them implement it in their games, but no financial support is given.

What happens when it makes more financial sense to build game engines using something like Havok, which will run on AMD GPUs and nVidia GPUs? Not to mention the same game engine will run on game consoles.

Is Havok even being developed for GPU physics since it was taken over by Intel?
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,552
136
But does the market support CUDA and PhysX enough for nVidia to continue to support and invest in them?

I would say yes, based on their share, revenue and margins!

As far as PhysX goes, I've already given my opinion on that and my feeling is a more vendor agnostic solution (possibly Havok) is the one that will finally emerge victorious.

As for CUDA, it is currently the best for utilizing GPGPU, but obviously only on an nVidia GPU. AMD simply doesn't have its crap together on the compute front. This is definitely one front where nVidia has been absolutely dominant. I don't see that changing any time soon. For the foreseeable future, it's CUDA over OpenCL. The thing that may eventually tilt things in favor of OpenCL is support from Intel and AMD combined. But judging by Intel's lack of a viable high-end GPU solution and AMD's many screwups, I wouldn't count on it.
 

omeds

Senior member
Dec 14, 2011
646
13
81
More have touched it than any alternative at this point. I have enjoyed using it in a number of games, which is better than zero.

I look forward to the day DirectCompute or some other API is used in every single game, to end this BS and give us all PhysX-level effects.
 

Mr Expert

Banned
Aug 8, 2013
175
0
0
More have touched it than any alternative at this point.
There is no alternative to closed-source PhysX. Also, no game devs have touched PhysX since Borderlands 2. PhysX is in all of two new games, Metro: Last Light and Batman: Arkham Origins, and those are games that already had ties and contracts to run PhysX, so there really aren't any new games that will run PhysX. Crysis is now AMD, BF4 is AMD, and most other key developer relationships have jumped ship to AMD. PhysX is dead.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
There is no alternative to closed-source PhysX. Also, no game devs have touched PhysX since Borderlands 2.

Wrong. Sony Online Entertainment uses it in PlanetSide 2 and EverQuest Next, and CDPR's Witcher 3 is in the pipeline.

There's a possibility that ARMA 3 may have GPU PhysX as well, but right now it's CPU-only.

Physx is dead.

Keep dreaming dude ;)
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,552
136
Well, I'd honestly like to know, because I've never seen any non-PhysX title use particle effects on such a scale, or use turbulence.

Here is the previous thread on Hawken.

It's not just a matter of supporting their own hardware, but AMD hardware as well. If there is such a tight collaboration between the rendering and PhysX hardware as NVidia claims (and there's no reason to disbelieve them, as nobody here programs drivers), then they would have to validate AMD drivers as well.

Now why should they do that?

Yes, NVidia did disable PhysX cards from working with AMD GPUs, and for technical reasons if what they say is true.

But even if it isn't, it's their right to do so. If they are comfortable with any sales losses caused by doing that, then that's on them.

nVidia disabling PhysX for AMD users always seemed more like retaliation than a response to any real technical limitation. nVidia would need minimal work to ensure that PhysX works alongside AMD video cards. Go read the comments by those who have actually enabled an AMD card with an nVidia card for PhysX: most of the work seems to be disabling the AMD lockout purposely put in by nVidia.

I never said nVidia wasn't within their rights, just that they are lying when they say it's going to take a lot of time and resources to support AMD. From the community hacks that enable PhysX co-processing with an AMD/nVidia combo, it seems like most of the work is disabling the vendor lockout put in by nVidia. nVidia could even create goodwill by not locking out AMD: they could officially decline to support AMD configurations and even put in a disclaimer saying they aren't guaranteed to work.

The real kicker to me is that any cost associated with supporting AMD/nVidia PhysX combos would be offset by the extra revenue generated.

I don't need to look it up. I remember it fairly well. I even played the CellFactor demo on my CPU at the time. The fact that it could run on my CPU (can't even remember what I was running back then, but I think it was a Conroe processor) goes to show that it wasn't very compute intensive.

And as has been mentioned several times before, the "eye candy physics" you scoff at is STILL physics, and is in fact much more difficult to run than the kind of physics you are glorifying.

Yes, but since you want to sidestep the question, the original vision of PhysX was to use physics processing to enhance realism in games. Has it done that? No. Has it come anywhere close to that? No.

For all of the touting of PhysX being better, why is it that using it for its original purpose is too difficult for nVidia? Why is it only used for some flapping cloth and a few more sparklies, both of which are available with other competing technologies?

Very well, I'll concede this point. But still, Havok is basically dead in the water right now. The Witcher 2 used Havok for physics, but now the Witcher 3 will be using PhysX.

That's pretty indicative.

And Havok is dead in the water because it's not being touted as much? You think new game engines will ignore Havok if they can create an engine with physics that runs GPU-accelerated on PCs and consoles? Yes, it's not as prevalent as PhysX, but dead in the water? I'll respectfully disagree.

You're postulating now. One thing we can be certain of is that NVidia is out to make money. Whether it made more business sense for them to loosen control or tighten control is unknowable to us.

Well, judging by how dominant PhysX is (meaning not dominant at all), and how much developers pay to use PhysX (meaning nVidia foots most of the bill for PhysX integration), it seems like nVidia is doing a great job of making money off PhysX.

And CPU PhysX and hardware-accelerated PhysX have now merged, for all intents and purposes. The only difference between them now is the code path they run, and the performance.

They're still not equal. CPU PhysX still does not use the latest CPU acceleration technologies, and nVidia didn't update CPU PhysX until they got caught red-handed half-assing it.
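(For context, the "caught red-handed" episode refers to the widely reported finding that CPU PhysX binaries were still being compiled for x87 floating point rather than SSE. The difference is roughly this; a toy integration step, not actual PhysX code:)

```cpp
// Scalar vs. SSE position update: a rough illustration of the kind of
// speedup at stake in the x87-vs-SSE controversy. Not actual PhysX code.
#include <xmmintrin.h>   // SSE intrinsics
#include <cstdio>

// Scalar path: one position updated per loop iteration.
void integrateScalar(float* pos, const float* vel, int n, float dt) {
    for (int i = 0; i < n; ++i)
        pos[i] += vel[i] * dt;
}

// SSE path: four positions per iteration (n assumed a multiple of 4,
// buffers assumed 16-byte aligned).
void integrateSSE(float* pos, const float* vel, int n, float dt) {
    __m128 vdt = _mm_set1_ps(dt);
    for (int i = 0; i < n; i += 4) {
        __m128 p = _mm_load_ps(pos + i);
        __m128 v = _mm_load_ps(vel + i);
        _mm_store_ps(pos + i, _mm_add_ps(p, _mm_mul_ps(v, vdt)));
    }
}

int main() {
    alignas(16) float a[8] = {0}, b[8] = {0};
    alignas(16) float vel[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    integrateScalar(a, vel, 8, 0.016f);   // same result either way...
    integrateSSE(b, vel, 8, 0.016f);      // ...but 4 lanes at a time here
    std::printf("scalar a[7]=%f, sse b[7]=%f\n", a[7], b[7]);
}
```

Compilers of that era could emit the SSE version more or less automatically with a build flag (e.g. /arch:SSE2 on MSVC), which is why the omission drew so much criticism.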

This simply isn't true, and is an oft-quoted lie perpetuated by the anti-PhysX crowd. NVidia does not pay developers to use PhysX; that's ridiculous. What NVidia does do is send developers software engineers to help them implement it in their games, but no financial support is given.

I never said nVidia pays developers to integrate PhysX. Rather, nVidia provides developers and engineers to "help" implement PhysX. In other words, nVidia foots the bill for PhysX integration.

Is Havok even being developed for GPU physics since it was taken over by Intel?

Umm... see my comment about how Havok is being GPU-accelerated on the PlayStation 4, which runs on an AMD CPU and GPU. The latest version of Havok was unveiled earlier this year.
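(Since the thread keeps circling back to it, this is why "Havok over OpenCL" matters to the argument: an OpenCL kernel is vendor-agnostic, so the same physics step runs unchanged on an AMD or an nVidia GPU. A minimal sketch of OpenCL 1.x host code, with error handling and cleanup omitted for brevity:)

```cpp
// Minimal OpenCL sketch: the same particle-integration kernel runs on an
// AMD or an nVidia GPU unchanged -- the appeal of a vendor-agnostic path.
#include <CL/cl.h>
#include <cstdio>
#include <vector>

static const char* kSrc = R"(
__kernel void integrate(__global float* pos, __global const float* vel, float dt) {
    int i = get_global_id(0);
    pos[i] += vel[i] * dt;
})";

int main() {
    // Pick the first GPU the runtime reports, whatever the vendor.
    cl_platform_id platform; clGetPlatformIDs(1, &platform, nullptr);
    cl_device_id device;     clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, nullptr);
    cl_context ctx = clCreateContext(nullptr, 1, &device, nullptr, nullptr, nullptr);
    cl_command_queue q = clCreateCommandQueue(ctx, device, 0, nullptr);

    const size_t n = 1024;
    std::vector<float> pos(n, 0.0f), vel(n, 1.0f);
    cl_mem dPos = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                 n * sizeof(float), pos.data(), nullptr);
    cl_mem dVel = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                                 n * sizeof(float), vel.data(), nullptr);

    // Compile the kernel at runtime for whatever device we landed on.
    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSrc, nullptr, nullptr);
    clBuildProgram(prog, 1, &device, nullptr, nullptr, nullptr);
    cl_kernel k = clCreateKernel(prog, "integrate", nullptr);

    float dt = 1.0f / 60.0f;
    clSetKernelArg(k, 0, sizeof(cl_mem), &dPos);
    clSetKernelArg(k, 1, sizeof(cl_mem), &dVel);
    clSetKernelArg(k, 2, sizeof(float), &dt);
    clEnqueueNDRangeKernel(q, k, 1, nullptr, &n, nullptr, 0, nullptr, nullptr);
    clEnqueueReadBuffer(q, dPos, CL_TRUE, 0, n * sizeof(float), pos.data(),
                        0, nullptr, nullptr);
    std::printf("pos[0] after one step: %f\n", pos[0]);
}
```

Whether Havok actually ships such a path broadly is a separate question, but that is the portability argument in a nutshell.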
 

Mr Expert

Banned
Aug 8, 2013
175
0
0
Wrong. Sony Online Entertainment uses it in PlanetSide 2 and EverQuest Next, and CDPR's Witcher 3 is in the pipeline.

There's a possibility that ARMA 3 may have GPU PhysX as well, but right now it's CPU-only.



Keep dreaming dude ;)

And then list all the other games that used Havok and other physics engines. BF3 and BFBC have far better physics than nVidia can offer, and they do it in a way that does not cripple the framerate or require a $500 GPU just to run with max settings. PlanetSide runs like a mess, and so does Arma. We will see about the Witcher 3, but I would bet that it will end up being CPU PhysX only. Let's just hope that nVidia does not cripple performance if they put CPU PhysX in the Witcher 3. Just look at how many new games have jumped on board with AMD; this is a clear indicator that the market is phasing out PhysX.
 

FiendishMind

Member
Aug 9, 2013
60
14
81
I registered to point out that you can learn about some of the history behind PhysX at http://physxinfo.com/wiki/Main_Page, and to ask why Nvidia seems to get all the blame for not updating the SDK's x86 SIMD and multithreading capabilities, when Ageia had control of the software for four years before being bought by Nvidia and made no effort to update those capabilities either.
 