"Inevitable Bleak Outcome for nVidia's Cuda + Physx Strategy"


Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Scali
Originally posted by: munky
There's a good reason for that. Havok has been around longer than Physx.

Yea, but many developers don't like paying for Havok. Especially smaller developers.
Look at a game like I-Fluid. That would never have been possible with Havok, because a small game studio like that could never afford the license.
I-Fluid is hardly the kind of game I would spend time playing.


Originally posted by: munky
Doesn't mean anyone else should adopt those standards. Hence the low number of games using GPU-accelerated physx.

That's still more games than using any other kind of physics acceleration.
Yeah, and? I have an NV card, and none of those games are impressive enough for me to use GPU-physx.

Originally posted by: munky
I've heard that one before too. Except that Intel has no appropriate HW to compete with AMD's GPUs in physics acceleration, and in fact Intel has no GPU-accelerated physics implementation at all.

Not yet... Then again, ATi doesn't have any gpu-accelerated physics yet either.
I wouldn't be surprised if the accelerated version of Havok is released at the same time as Larrabee, sometime next year.
And believe me, Larrabee is going to do VERY well in physics.
All Larrabee speculation aside, it's pretty obvious why AMD chose Havok and not Physx.


Originally posted by: munky
So you make time to create one for that purpose, instead of doing a rush-job implementation so you can put another checkbox on your marketing slides.

If you think Cuda is a rush-job, we're done talking.

We aren't talking about Cuda, but rather the implementation of Physx based on Cuda.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Wreckage
Originally posted by: munky

Why, because Intel is a serious threat to AMD's GPU business? Didn't think so.

Did AMD stop selling CPUs? Did Intel stop work on Larrabee? :confused:

AMD is cutting off their nose to spite their face and it will cost them I'm sure.

There's a good reason for that. Havok has been around longer than Physx.
Since we are talking about GPU physics, who cares about Havok? That's a discussion for the "CPU" forum. Until at least one game is released using Havok on the GPU, it's all smoke and mirrors.

Pot, meet kettle.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Originally posted by: SlowSpyder
Originally posted by: Scali
Originally posted by: SlowSpyder
Exactly my point in the post of mine you quoted. I am a gamer, I enjoy playing many different types of games on my computer, yet from what I've seen of Physx, I just do not feel that I am missing out on anything, as it is right now. Will Physx be a must-have down the road? It may be. Right now it's very underwhelming.

Haven't we heard this many times before though?
Like with DX10 and Crysis for example... "Oh, you can barely tell the difference with DX9, and it's slower anyway... besides you need to get Vista".
Or with DX8 vs DX9, or SM2.0 vs SM3.0....
Basically every time some new technology popped up.

No, most technologies aren't going to take the world by storm... But if you look back... I think DX10 and Vista/Windows 7 are now pretty accepted, and most people who have DX10 will run Crysis in DX10 mode, even though DX9 may look *almost* as good and get higher framerates.
Just like I think most people with an nVidia card will enable PhysX effects in games like Mirror's Edge and Cryostasis, just because it makes it look that bit cooler and more realistic.
In a few years we may all have hardware that supports it, and we'll just laugh at these silly discussions... "Wow, did anyone ever think that DX10/PhysX would NOT be cool in games?"


How long ago did Physx launch though? Is it still new? It's been on the market for a while now (more than 2 years, right?) and how many games support it?

When the C2D launched, even AMD fanboys had to have a rig powered by it. How long did it take for hardware T&L to make any card without it obsolete? How long did it take for DX9 to completely make DX8 obsolete? Some techs take longer than others to really take off. How long has Physx been around now?

Unfair. Ageia was a small hardware shop with next to no sales and next to no software support from devs. Plainly, they were just too small. Nvidia has owned PhysX now for just a year, and already we have PhysX games. During the past year, how many tier 1, 2, 3 devs signed on to PhysX? How long do you think it takes to write a PhysX-based game? It won't be long before there is a flood of PhysX games. And don't say you've been hearing this for years now, because it's only been about a year since Nvidia purchased Ageia. February through April 2008, I believe, from the purchase agreement to the actual closing of the deal. Point is, to have ANYTHING to show for it this soon is pretty impressive. Mirror's Edge (not for everyone, but many found it pretty cool), Cryostasis now. They'll just keep coming in on a regular basis. It's time, IMHO.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Originally posted by: Keysplayr
Originally posted by: SlowSpyder
Originally posted by: Scali
Originally posted by: SlowSpyder
Exactly my point in the post of mine you quoted. I am a gamer, I enjoy playing many different types of games on my computer, yet from what I've seen of Physx, I just do not feel that I am missing out on anything, as it is right now. Will Physx be a must-have down the road? It may be. Right now it's very underwhelming.

Haven't we heard this many times before though?
Like with DX10 and Crysis for example... "Oh, you can barely tell the difference with DX9, and it's slower anyway... besides you need to get Vista".
Or with DX8 vs DX9, or SM2.0 vs SM3.0....
Basically every time some new technology popped up.

No, most technologies aren't going to take the world by storm... But if you look back... I think DX10 and Vista/Windows 7 are now pretty accepted, and most people who have DX10 will run Crysis in DX10 mode, even though DX9 may look *almost* as good and get higher framerates.
Just like I think most people with an nVidia card will enable PhysX effects in games like Mirror's Edge and Cryostasis, just because it makes it look that bit cooler and more realistic.
In a few years we may all have hardware that supports it, and we'll just laugh at these silly discussions... "Wow, did anyone ever think that DX10/PhysX would NOT be cool in games?"


How long ago did Physx launch though? Is it still new? It's been on the market for a while now (more than 2 years, right?) and how many games support it?

When the C2D launched, even AMD fanboys had to have a rig powered by it. How long did it take for hardware T&L to make any card without it obsolete? How long did it take for DX9 to completely make DX8 obsolete? Some techs take longer than others to really take off. How long has Physx been around now?

Unfair. Ageia was a small hardware shop with next to no sales and next to no software support from devs. Plainly, they were just too small. Nvidia has owned PhysX now for just a year, and already we have PhysX games. During the past year, how many tier 1, 2, 3 devs signed on to PhysX? How long do you think it takes to write a PhysX-based game? It won't be long before there is a flood of PhysX games. And don't say you've been hearing this for years now, because it's only been about a year since Nvidia purchased Ageia. February through April 2008, I believe, from the purchase agreement to the actual closing of the deal. Point is, to have ANYTHING to show for it this soon is pretty impressive. Mirror's Edge (not for everyone, but many found it pretty cool), Cryostasis now. They'll just keep coming in on a regular basis. It's time, IMHO.

It was a question; I don't see how a question is unfair. Physx has been on the market for a while now, but I see that Nvidia purchased them just over a year ago (Feb '08). I don't know how long ago Nvidia added the ability for the PPU functions to be handled by their GPUs. According to Scali it was "earlier this year," but I thought it was longer ago than the last five months. I don't know for sure, though.

But as you point out, it's still early, and based on that I think it's fair of me to say that so far it is very underwhelming. Be it because it's early or something else, so far Physx does not excite me. I've said it before and I'll say it again: maybe Physx will become a 'must-have'. Right now it isn't. Right now Physx seems to need cheerleading, as it can't stand on its own. Maybe in a year I'll be using a GT300 and loving my Physx experience. But as it is right now, a lot of people are like me: on the fence, not rushing to buy Physx-capable hardware, because it isn't 'there' yet.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: munky
Originally posted by: Wreckage
Originally posted by: munky

Why, because Intel is a serious threat to AMD's GPU business? Didn't think so.

Did AMD stop selling CPUs? Did Intel stop work on Larrabee? :confused:

AMD is cutting off their nose to spite their face and it will cost them I'm sure.

There's a good reason for that. Havok has been around longer than Physx.
Since we are talking about GPU physics, who cares about Havok? That's a discussion for the "CPU" forum. Until at least one game is released using Havok on the GPU, it's all smoke and mirrors.

Pot, meet kettle.

AMD has apparently been able to get Havok up and running on their GPUs using OpenCL, if you actually read the article, Wreckage. That's not too far off from what NV has with PhysX at this point.

I hope OpenCL brings us new effects in games that alter the experience in a serious way.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: munky
Originally posted by: Wreckage
Originally posted by: munky

Why, because Intel is a serious threat to AMD's GPU business? Didn't think so.

Did AMD stop selling CPUs? Did Intel stop work on Larrabee? :confused:

AMD is cutting off their nose to spite their face and it will cost them I'm sure.

There's a good reason for that. Havok has been around longer than Physx.
Since we are talking about GPU physics, who cares about Havok? That's a discussion for the "CPU" forum. Until at least one game is released using Havok on the GPU, it's all smoke and mirrors.

Pot, meet kettle.

So you agree with me?
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Originally posted by: SlowSpyder
Originally posted by: Keysplayr
Originally posted by: SlowSpyder
Originally posted by: Scali
Originally posted by: SlowSpyder
Exactly my point in the post of mine you quoted. I am a gamer, I enjoy playing many different types of games on my computer, yet from what I've seen of Physx, I just do not feel that I am missing out on anything, as it is right now. Will Physx be a must-have down the road? It may be. Right now it's very underwhelming.

Haven't we heard this many times before though?
Like with DX10 and Crysis for example... "Oh, you can barely tell the difference with DX9, and it's slower anyway... besides you need to get Vista".
Or with DX8 vs DX9, or SM2.0 vs SM3.0....
Basically every time some new technology popped up.

No, most technologies aren't going to take the world by storm... But if you look back... I think DX10 and Vista/Windows 7 are now pretty accepted, and most people who have DX10 will run Crysis in DX10 mode, even though DX9 may look *almost* as good and get higher framerates.
Just like I think most people with an nVidia card will enable PhysX effects in games like Mirror's Edge and Cryostasis, just because it makes it look that bit cooler and more realistic.
In a few years we may all have hardware that supports it, and we'll just laugh at these silly discussions... "Wow, did anyone ever think that DX10/PhysX would NOT be cool in games?"


How long ago did Physx launch though? Is it still new? It's been on the market for a while now (more than 2 years, right?) and how many games support it?

When the C2D launched, even AMD fanboys had to have a rig powered by it. How long did it take for hardware T&L to make any card without it obsolete? How long did it take for DX9 to completely make DX8 obsolete? Some techs take longer than others to really take off. How long has Physx been around now?

Unfair. Ageia was a small hardware shop with next to no sales and next to no software support from devs. Plainly, they were just too small. Nvidia has owned PhysX now for just a year, and already we have PhysX games. During the past year, how many tier 1, 2, 3 devs signed on to PhysX? How long do you think it takes to write a PhysX-based game? It won't be long before there is a flood of PhysX games. And don't say you've been hearing this for years now, because it's only been about a year since Nvidia purchased Ageia. February through April 2008, I believe, from the purchase agreement to the actual closing of the deal. Point is, to have ANYTHING to show for it this soon is pretty impressive. Mirror's Edge (not for everyone, but many found it pretty cool), Cryostasis now. They'll just keep coming in on a regular basis. It's time, IMHO.

It was a question; I don't see how a question is unfair. Physx has been on the market for a while now, but I see that Nvidia purchased them just over a year ago (Feb '08). I don't know how long ago Nvidia added the ability for the PPU functions to be handled by their GPUs. According to Scali it was "earlier this year," but I thought it was longer ago than the last five months. I don't know for sure, though.

But as you point out, it's still early, and based on that I think it's fair of me to say that so far it is very underwhelming. Be it because it's early or something else, so far Physx does not excite me. I've said it before and I'll say it again: maybe Physx will become a 'must-have'. Right now it isn't. Right now Physx seems to need cheerleading, as it can't stand on its own. Maybe in a year I'll be using a GT300 and loving my Physx experience. But as it is right now, a lot of people are like me: on the fence, not rushing to buy Physx-capable hardware, because it isn't 'there' yet.

Actually, I think having even one game this soon is quite a feat. It's all up to the devs you know.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Originally posted by: SickBeast
Originally posted by: munky
Originally posted by: Wreckage
Originally posted by: munky

Why, because Intel is a serious threat to AMD's GPU business? Didn't think so.

Did AMD stop selling CPUs? Did Intel stop work on Larrabee? :confused:

AMD is cutting off their nose to spite their face and it will cost them I'm sure.

There's a good reason for that. Havok has been around longer than Physx.
Since we are talking about GPU physics, who cares about Havok? That's a discussion for the "CPU" forum. Until at least one game is released using Havok on the GPU, it's all smoke and mirrors.

Pot, meet kettle.

AMD has apparently been able to get Havok up and running on their GPUs using OpenCL, if you actually read the article, Wreckage. That's not too far off from what NV has with PhysX at this point.

I hope OpenCL brings us new effects in games that alter the experience in a serious way.

The article...... Ok, link it.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: SickBeast


AMD has apparently been able to get Havok up and running on their GPUs using OpenCL, if you actually read the article, Wreckage. That's not too far off from what NV has with PhysX at this point.
I see, so a demo that no one but they can see = released games on the market?

Sure, that sounds the same. :roll:

Now I know you are trolling.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Wreckage
Originally posted by: munky
Originally posted by: Wreckage
Originally posted by: munky

Why, because Intel is a serious threat to AMD's GPU business? Didn't think so.

Did AMD stop selling CPUs? Did Intel stop work on Larrabee? :confused:

AMD is cutting off their nose to spite their face and it will cost them I'm sure.

There's a good reason for that. Havok has been around longer than Physx.
Since we are talking about GPU physics, who cares about Havok? That's a discussion for the "CPU" forum. Until at least one game is released using Havok on the GPU, it's all smoke and mirrors.

Pot, meet kettle.

So you agree with me?

Far from it. I'm pointing out the irony of you bringing up AMD's cpu business in a gpu-physics thread, and then dismissing Havok because no games accelerate it on a gpu yet.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: munky


Far from it. I'm pointing out the irony of you bringing up AMD's cpu business in a gpu-physics thread, and then dismissing Havok because no games accelerate it on a gpu yet.

There is no ATI; they are owned by AMD. So Intel is a competitor no matter how you slice it. Also, you ignored Larrabee altogether for some reason.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Wreckage
There is no ATI; they are owned by AMD. So Intel is a competitor no matter how you slice it.
My view of things is not so simplistic. When it comes to gpu-accelerated physics, Intel is no competition to AMD at all.

Also you ignored Larrabee altogether for some reason.

Because until Larrabee is an actual product you can buy, it's about as relevant to this discussion as some speculation about AMD's future gpu architecture or some rumors about games using gpu-accelerated Havok physics.
 

thilanliyan

Lifer
Jun 21, 2005
12,084
2,281
126
Originally posted by: Scali
You just get better performance/more effects if you use hardware acceleration through either an Ageia PPU or an nVidia GPU.

AFAIK that's not correct. Only a FEW games have true hardware acceleration support. Most games that use the PhysX API will see no benefit from a GeForce card; the exceptions are Mirror's Edge, Cryostasis, and a handful of other "games" not even worth mentioning.
 

thilanliyan

Lifer
Jun 21, 2005
12,084
2,281
126
Originally posted by: Keysplayr
Agreed!! ATI should have jumped on board. They probably still could. AFAIK, the door has always been open. Never closed.

The problem is that it's owned by nVidia (and I, as well as many others, can see nVidia using that position to sell more cards one way or another... which is their right, BUT in the long run I don't think consumers would benefit, and that part irks me). AFAIK something like AA is not owned by either company.

Originally posted by: Scali
Originally posted by: thilan29
Imagine if you could only run something like AA on either ATI or nV cards? That would not be cool.

There once was such a time, grasshopper...
ATi was the first with MSAA on the Radeon 9500/9700 series. nVidia only had the GeForce4, which only supported SSAA, which was too slow to actually use.
People rushed to buy the Radeon 9500/9700 simply because it was better.

Hehe, I'm not that young... first computer in 1992... but I didn't buy a "gaming" card until 2001 (GeForce2 MX400). I think the difference is that MSAA/SSAA was not owned by either company, correct? So eventually both could be used without it being financially beneficial to either company, and both on a level playing field. (I see some of the parallels with this argument, but I don't want to have to choose a card based on a gaming feature... I'd rather choose based on price, cooling, and warranty, mostly.) If PhysX and Havok both take off and I can use either with any card I choose to buy, then I'm a happy gamer and couldn't care less.
 

Scali

Banned
Dec 3, 2004
2,495
1
0
Originally posted by: munky
I-Fluid is hardly the kind of game I would spend time playing.

That has little to do with anything.

Originally posted by: munky
Yeah, and? I have an NV card, and none of those games are impressive enough for me to use GPU-physx.

Again, that has little to do with anything. The technology is there for both developers and end-users. There are currently no alternatives.

Originally posted by: munky
All Larrabee speculation aside, it's pretty obvious why AMD chose Havok and not Physx.

Is it? Care to explain then?

Originally posted by: munky
We aren't talking about Cuda, but rather the implementation of Physx based on Cuda.

And how would that be a rush-job? It does what it's supposed to do: run physics on the GPU. And it does this considerably faster than a CPU can, so it clearly accelerates the process.
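
To make that concrete, here's roughly the kind of data-parallel work a physics step boils down to: every body integrates independently, so thousands of them run at once on a GPU. This is a hypothetical CUDA sketch, not PhysX's actual internals (the names and the naive Euler step are mine):

// One thread per rigid body / particle; illustrative only.
__global__ void integrate(float3* pos, float3* vel, int n, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    vel[i].y -= 9.81f * dt;     // apply gravity
    pos[i].x += vel[i].x * dt;  // explicit Euler position update
    pos[i].y += vel[i].y * dt;
    pos[i].z += vel[i].z * dt;
}

// Launch one thread per body:
// integrate<<<(n + 255) / 256, 256>>>(pos, vel, n, dt);

A CPU walks through those bodies a few at a time; a GPU runs thousands of such threads in parallel, and that's where the speedup comes from.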
 

Scali

Banned
Dec 3, 2004
2,495
1
0
Originally posted by: SlowSpyder
I don't know how long ago Nvidia added the ability for the PPU functions to be handled by their GPU's. According to Scali it was, "earlier this year." But I thought that it was earlier than the last 5 months, but I don't know for sure.

I forgot that it was added to the G80 series at a later point.
The first drivers for GPU PhysX came out in June 2008, but for G92 only. I think the G80 support was released either late 2008 or early 2009.
But all in all, it's still less than a year ago.
 

Scali

Banned
Dec 3, 2004
2,495
1
0
Originally posted by: thilan29
AFAIK that's not correct. Only a FEW games have true hardware acceleration support. Most games that use the PhysX API will see no benefit from a GeForce card; the exceptions are Mirror's Edge, Cryostasis, and a handful of other "games" not even worth mentioning.

There are two reasons why that could be:
1) The developer chose to always use the CPU for PhysX.
2) The physics workload is so light that it is insignificant to the overall performance, and as such you won't notice a difference between CPU, GPU and PPU performance.

That has nothing to do with PhysX itself, but rather with choices that the developers made when writing the game.
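
For reason 1, the choice is literally one field in the scene setup. A sketch using the PhysX 2.x SDK as I remember it (treat the exact identifiers as assumptions and check the SDK headers):

#include "NxPhysics.h"

// Create the SDK and describe a scene.
NxPhysicsSDK* sdk = NxCreatePhysicsSDK(NX_PHYSICS_SDK_VERSION);

NxSceneDesc sceneDesc;
sceneDesc.gravity = NxVec3(0.0f, -9.81f, 0.0f);
sceneDesc.simType = NX_SIMULATION_SW;    // developer pins physics to the CPU...
// sceneDesc.simType = NX_SIMULATION_HW; // ...or lets a PPU/GPU take it
NxScene* scene = sdk->createScene(sceneDesc);

If the developer ships with the software scene type, or the scene only has a handful of objects (reason 2), a GeForce will show no benefit, and that's not PhysX's fault.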
 

Scali

Banned
Dec 3, 2004
2,495
1
0
Originally posted by: thilan29
The problem is that it's owned by nVidia (and I as well as many others could fathom nVidia using that position to sell more cards one way or another...which is their right BUT in the long run I don't think consumers would benefit and that part irks me).

What amazes me is that it somehow is not a problem that Havok is owned by Intel.

Originally posted by: thilan29
Hehe, I'm not that young... first computer in 1992... but I didn't buy a "gaming" card until 2001 (GeForce2 MX400). I think the difference is that MSAA/SSAA was not owned by either company, correct? So eventually both could be used without it being financially beneficial to either company, and both on a level playing field. (I see some of the parallels with this argument, but I don't want to have to choose a card based on a gaming feature... I'd rather choose based on price, cooling, and warranty, mostly.) If PhysX and Havok both take off and I can use either with any card I choose to buy, then I'm a happy gamer and couldn't care less.

The thing is, it is IMPOSSIBLE to make PhysX work on OpenCL currently.
We don't know if nVidia plans to do this. What we do know is that there's a painfully obvious reason why it doesn't work on ATi hardware yet: no vendor has shipped an OpenCL implementation.
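
For anyone who wants to see what "no OpenCL implementation" means in practice: the very first thing an OpenCL program does is enumerate platforms, and on current drivers there is simply nothing to enumerate. A minimal check (standard Khronos API calls, nothing vendor-specific):

#include <CL/cl.h>
#include <stdio.h>

int main(void)
{
    cl_uint numPlatforms = 0;
    // Ask how many OpenCL platforms (vendor runtimes) are installed.
    cl_int err = clGetPlatformIDs(0, NULL, &numPlatforms);
    if (err != CL_SUCCESS || numPlatforms == 0) {
        printf("No OpenCL implementation installed.\n");
        return 1;
    }
    printf("%u OpenCL platform(s) found.\n", (unsigned)numPlatforms);
    return 0;
}

Until ATi (or anyone) ships a driver that makes that count nonzero, "port PhysX to OpenCL" isn't even testable.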
 

Scali

Banned
Dec 3, 2004
2,495
1
0
Originally posted by: Azn
I don't know about MSAA being better. It was just faster.

It's better because you could actually USE it :)
Before MSAA, hardly anyone ever used AA at all, because it just killed your framerates, and games were no longer playable.
The same goes for AF, by the way; that was also first usable on the Radeon 9500/9700 series.

A purist might say that SSAA can look better, but back then it simply wasn't an option. Even today full-on SSAA is still a problem. We now have some nifty hybrid modes.
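
The framerate point is simple arithmetic. At 1600x1200 with 4x AA (illustrative numbers, not benchmarks), SSAA shades every sub-sample while MSAA shades once per pixel and only replicates coverage:

#include <cstdio>

int main()
{
    const long pixels = 1600L * 1200L;  // 1,920,000 pixels on screen
    const long ssaa = pixels * 4;       // SSAA 4x: shade every sub-sample
    const long msaa = pixels * 1;       // MSAA 4x: shade once per pixel
    std::printf("SSAA 4x: %ld shader runs, MSAA 4x: %ld shader runs\n",
                ssaa, msaa);
    return 0;
}

Roughly four times the pixel-shader work for SSAA; that's the "killed your framerates" part.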
 

SunnyD

Belgian Waffler
Jan 2, 2001
32,675
146
106
www.neftastic.com
Originally posted by: Scali
Funny you should say that, because the OpenGL and Direct3D APIs have in large part dictated how GPUs were designed and how they evolved (early GPUs were hopeless at OpenGL... nVidia was one of the first to have a GPU that could run most OpenGL code efficiently).
GPUs are designed to run OpenGL and Direct3D operations efficiently.
GPGPUs can be designed to run OpenCL operations efficiently.
nVidia's GPGPU is designed to run Cuda efficiently, and OpenCL is very similar to Cuda.
ATi's GPGPU is designed to run entirely different code efficiently, not Cuda, and as such not OpenCL or DX11 CS.

You have that backwards... CUDA was designed to run on Nvidia hardware efficiently. Just as Stream was designed to run on ATI hardware efficiently.

OpenCL isn't designed to run on any particular platform at all. OpenCL is NOT CUDA. Just because you can draw similarities between the two doesn't make it so. Otherwise, you might as well say Stream is basically CUDA (which you've already said it is not).

OpenCL will sit atop of Stream just as well as it will sit atop of CUDA. Some of the arbitrary constructs may appear similar to their CUDA/Stream counterparts, but do not make the mistake of saying OpenCL was designed around CUDA, because you would be flat out wrong.
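
For what it's worth, here is the same trivial SAXPY kernel in both, so you can judge the "similar" claim yourself (sketches based on the public CUDA and OpenCL specs; host code and error handling omitted):

// CUDA: compiled offline by nvcc, launched with <<<blocks, threads>>>.
__global__ void saxpy_cuda(int n, float a, const float* x, float* y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

// OpenCL C: shipped as source and compiled at runtime by whichever
// vendor's driver is present -- NVIDIA, AMD, or anyone else.
__kernel void saxpy_cl(int n, float a, __global const float* x,
                       __global float* y)
{
    int i = get_global_id(0);
    if (i < n) y[i] = a * x[i] + y[i];
}

The kernel bodies are nearly identical; the real difference is the delivery model. OpenCL compiles at runtime on whatever platform is underneath, which is exactly the "sits atop" point.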
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Originally posted by: thilan29
Originally posted by: Keysplayr
Agreed!! ATI should have jumped on board. They probably still could. AFAIK, the door has always been open. Never closed.

The problem is that it's owned by nVidia (and I, as well as many others, can see nVidia using that position to sell more cards one way or another... which is their right, BUT in the long run I don't think consumers would benefit, and that part irks me). AFAIK something like AA is not owned by either company.

You mean, something similar to what Intel is going to do.

 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Originally posted by: SunnyD
Originally posted by: Scali
Funny you should say that, because the OpenGL and Direct3D APIs have in large part dictated how GPUs were designed and how they evolved (early GPUs were hopeless at OpenGL... nVidia was one of the first to have a GPU that could run most OpenGL code efficiently).
GPUs are designed to run OpenGL and Direct3D operations efficiently.
GPGPUs can be designed to run OpenCL operations efficiently.
nVidia's GPGPU is designed to run Cuda efficiently, and OpenCL is very similar to Cuda.
ATi's GPGPU is designed to run entirely different code efficiently, not Cuda, and as such not OpenCL or DX11 CS.

You have that backwards... CUDA was designed to run on Nvidia hardware efficiently. Just as Stream was designed to run on ATI hardware efficiently.

OpenCL isn't designed to run on any particular platform at all. OpenCL is NOT CUDA. Just because you can draw similarities between the two doesn't make it so. Otherwise, you might as well say Stream is basically CUDA (which you've already said it is not).

OpenCL will sit atop of Stream just as well as it will sit atop of CUDA. Some of the arbitrary constructs may appear similar to their CUDA/Stream counterparts, but do not make the mistake of saying OpenCL was designed around CUDA, because you would be flat out wrong.

Ummm..... What?
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Scali
That has little to do with anything.
It means people aren't going to choose NV over AMD just because some no-name developer makes a crappy game with Physx.

Again, that has little to do with anything. The technology is there for both developers and end-users. There are currently no alternatives.
Again, unless blockbuster games like Crysis or Far Cry 2 begin using gpu-accelerated Physx, the technology will have little to do with anything that matters.

Is it? Care to explain then?
I already did.

And how would that be a rush-job? It does what it's supposed to do: run physics on the GPU. And it does this considerably faster than a CPU can, so it clearly accelerates the process.

It runs PhysX on Nvidia GPUs only, and only those that support Cuda. Not to mention it actually lowers performance when the same GPU is doing the rendering and the physics processing. It's hardly a surprise that few people care about it.

Cuda set the standard for GPGPU, and OpenCL is based on the Cuda model.
Care to explain that? Or are you just repeating blanket statements originating from Nvidia marketing?