"Inevitable Bleak Outcome for nVidia's Cuda + Physx Strategy"


Scali

Banned
Dec 3, 2004
2,495
1
0
Originally posted by: SunnyD
The only embarrassing thing here is the misinformation being spread. But then again, I suppose that the assumption is the bulk of the people that visit the forum are like your typical Best Buy consumer and wouldn't know a video card from a blender, let alone what an API is.

Even a Best Buy consumer can go to Wikipedia:
http://en.wikipedia.org/wiki/API

Besides, it is YOUR understanding of what an API is that is in question, not that of anyone else in this thread.
Clearly, by the Wikipedia definition of an API, the API is only a part of Cuda, not the whole of it.
Cuda consists of:
1) The GPU engine (hardware)
2) The instruction set
3) The driver API
4) Various programming languages to write kernels (including C for Cuda)

OpenCL replaces 3) and 4), while 1) and 2) remain unchanged in Cuda.
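
To make that split concrete, here's a minimal sketch: the same trivial vector-add kernel written in C for Cuda and, in the comment, in OpenCL C. Only layers 3) and 4) differ; on nVidia hardware both are compiled to the same PTX instruction set and run on the same GPU engine. (Illustrative only; buffers are left uninitialised.)

// C for Cuda: layer 4) is the kernel language, layer 3) is the runtime/driver API beneath it.
#include <cuda_runtime.h>

__global__ void vecAdd(const float* a, const float* b, float* c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}

/* Equivalent OpenCL C kernel -- a different language 4) and host API 3),
   but on nVidia hardware it is compiled to the same instruction set 2)
   and executed by the same GPU engine 1):

   __kernel void vecAdd(__global const float* a, __global const float* b,
                        __global float* c, int n)
   {
       int i = get_global_id(0);
       if (i < n)
           c[i] = a[i] + b[i];
   }
*/

int main()
{
    const int n = 1024;
    float *a, *b, *c;                      // device buffers (uninitialised; this is only a sketch)
    cudaMalloc(&a, n * sizeof(float));
    cudaMalloc(&b, n * sizeof(float));
    cudaMalloc(&c, n * sizeof(float));
    vecAdd<<<(n + 255) / 256, 256>>>(a, b, c, n);   // <<< >>> launch syntax belongs to layer 4)
    cudaDeviceSynchronize();
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}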

In other words: you're wrong.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: Keysplayr
Originally posted by: SickBeast
Originally posted by: munky
Originally posted by: Wreckage
Originally posted by: munky

Why, because Intel is a serious threat to AMD's GPU business? Didn't think so.

Did AMD stop selling CPUs? Did Intel stop work on Larrabee? :confused:

AMD is cutting off their nose to spite their face and it will cost them I'm sure.

There's a good reason for that. Havok has been around longer than Physx.
Since we are talking about GPU physics, who cares about Havok. That's a discussion for the "CPU" forum. Until at least one game is released using Havok on the GPU it's all smoke and mirrors.

Pot, meet kettle.

AMD has apparently been able to get Havok up and running on their GPUs using OpenCL if you actually read the article, Wreckage. That's not too far off from what NV has with PhysX at this point.

I hope OpenCL brings us new effects in games that alter the experience in a serious way.

The article...... Ok, link it.

I realize that you're here to do PR work for NV, but I shouldn't have to link an article for you that is directly linked in the OP.

If you guys are going to comment in a thread, the least you could do would be to read the OP and as a bare minimum, skim over any articles which are linked.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: Wreckage
Originally posted by: SickBeast


AMD has apparently been able to get Havok up and running on their GPUs using OpenCL if you actually read the article, Wreckage. That's not too far off from what NV has with PhysX at this point.
I see so a demo that no one but they can see = released games on the market?

Sure that sounds the same :roll:

Now I know you are trolling.

I said that they're not too far off, not that they are the same. :light:

It would be nice if you would actually read my posts and take what I say at face value.

The stuff that NV has done so far has not altered gameplay in any way. IMO there is not much difference between a demo and a few added effects that do not alter gameplay.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Originally posted by: SickBeast
Originally posted by: Keysplayr
Originally posted by: SickBeast
Originally posted by: munky
Originally posted by: Wreckage
Originally posted by: munky

Why, because Intel is a serious threat to AMD's GPU business? Didn't think so.

Did AMD stop selling CPUs? Did Intel stop work on Larrabee? :confused:

AMD is cutting off their nose to spite their face and it will cost them I'm sure.

There's a good reason for that. Havok has been around longer than Physx.
Since we are talking about GPU physics, who cares about Havok. That's a discussion for the "CPU" forum. Until at least one game is released using Havok on the GPU it's all smoke and mirrors.

Pot, meet kettle.

AMD has apparently been able to get Havok up and running on their GPUs using OpenCL if you actually read the article, Wreckage. That's not too far off from what NV has with PhysX at this point.

I hope OpenCL brings us new effects in games that alter the experience in a serious way.

The article...... Ok, link it.

I realize that you're here to do PR work for NV, but I shouldn't have to link an article for you that is directly linked in the OP.

If you guys are going to comment in a thread, the least you could do would be to read the OP and as a bare minimum, skim over any articles which are linked.

You realize little. I actually thought you were referring to another article, not the one in the OP. My bad. But you really don't have to be rude about it.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Originally posted by: SickBeast
Originally posted by: Wreckage
Originally posted by: SickBeast


AMD has apparently been able to get Havok up and running on their GPUs using OpenCL if you actually read the article, Wreckage. That's not too far off from what NV has with PhysX at this point.
I see so a demo that no one but they can see = released games on the market?

Sure that sounds the same :roll:

Now I know you are trolling.

I said that they're not too far off, not that they are the same. :light:

It would be nice if you would actually read my posts and take what I say at face value.

The stuff that NV has done so far has not altered gameplay in any way. IMO there is not much difference between a demo and a few added effects that do not alter gameplay.

Some more PhysX games are not too far off as well. Does that count? Or is that only a perk for ATI and Havok?

BTW: I read your post. And this is the face value I got from it.

AA doesn't alter gameplay, but would you buy a card without it? Nahh...
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Scali
Originally posted by: munky
Great, so does software rendering.
What's your point?
My point is that PhysX can run on CPU just like Havok. Some people still don't seem to understand that, and post nonsense that PhysX only runs on nVidia GPUs.
The GPU/PPU acceleration is a purely optional feature, which Havok doesn't have. In no way is that an advantage for Havok.

If you're gonna claim x86 CPUs are too slow to run physics acceptably, then why bring them up as an alternative platform? If a person is looking to have the extra physics effects as offered on the GPU, then x86 CPUs are not an option.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: Keysplayr
Originally posted by: SickBeast
Originally posted by: Wreckage
Originally posted by: SickBeast


AMD has apparently been able to get Havok up and running on their GPUs using OpenCL if you actually read the article, Wreckage. That's not too far off from what NV has with PhysX at this point.
I see so a demo that no one but they can see = released games on the market?

Sure that sounds the same :roll:

Now I know you are trolling.

I said that they're not too far off, not that they are the same. :light:

It would be nice if you would actually read my posts and take what I say at face value.

The stuff that NV has done so far has not altered gameplay in any way. IMO there is not much difference between a demo and a few added effects that do not alter gameplay.

Some more PhysX games are not too far off as well. Does that count? Or is that only a perk for ATI and Havok?

BTW: I read your post. And this is the face value I got from it.

AA doesn't alter gameplay, but would you buy a card without it? Nahh...

Sorry what did you take from my post? Are you saying that you and Wreckage are the same person? :Q

Or are you simply agreeing with him? It sounded like you were replying for him.
 

Scali

Banned
Dec 3, 2004
2,495
1
0
Originally posted by: munky
If you're gonna claim x86 CPUs are too slow to run physics acceptably, then why bring them up as an alternative platform?

I didn't say you can't do any physics on the CPU at all. Should be pretty obvious, since games have been doing physics on the CPU for years. Everyone knows that.
You CAN do physics on the CPU, and both PhysX and Havok allow you to do that.
But in both cases, the CPU's performance is going to limit the amount of effects you will be able to use. Therefore, effects like cloth, water, soft bodies, etc. will not be an option. Which is why we haven't seen them in any games without accelerated physics, regardless of whether the games used Havok, PhysX, or some other API.

Originally posted by: munky
If a person is looking to have the extra physics effects as offered on the GPU, then x86 CPUs are not an option.

Exactly, which means that currently Havok is not an option.
If you don't want to use the extra physics, then you could still use PhysX on an x86 CPU instead of Havok. The extra physics effects aren't the only thing that PhysX does.

Again, I was responding to the claim that PhysX would only run on nVidia GPUs. This simply is not true. If you use PhysX on an x86 CPU, it's still a good alternative to Havok on CPU. PhysX just allows you the OPTION of GPU/PPU acceleration and extra effects. Havok doesn't.
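
In code terms, the way a game typically consumes this kind of middleware looks roughly like the sketch below. The interface and names here are made up purely for illustration (this is NOT the actual PhysX or Havok API); the point is that the CPU path is always there, and GPU/PPU acceleration is just an optional switch that unlocks the heavier effects.

#include <iostream>
#include <memory>

// Hypothetical middleware-style interface -- illustrative only, not the real PhysX/Havok API.
struct PhysicsScene {
    virtual ~PhysicsScene() = default;
    virtual void simulate(float dt) = 0;
};

struct CpuScene : PhysicsScene {
    void simulate(float dt) override {
        // Rigid bodies, joints, ragdolls: runs fine on any x86 CPU, like Havok's CPU path.
        std::cout << "CPU step of " << dt << "s\n";
    }
};

struct GpuAcceleratedScene : PhysicsScene {
    void simulate(float dt) override {
        // Same rigid-body work, plus the heavy extras (cloth, fluids, soft bodies)
        // that only become practical with GPU/PPU acceleration.
        std::cout << "GPU-accelerated step of " << dt << "s\n";
    }
};

// The game asks for acceleration but must always be able to fall back to the CPU path.
std::unique_ptr<PhysicsScene> createScene(bool accelerationAvailable) {
    if (accelerationAvailable)
        return std::make_unique<GpuAcceleratedScene>();
    return std::make_unique<CpuScene>();
}

int main() {
    auto scene = createScene(/*accelerationAvailable=*/false);   // no nVidia GPU? still works
    for (int frame = 0; frame < 3; ++frame)
        scene->simulate(1.0f / 60.0f);
    return 0;
}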

I didn't think it would be THAT hard to understand. And I am amazed at how selectively my posts are read, and how they are only partially understood or pulled out of context. It seems deliberate (I give you the benefit of the doubt that you're not really THAT thick. Sadly that means that I think you are trolling and being obnoxious on purpose).
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Originally posted by: Wreckage
Originally posted by: Modelworks


That is your opinion, and when you become a developer or have worked on published titles, I'll value it.

Wow that did not sound arrogant at all.

The problem is that you are looking at it from a gamer's perspective. You want all this great stuff in games, and when you see something like PhysX you start thinking that now games can have awesome physics. I look at it from the developer's point of view. I know that game physics is not a priority. It takes a back seat to things like game flow, character design and overall gameplay.

Gamers start thinking that something like PhysX is push-the-button and you have wonderful physics. It would be great if, when developing a game, we could spend hundreds of hours on everything that we want, but that is not realistic. You have to prioritize, and that is why PhysX comes next to last. It just is not that important. Whereas something like Euphoria has a much bigger impact on the overall game, and without requiring extra hardware to do it.

The reason Euphoria is more exciting is that it adds something to game development that has never been available before. You could not do it before even if you had 100 CPU cores at your disposal; the software was not there. Physics has always been available for games, and current CPUs are more than capable of doing what is required for current games.
 

Scali

Banned
Dec 3, 2004
2,495
1
0
Originally posted by: Modelworks
Physics has always been available for games, and current CPUs are more than capable of doing what is required for current games.

That's pretty obvious, since the physics content in games was designed to run well enough on current CPUs.
Although... if you blow up enough stuff at the same time in a game like Crysis with physics at the highest settings, you'll still see many current CPUs struggling and dropping to single-digit framerates.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally posted by: Keysplayr
Originally posted by: SickBeast
Originally posted by: Wreckage
Originally posted by: SickBeast


AMD has apparently been able to get Havok up and running on their GPUs using OpenCL if you actually read the article, Wreckage. That's not too far off from what NV has with PhysX at this point.
I see so a demo that no one but they can see = released games on the market?

Sure that sounds the same :roll:

Now I know you are trolling.

I said that they're not too far off, not that they are the same. :light:

It would be nice if you would actually read my posts and take what I say at face value.

The stuff that NV has done so far has not altered gameplay in any way. IMO there is not much difference between a demo and a few added effects that do not alter gameplay.

Some more PhysX games are not too far off as well. Does that count? Or is that only a perk for ATI and Havok?

BTW: I read your post. And this is the face value I got from it.

AA doesn't alter gameplay, but would you buy a card without it? Nahh...

Since you and Wreckage both like AA a lot, and (IIRC) DX10.1 allows much faster AA (or can do) then surely NV are the worst company because they don't support DX10.1 so they can't have AA as high because it would slow the frame rate down too much.
Who cares if it only applies to a handful of titles, it improves graphical quality and/or frame rate.

Why buy a card with PhysX but no DX10.1 over a card with DX10.1 but no PhysX? Surely they both have the same value?
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: Lonyo


Since you and Wreckage both like AA a lot, and (IIRC) DX10.1 allows much faster AA (or can do) then surely NV are the worst company because they don't support DX10.1 so they can't have AA as high because it would slow the frame rate down too much.
Who cares if it only applies to a handful of titles, it improves graphical quality and/or frame rate.

Why buy a card with PhysX but no DX10.1 over a card with DX10.1 but no PhysX? Surely they both have the same value?

Yes but luckily NVIDIA cards are faster in general so it more than makes up for the difference. Not to mention when not using 10.1 then there is zero performance increase meaning you should have just bought the faster card.
 

Scali

Banned
Dec 3, 2004
2,495
1
0
Originally posted by: Lonyo
Since you and Wreckage both like AA a lot, and (IIRC) DX10.1 allows much faster AA (or can do) then surely NV are the worst company because they don't support DX10.1 so they can't have AA as high because it would slow the frame rate down too much.

Actually, nVidia DOES support the improved AA techniques that DX10.1 offers through a driver extension. For example, Far Cry 2 uses DX10.1 if possible, or the NVAPI on compatible nVidia cards. So you're not missing out on the AA.

Aside from that, what you're proposing is a fallacy. Two wrongs don't make a right.
It would be better if nVidia supported DX10.1 as well, just like it would be better if ATi supported PhysX.

But nVidia will support DX10.1 soon enough, when their first DX11 cards arrive (possibly even by the end of this year).
Then you can run both DX10.1 and PhysX on the same card.
It remains to be seen what ATi will come up with in terms of physics.

I personally think that hardware-accelerated physics has a lot more potential than slightly improved AA performance.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Originally posted by: Lonyo
Originally posted by: Keysplayr
Originally posted by: SickBeast
Originally posted by: Wreckage
Originally posted by: SickBeast


AMD has apparently been able to get Havok up and running on their GPUs using OpenCL if you actually read the article, Wreckage. That's not too far off from what NV has with PhysX at this point.
I see so a demo that no one but they can see = released games on the market?

Sure that sounds the same :roll:

Now I know you are trolling.

I said that they're not too far off, not that they are the same. :light:

It would be nice if you would actually read my posts and take what I say at face value.

The stuff that NV has done so far has not altered gameplay in any way. IMO there is not much difference between a demo and a few added effects that do not alter gameplay.

Some more PhysX games are not too far off as well. Does that count? Or is that only a perk for ATI and Havok?

BTW: I read your post. And this is the face value I got from it.

AA doesn't alter gameplay, but would you buy a card without it? Nahh...

Since you and Wreckage both like AA a lot, and (IIRC) DX10.1 allows much faster AA (or can do) then surely NV are the worst company because they don't support DX10.1 so they can't have AA as high because it would slow the frame rate down too much.
Who cares if it only applies to a handful of titles, it improves graphical quality and/or frame rate.

Why buy a card with PhysX but no DX10.1 over a card with DX10.1 but no PhysX? Surely they both have the same value?

Ah, thank you Lonyo. I was hoping somebody would go there. But I see that Wreckage and also Scali had some insight on this theory of yours. And I'd like to add that both ATI cards and Nvidia cards perform AA. This is not a debate on whether or not these cards can perform AA; it was an analogy: if one card DID perform AA and the other DID NOT AT ALL, which would you choose? Even though AA DOESN'T alter game content and behavior, you'd still want to have it. I don't care what planet you're from. Can you see the purpose of the analogy now? So, dude, you're kind of helping my points more than hindering them. Kudos.
 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
Originally posted by: Scali
Originally posted by: Modelworks
Physics has always been available for games, and current CPUs are more than capable of doing what is required for current games.

That's pretty obvious, since the physics content in games was designed to run well enough on current CPUs.
Although... if you blow up enough stuff at the same time in a game like Crysis with physics at the highest settings, you'll still see many current CPUs struggling and dropping to single-digit framerates.
Crysis and Crysis Warhead mostly only use about 2 cores' worth of CPU power, and how much of that is really going towards the physics engine?
http://www.anandtech.com/cpuch...owdoc.aspx?i=3559&p=10

Mirror's Edge with PhysX enabled on CPU is the same story. Breaking glass would cause my CPU to max at 50%.

IMO we need more titles that really push quad-core CPUs.
 

Scali

Banned
Dec 3, 2004
2,495
1
0
Originally posted by: SSChevy2001
Crysis and Crysis Warhead mostly only use about 2 cores' worth of CPU power, and how much of that is really going towards the physics engine?
http://www.anandtech.com/cpuch...owdoc.aspx?i=3559&p=10

Mirror's Edge with PhysX enabled on CPU is the same story. Breaking glass would cause my CPU to max at 50%.

IMO we need more titles that really push quad-core CPUs.

I'd like to quote John Carmack on this one:
"it's not ?oh just thread your application?. Anyone that says that is basically an idiot"
 

Scali

Banned
Dec 3, 2004
2,495
1
0
Originally posted by: Cookie Monster
Originally posted by: Wreckage
Not to mention when not using 10.1 then there is zero performance increase meaning you should have just bought the faster card.

Wrong

When AA is used, a lot of performance gains can be had when using DX10.1.

He said: "When not using 10.1". I think you owe him an apology.
 

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
Originally posted by: Wreckage
Originally posted by: Lonyo


Since you and Wreckage both like AA a lot, and (IIRC) DX10.1 allows much faster AA (or can do) then surely NV are the worst company because they don't support DX10.1 so they can't have AA as high because it would slow the frame rate down too much.
Who cares if it only applies to a handful of titles, it improves graphical quality and/or frame rate.

Why buy a card with PhysX but no DX10.1 over a card with DX10.1 but no PhysX? Surely they both have the same value?

Yes but luckily NVIDIA cards are faster in general so it more than makes up for the difference. Not to mention when not using 10.1 then there is zero performance increase meaning you should have just bought the faster card.

Luckily AMD cards are pretty much equal and sometimes even faster. Not to mention most games do not have PhysX, hence you're not really missing out - no need to pay more for an nVidia card, just get a Radeon.
 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
Originally posted by: Scali
Originally posted by: SSChevy2001
Crysis and Crysis Warhead mostly only use about 2 cores' worth of CPU power, and how much of that is really going towards the physics engine?
http://www.anandtech.com/cpuch...owdoc.aspx?i=3559&p=10

Mirror's Edge with PhysX enabled on CPU is the same story. Breaking glass would cause my CPU to max at 50%.

IMO we need more titles that really push quad-core CPUs.

I'd like to quote John Carmack on this one:
"it's not ?oh just thread your application?. Anyone that says that is basically an idiot"
I don't see Tim Sweeney making any excuses.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Originally posted by: SSChevy2001
Originally posted by: Scali
Originally posted by: SSChevy2001
Crysis and Crysis Warhead mostly only use about 2 cores' worth of CPU power, and how much of that is really going towards the physics engine?
http://www.anandtech.com/cpuch...owdoc.aspx?i=3559&p=10

Mirror's Edge with PhysX enabled on CPU is the same story. Breaking glass would cause my CPU to max at 50%.

IMO we need more titles that really push quad-core CPUs.

I'd like to quote John Carmack on this one:
"it's not ?oh just thread your application?. Anyone that says that is basically an idiot"
I don't see Tim Sweeney making any excuses.

"AnandTech: The new Unreal Engine 3 is designed for multi-threading, and will make good use of dual core CPUs available when games on the new engine come out. What parts of the game will benefit/be improved, thanks to multiprocessing? What will be the parts that will benefit the most?

Tim Sweeney: For multithreading optimizations, we're focusing on physics, animation updates, the renderer's scene traversal loop, sound updates, and content streaming. We are not attempting to multithread systems that are highly sequential and object-oriented, such as the gameplay.
Implementing a multithreaded system requires two to three times the development and testing effort of implementing a comparable non-multithreaded system, so it's vital that developers focus on self-contained systems that offer the highest effort-to-reward ratio."

At two or three times the development time, it doesn't sound like he's actually "loving" it either. And it would appear that not all parts of the game are multithreaded, just the ones he mentioned here.
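
To picture the pattern Sweeney is describing, here's a minimal sketch of one frame: the self-contained physics step is pushed off to a worker thread while the sequential, object-oriented gameplay code stays on the main thread. The function names are purely illustrative (nothing here is Unreal Engine code), and the gameplay step deliberately doesn't touch the physics data while the worker is running.

#include <functional>
#include <future>
#include <vector>

struct World { std::vector<float> bodies{1.0f, 2.0f, 3.0f}; };

void stepPhysics(World& w, float dt) {
    // Self-contained, data-parallel friendly work: a good candidate for its own thread.
    for (float& b : w.bodies)
        b += dt;
}

void stepGameplay(World& /*w*/, float /*dt*/) {
    // Highly sequential scripting/AI logic: per Sweeney, not worth the 2-3x effort to multithread.
}

void runFrame(World& w, float dt) {
    // Kick the physics step off to a worker thread...
    auto physicsDone = std::async(std::launch::async, stepPhysics, std::ref(w), dt);
    // ...run gameplay on the main thread in the meantime...
    stepGameplay(w, dt);
    // ...and wait for physics to finish before rendering would use its results.
    physicsDone.get();
}

int main() {
    World w;
    for (int frame = 0; frame < 3; ++frame)
        runFrame(w, 1.0f / 60.0f);
    return 0;
}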