"Inevitable Bleak Outcome for nVidia's Cuda + Physx Strategy"


Scali

Banned
Dec 3, 2004
2,495
1
0
Originally posted by: SSChevy2001
I don't see Tim Sweeney making any excuses.

Tim Sweeney is responsible for the engine used in Mirror's Edge, so what's your point?
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: Scali
He said: "When not using 10.1". I think you owe him an apology.

You're right. :eek:

My bad, a bit too many late nights have seriously degraded my reading skills :p
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Originally posted by: Scali
Originally posted by: SSChevy2001
I don't see Tim Sweeney making any excuses.

Tim Sweeney is responsible for the engine used in Mirror's Edge, so what's your point?

Hehe, that's the thought that instantly hit me. This opposition to PhysX is beyond strange to me, and when they oppose it, well, the posts speak for themselves.
 

Scali

Banned
Dec 3, 2004
2,495
1
0
Originally posted by: Keysplayr
At two or three times the development time, it doesn't sound like he's actually "loving" it either. And it would appear that not all parts of the game are multithreaded. Just the ones he mentioned here.

Yea, besides, not everything CAN be multithreaded.
That's the point I couldn't be bothered to explain before, and still really can't... but I'll give just a few small hints.
One thing is that Direct3D itself is not multithreaded, so there will always be one thread handling graphics commands. That means there is always some time during the rendering of a frame where only one thread is running, and the others just have to wait until they can start their next job.
Since a well-written program won't just leave all its threads spinning their wheels while waiting, it will use some OS functionality to temporarily put the threads into a sleeping state, freeing the cores up for other uses. Hence, you will NEVER see 100% CPU usage.
That has absolutely NO bearing on how efficiently those cores are used when they are actually running.

Or imagine for example when you need to load something from disk. You only want to load it once, so you only want a single thread to do that. That has nothing to do with how efficiently the application is multithreaded. It's just that loading something from disk is inherently a serial operation, and using multiple threads makes no sense.
So, the threads will again have to be in an idle state, waiting to be woken up when the data is loaded and they can get to work.

In other words: anyone who thinks the CPU usage in Task Manager says anything about how efficiently an application makes use of your cores is an idiot (well, I think that's what Carmack would say).
In fact, an application that uses 100% on all cores is likely to actually be INEFFICIENT, because it is most probably wasting processing time while waiting for other hardware. This especially holds true for games. There are some applications that can reach close to 100% CPU usage because they don't access any hardware during most of the processing, such as a raytracer or a video encoder. But games, which have lots of interaction with videocards, soundcards, input devices and such, will likely not be able to keep all cores busy all the time with useful processing tasks. There are moments where they have to wait for hardware to complete input or output.
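
To make that concrete, here's a tiny standalone toy (my own sketch, not taken from any actual engine): a few worker threads block on a condition variable during the serial part of the "frame" and only wake up for the parallel part. While they are blocked the OS parks them, so Task Manager shows well under 100% even though nothing is being wasted.

```cpp
#include <chrono>
#include <condition_variable>
#include <cstdio>
#include <mutex>
#include <thread>
#include <vector>

int main() {
    std::mutex m;
    std::condition_variable cv;
    int frame = 0;      // generation counter, bumped once per "frame"
    bool done = false;

    auto worker = [&](int id) {
        int seen = 0;
        for (;;) {
            std::unique_lock<std::mutex> lock(m);
            // Blocked here during the serial phase: the OS puts the thread to
            // sleep and the core is free for other processes, so overall CPU
            // usage drops without any efficiency being lost.
            cv.wait(lock, [&] { return frame != seen || done; });
            if (done) return;
            seen = frame;
            lock.unlock();
            // Parallel phase: stand-in for AI, particles, skinning, etc.
            std::this_thread::sleep_for(std::chrono::milliseconds(5));
            std::printf("worker %d finished frame %d\n", id, seen);
        }
    };

    std::vector<std::thread> pool;
    for (int i = 0; i < 3; ++i) pool.emplace_back(worker, i);

    for (int f = 1; f <= 3; ++f) {
        // Serial phase: only this thread runs (think: issuing Direct3D calls
        // or waiting on a disk read). The workers are asleep, not wasted.
        std::this_thread::sleep_for(std::chrono::milliseconds(10));
        {
            std::lock_guard<std::mutex> lock(m);
            frame = f;
        }
        cv.notify_all();   // wake the workers for the parallel phase
    }

    std::this_thread::sleep_for(std::chrono::milliseconds(20));  // let the last frame finish
    {
        std::lock_guard<std::mutex> lock(m);
        done = true;
    }
    cv.notify_all();
    for (auto& t : pool) t.join();
}
```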
 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
Originally posted by: Scali
Originally posted by: SSChevy2001
I don't see Tim Sweeney making any excuses.

Tim Sweeney is responsible for the engine used in Mirror's Edge, so what's your point?
My point is that UE4 is in the works to avoid single-threaded code. Instead of making excuses, he's working on a new engine that will be ready for future CPUs.

Mirror's Edge is a title made to market GPU PhysX. Everything but glass breaking works fine with PhysX running on my CPU, but there's no option to just enable cloth effects, why is that? IMO the extra glass effect in this title is useless anyway, as you don't have time to really look at glass on the floor.
 

Scali

Banned
Dec 3, 2004
2,495
1
0
Originally posted by: SSChevy2001
My point is that UE4 is in the works to avoid single-threaded code. Instead of making excuses, he's working on a new engine that will be ready for future CPUs.

My point is that you need to read my post above.

Originally posted by: SSChevy2001
Mirror's Edge is a title made to market GPU PhysX.

Oh yea sure. That must be why they also released it on consoles first, with no GPU PhysX, and why you can run the game just fine on a PC without GPU PhysX support.
You could give the developers of Mirror's Edge a bit more credit than that.
Yes, obviously they agreed to let nVidia add some GPU PhysX effects. That, however, doesn't mean that the sole purpose of all the hard work the developers put in was just to sell nVidia products. If I were one of Mirror's Edge's developers, I'd be deeply insulted.

Originally posted by: SSChevy2001
Everything but glass breaking works fine with PhysX running on my CPU, but there's no option to just enable cloth effects, why is that? IMO the extra glass effect in this title is useless anyway, as you don't have time to really look at glass on the floor.

Oh yea, effects that don't work on your PC are useless, we've all heard that before... You think you're actually going to convince anyone? How dumb do you think we are? Really, people should just stop posting that sort of garbage.
 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
@Scali

I've read your post, but developers need to continue to try and make the most of these extra cores. What do you think PS3 developers are doing with the extra SPEs?

One of DX11's features is multithreaded rendering.

I couldn't give a rat's ass what EA thinks of my comments; with the yearly repeat of their sports titles they're doing just fine. My point is Mirror's Edge is a good example of a title where PhysX effects were removed from the CPU to market the need for GPU PhysX. It's not like EA is going to say "wow, you only get a better glass-breaking effect with GPU PhysX".

As far as my machine goes, I have a perfectly fine 8800GTS G92 I can pop in with my 4870 if I really need to play GPU PhysX. That still doesn't change the fact that extra glass breaking effect in Mirror's Edge is IMO useless.
 

Scali

Banned
Dec 3, 2004
2,495
1
0
Originally posted by: SSChevy2001
@Scali

I've read your post, but developers need to continue to try and make the most of these extra cores. What do you think PS3 developers are doing with the extra SPEs?

What makes you think developers aren't already doing that?
You DO understand that Task Manager doesn't mean anything, right? I mean, I didn't make that long post for nothing, I hope.
Both PhysX and Havok are already highly optimized for multicore systems. So at any rate it's not going to be the physics that aren't properly optimized.
And as Keysplayr already posted, UnrealEngine3 itself is also optimized for multicore. Just like most other major engines.

The bottom line is: multicore processors don't work as well as laymen think they do. You might want to go to Wikipedia and read about Amdahl's Law to see why that might be.
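
If you don't feel like reading the whole article, here's the back-of-the-envelope version (my numbers below are made up, purely to show the shape of the curve): Amdahl's Law says the best possible speedup is 1 / ((1 - p) + p / n), where p is the fraction of the work you can parallelise and n is the number of cores.

```cpp
#include <cstdio>
#include <initializer_list>

// Amdahl's Law: ideal speedup when a fraction p of the work can be spread
// across n cores and the remaining (1 - p) stays serial.
static double amdahl(double p, int n) {
    return 1.0 / ((1.0 - p) + p / n);
}

int main() {
    const double fractions[] = {0.50, 0.80, 0.95};  // made-up parallel fractions
    for (double p : fractions)
        for (int n : {2, 4, 8})
            std::printf("p = %.2f, %d cores -> %.2fx speedup\n", p, n, amdahl(p, n));
    // Even with 95% of the work parallelised, 4 cores only buy you ~3.5x, and
    // at 80% you are stuck around 2.5x, which is why a game that only favours
    // quad cores by a modest margin isn't necessarily badly written.
}
```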

Originally posted by: SSChevy2001
One of DX11's features is multithreaded rendering.

I know, I'm working on converting my engine to DX11 as we speak.
But what does that have to do with current games, which can't make use of DX11?
Aside from that, even though DX11's attempt at multithreaded rendering is nice, it still doesn't solve the problem completely. You can now create renderlists in parallel, but they are still sent to the hardware by a single thread.
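
Conceptually it looks something like the sketch below (mock types of my own, NOT the real D3D11 deferred-context API, just the shape of the pattern): several threads record their own command lists in parallel, but one thread still walks all of them and hands them to the "hardware" in order.

```cpp
#include <cstdio>
#include <functional>
#include <string>
#include <thread>
#include <vector>

// Mock stand-ins for command lists; in real DX11 these would be built on
// deferred contexts, but the point here is the pattern, not the API.
using Command = std::string;
using CommandList = std::vector<Command>;

static void record(CommandList& list, int thread_id) {
    // Recording can happen fully in parallel: each thread only touches its own list.
    for (int i = 0; i < 3; ++i)
        list.push_back("draw call " + std::to_string(i) +
                       " recorded by thread " + std::to_string(thread_id));
}

int main() {
    const int kThreads = 4;
    std::vector<CommandList> lists(kThreads);

    // Parallel phase: build command lists on several cores at once.
    std::vector<std::thread> recorders;
    for (int t = 0; t < kThreads; ++t)
        recorders.emplace_back(record, std::ref(lists[t]), t);
    for (auto& t : recorders) t.join();

    // Serial phase: a single thread still submits everything, in order.
    // This is the part that stays single-threaded.
    for (const auto& list : lists)
        for (const auto& cmd : list)
            std::printf("submitting: %s\n", cmd.c_str());
}
```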

Originally posted by: SSChevy2001
My point is Mirror's Edge is a good example of a title where PhysX effects were removed from the CPU to market the need for GPU PhysX.

EA didn't remove anything. The console versions of Mirror's Edge are the same as the PC version without GPU PhysX.
They just *added* GPU PhysX effects for nVidia cards.

Originally posted by: SSChevy2001
That still doesn't change the fact that extra glass breaking effect in Mirror's Edge is IMO useless.

It also doesn't change the fact that I don't care about your opinion.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: SSChevy2001
That still doesn't change the fact that extra glass breaking effect in Mirror's Edge is IMO useless.

Just as useless as AA/AF/HDR, heck even trees, clothes, etc.

Atari nailed it with the 2600 and everything since has just been useless fluff. :roll:
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Originally posted by: Keysplayr

At two or three times the development time, it doesn't sound like he's actually "loving" it either. And it would appear that not all parts of the game are multithreaded. Just the ones he mentioned here.


It is definitely not fun working with the current way of implementing multithreading.
I don't think most realize why it is such an issue. If you have core 0 doing AI and core 1 processing the sound, it would be a wonderful world if they always finished at the same time. Of course they don't, so you get a situation where core 0 is done but core 1 isn't finished; now the AI is waiting for the sound, or you could just display it as-is and it would look like one of those badly dubbed foreign movies!

Might be fun! Characters shooting, then you hear the sound effect a second later; the enemy appears on screen, you press fire, then the enemy is gone; you see the bullet shoot out, the sound plays, and then the bullet hits a target whose sound you hear about 30 seconds later, and your character reacts to firing the gun!
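
In code it boils down to something like this little toy (not from any real engine; the sleeps just stand in for work that takes different amounts of time on different cores): every frame you kick off the AI and the audio on separate cores, and you can't present the frame until the slower of the two finishes, so the faster core just sits there waiting.

```cpp
#include <chrono>
#include <cstdio>
#include <future>
#include <thread>

// Pretend per-frame workloads; in a real game these would rarely take the
// same amount of time, which is exactly the problem.
static void update_ai()    { std::this_thread::sleep_for(std::chrono::milliseconds(4)); }
static void update_audio() { std::this_thread::sleep_for(std::chrono::milliseconds(9)); }

int main() {
    for (int frame = 0; frame < 3; ++frame) {
        auto ai    = std::async(std::launch::async, update_ai);
        auto audio = std::async(std::launch::async, update_audio);
        ai.wait();     // AI finished early on its core...
        audio.wait();  // ...but the frame can't go out until audio catches up,
                       // so the faster core simply idles for the difference.
        std::printf("frame %d presented with AI and audio in sync\n", frame);
    }
}
```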


Even non-realtime applications are having problems. I've been on Autodesk's case for quite a while about their cloth simulations. You can have 16 cores and the simulation will at most use 2 of them. They said they just can't get the threads to work properly.

We really needed Transmeta to succeed. Shame they didn't.

I think the current situation with SMP is a result of the idea that we would just keep pushing the MHz and it wouldn't be that big a deal. Unfortunately, what we have now is not optimal at all.
 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
Originally posted by: Scali
What makes you think developers aren't already doing that?
You DO understand that Task Manager doesn't mean anything, right? I mean, I didn't make that long post for nothing, I hope.
Both PhysX and Havok are already highly optimized for multicore systems. So at any rate it's not going to be the physics that aren't properly optimized.
And as Keysplayr already posted, UnrealEngine3 itself is also optimized for multicore. Just like most other major engines.

The bottom line is: multicore processors don't work as well as laymen think they do.
You might want to go to Wikipedia and read about Amdahl's Law to see why that might be.
Yes, I know what taskmgr is. I'm not a developer, but I can understand that you just can't break up a game to run 100% on a quad core. What I expect to see is more titles favoring quad-core CPUs by more than just 5%.

Originally posted by: Scali
EA didn't remove anything. The console versions of Mirror's Edge are the same as the PC version without GPU PhysX.
They just *added* GPU PhysX effects for nVidia cards.
Agreed. Only some people will act like these extra cloth, smoke, and debris effects can't run on current CPUs, which is not the case.

Originally posted by: Scali
It also doesn't change the fact that I don't care about your opinion.
I couldn't care less what you like. It's a fast-paced running game, which leaves little time to enjoy some extra glass on the floor. What's ironic is that it's the only effect in the game that brings CPUs to a crawl.
 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
Originally posted by: Wreckage
Originally posted by: SSChevy2001
That still doesn't change the fact that extra glass breaking effect in Mirror's Edge is IMO useless.

Just as useless as AA/AF/HDR, heck even trees, clothes, etc.

Atari nailed it with the 2600 and everything since has just been useless fluff. :roll:
For a fast paced title like Mirror's Edge, yes it's useless. Could you really see Nvidia trying to sell GPU PhysX with just that effect? :roll:
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: Keysplayr
Originally posted by: Lonyo
Originally posted by: Keysplayr
Originally posted by: SickBeast
Originally posted by: Wreckage
Originally posted by: SickBeast


AMD has apparently been able to get Havok up and running on their GPUs using OpenCL if you actually read the article, Wreckage. That's not too far off from what NV has with PhysX at this point.
I see so a demo that no one but they can see = released games on the market?

Sure that sounds the same :roll:

Now I know you are trolling.

I said that they're not too far off, not that they are the same. :light:

It would be nice if you would actually read my posts and take what I say at face value.

The stuff that NV has done so far has not altered gameplay in any way. IMO there is not much difference between a demo and a few added effects that do not alter gameplay.

Some more PhysX games are not too far off as well. Does that count? Or is that only a perk for ATI and Havok?

BTW: I read your post. And this is the face value I got from it.

AA doesn't alter gameplay, but would you buy a card without it? Nahh...

Since you and Wreckage both like AA a lot, and (IIRC) DX10.1 allows much faster AA (or can do) then surely NV are the worst company because they don't support DX10.1 so they can't have AA as high because it would slow the frame rate down too much.
Who cares if it only applies to a handful of titles, it improves graphical quality and/or frame rate.

Why buy a card with PhysX but no DX10.1 over a card with DX10.1 but no PhysX? Surely they both have the same value?

Ah, thank you Lonyo. I was hoping somebody would go there. But I see that Wreckage and also Scali had some insight on this theory of yours. And I'd like to add that both ATI cards and Nvidia cards perform AA. This is not a debate on whether or not these cards can perform AA; it was a metaphor stating, if one DID perform AA and the other DID NOT AT ALL, which would you choose. Even though AA DOESN'T alter game content and behavior, you'd still want to have it. I don't care what planet you're from. Can you see the purpose of the analogy now? So, dude, you're kind of helping my points more than hindering them. Kudos.

Look, PhysX is a really cool idea and can produce some incredible effects that are way more impressive than AA in my books. The problem is that AA is supported by all hardware and works with all games. PhysX only runs on NV hardware and is supported by a handful of games.

I can see why you would make the comparison, Keys. PhysX is some really amazing eye candy. CUDA will more than likely do some absolutely incredible things as well. The problem is that as a closed platform, CUDA and PhysX will never fully catch on. Please tell your NV masters to try to get PhysX established as a universal standard, or else cave and go with Havok if that's what AMD is using. We need a standard.

I don't think anyone is arguing the value of hardware physics and GPGPU applications. We're just saying that without an open standard it's pretty pointless if you look at the big picture.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: SSChevy2001
Originally posted by: Wreckage
Originally posted by: SSChevy2001
That still doesn't change the fact that extra glass breaking effect in Mirror's Edge is IMO useless.

Just as useless as AA/AF/HDR, heck even trees, clothes, etc.

Atari nailed it with the 2600 and everything since has just been useless fluff. :roll:
For a fast paced title like Mirror's Edge, yes it's useless. Could you really see Nvidia trying to sell GPU PhysX with just that effect? :roll:

I remember when ATI fans were talking about how adding AA to HDR was the greatest thing in the history of gaming. :laugh:

The hypocrisy is so thick here.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: SSChevy2001
Originally posted by: Wreckage
Originally posted by: SSChevy2001
That still doesn't change the fact that extra glass breaking effect in Mirror's Edge is IMO useless.

Just as useless as AA/AF/HDR, heck even trees, clothes, etc.

Atari nailed it with the 2600 and everything since has just been useless fluff. :roll:
For a fast paced title like Mirror's Edge, yes it's useless. Could you really see Nvidia trying to sell GPU PhysX with just that effect? :roll:

I think those PhysX effects are valuable enough. It would be better if AMD hardware could run them, though.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: Wreckage
Originally posted by: SSChevy2001
Originally posted by: Wreckage
Originally posted by: SSChevy2001
That still doesn't change the fact that extra glass breaking effect in Mirror's Edge is IMO useless.

Just as useless as AA/AF/HDR, heck even trees, clothes, etc.

Atari nailed it with the 2600 and everything since has just been useless fluff. :roll:
For a fast paced title like Mirror's Edge, yes it's useless. Could you really see Nvidia trying to sell GPU PhysX with just that effect? :roll:

I remember when ATI fans were talking about how adding AA to HDR was the greatest thing in the history of gaming. :laugh:

The hypocrisy is so thick here.

For once I agree with you, Wreckage. I don't see the point in downplaying the eye candy that PhysX can bring. The only fair criticism at this point lies in the fact that PhysX is not a universal standard, despite your assertions that it can run on Linux (which is pointless for gaming at this point).
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: SickBeast
We're just saying that without an open standard it's pretty pointless if you look at the big picture.

So you are saying we should wait 2 or more years for ATI to get their ship together? No thank you. If they can't keep up they should be left behind.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Originally posted by: SickBeast
Originally posted by: Keysplayr
Originally posted by: Lonyo
Originally posted by: Keysplayr
Originally posted by: SickBeast
Originally posted by: Wreckage
Originally posted by: SickBeast


AMD has apparently been able to get Havok up and running on their GPUs using OpenCL if you actually read the article, Wreckage. That's not too far off from what NV has with PhysX at this point.
I see so a demo that no one but they can see = released games on the market?

Sure that sounds the same :roll:

Now I know you are trolling.

I said that they're not too far off, not that they are the same. :light:

It would be nice if you would actually read my posts and take what I say at face value.

The stuff that NV has done so far has not altered gameplay in any way. IMO there is not much difference between a demo and a few added effects that do not alter gameplay.

Some more PhysX games are not too far off as well. Does that count? Or is that only a perk for ATI and Havok?

BTW: I read your post. And this is the face value I got from it.

AA doesn't alter gameplay, but would you buy a card without it? Nahh...

Since you and Wreckage both like AA a lot, and (IIRC) DX10.1 allows much faster AA (or can do) then surely NV are the worst company because they don't support DX10.1 so they can't have AA as high because it would slow the frame rate down too much.
Who cares if it only applies to a handful of titles, it improves graphical quality and/or frame rate.

Why buy a card with PhysX but no DX10.1 over a card with DX10.1 but no PhysX? Surely they both have the same value?

Ah, thank you Lonyo. I was hoping somebody would go there. But I see that Wreckage and also Scali had some insight on this theory of yours. And I'd like to add that both ATI cards and Nvidia cards perform AA. This is not a debate on whether or not these cards can perform AA; it was a metaphor stating, if one DID perform AA and the other DID NOT AT ALL, which would you choose. Even though AA DOESN'T alter game content and behavior, you'd still want to have it. I don't care what planet you're from. Can you see the purpose of the analogy now? So, dude, you're kind of helping my points more than hindering them. Kudos.

Look, PhysX is a really cool idea and can produce some incredible effects that are way more impressive than AA in my books. The problem is that AA is supported by all hardware and works with all games. PhysX only runs on NV hardware and is supported by a handful of games.

I can see why you would make the comparison, Keys. PhysX is some really amazing eye candy. CUDA will more than likely do some absolutely incredible things as well. The problem is that as a closed platform, CUDA and PhysX will never fully catch on. Please tell your NV masters to try to get PhysX established as a universal standard, or else cave and go with Havok if that's what AMD is using. We need a standard.

I don't think anyone is arguing the value of hardware physics and GPGPU applications. We're just saying that without an open standard it's pretty pointless if you look at the big picture.

Just about anything that is a "standard" today started out proprietary. Then others joined in and it became a standard. The x86 architecture originated from one single company. Others licensed it and followed up with it. Now it's a standard. AA was introduced by a single company. It was adopted and is now standard.
You only see Nvidia as a company who isn't trying to push new technologies out there. Who isn't trying to innovate. Who isn't trying to improve the way things are done. Which company do you think better fits that description these days?

 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Originally posted by: SickBeast
Originally posted by: SSChevy2001
Originally posted by: Wreckage
Originally posted by: SSChevy2001
That still doesn't change the fact that extra glass breaking effect in Mirror's Edge is IMO useless.

Just as useless as AA/AF/HDR, heck even trees, clothes, etc.

Atari nailed it with the 2600 and everything since has just been useless fluff. :roll:
For a fast paced title like Mirror's Edge, yes it's useless. Could you really see Nvidia trying to sell GPU PhysX with just that effect? :roll:

I think those PhysX effects are valuable enough. It would be better if AMD hardware could run them, though.

They probably can't. We talked about a theory in this thread, that ATI knew it could not run PhysX on its GPUs, and quickly turned it down and signed on with Havok. This was how long ago? I dunno. The thing is, I don't think ATI can run any sort of physics on its GPUs.
Unless they are severely limited with staff and programmers, we should have seen a whole lot of physics on their GPUs by now. Like we said earlier in the thread, this is just ATI postponing the inevitable, IMHO. Please let me be wrong.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
I'm pretty sure that some people actually managed to hack PhysX to run on AMD GPUs a while back. Please correct me if I'm wrong. I think I remember reading that it actually ran faster on the AMD GPUs as well.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Originally posted by: SickBeast
I'm pretty sure that some people actually managed to hack PhysX to run on AMD GPUs a while back. Please correct me if I'm wrong. I think I remember reading that it actually ran faster on the AMD GPUs as well.

One programmer, with Nvidia's assistance, was able to get PhysX to run on a 3series ATI GPU.
But where did you get the idea that it was faster? Send a link if you have one. Anyway, the effort was abandoned after an unsupportive ATI declined to help the programmer.

When I say, I don't think ATI can run PhysX on their GPUs, I mean "physics" in general, whether it's PhysX, Havok, Bullet or whatever is there, properly or fast enough. IMHO.
 

waffleironhead

Diamond Member
Aug 10, 2005
7,116
616
136
Originally posted by: Keysplayr


They probably can't. We talked about a theory in this thread, that ATI knew it could not run PhysX on its GPUs, and quickly turned it down and signed on with Havok. This was how long ago? I dunno. The thing is, I don't think ATI can run any sort of physics on its GPUs.
Unless they are severely limited with staff and programmers, we should have seen a whole lot of physics on their GPUs by now. Like we said earlier in the thread, this is just ATI postponing the inevitable, IMHO. Please let me be wrong.

While none of us can be certain, I think you may have it there. Would they have been able to devote the man-hours necessary to get PhysX working on their cards, or was it deemed a dilution of their resources? We all know that they are capable of running GPGPU applications on their cards; it all comes down to the amount of resources they would have had to throw at it to get it functional.

It may be better in the long run to let Nvidia test the waters with their GPU physics and see how well it takes hold, rather than wasting resources on something that may not take off.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: Keysplayr
Originally posted by: SickBeast
I'm pretty sure that some people actually managed to hack PhysX to run on AMD GPUs a while back. Please correct me if I'm wrong. I think I remember reading that it actually ran faster on the AMD GPUs as well.

One programmer, with Nvidia's assistance, was able to get PhysX to run on a 3series ATI GPU.
But where did you get the idea that it was faster? Send a link if you have one. Anyway, the effort was abandoned after an unsupportive ATI declined to help the programmer.

When I say, I don't think ATI can run PhysX on their GPUs, I mean "physics" in general, whether it's PhysX, Havok, Bullet or whatever is there, properly or fast enough. IMHO.

It's probably already been raised and discussed, so forgive me if this is tiresome to read again, but it may be more of an IP issue than an "is there any technical reason why it hasn't been done?" type of issue.

ATI might still be trying to carve out some IP space at the moment; it takes time when you aren't the first. Even if they don't truly intend to implement their IP, they may want it as leverage for negotiation when they seek to cross-license IP with NV...IP that they actually do want to implement.

These things take time once they are set in motion. I've been sent on more than one "IP generation expedition" where the stated purpose was specifically to create leverage for the negotiation of a license to a competitor's IP which we really needed to gain access to, "at the right price or else it's a deal killer".

Or it could be that AMD really thinks it's got too many pots boiling already, and per their strategy and vision they don't think adding this particular pot to the kitchen is going to be critically enabling at this time.

Look at high-k metal-gate integration: could AMD have implemented it at 45nm? Yes, provided they had made the decision to do so in 2003, at the very early beginnings of 45nm development. Did they need to implement HK/MG at 45nm to field a competitive part? Not really; if anything they needed both Phenom II and 45nm (as it is, no further improvements) about a year sooner and it would have been a smash hit.

So there are (IMO) plenty of legitimate reasons for AMD to be turning a blind eye to GPU physics/PhysX up till now and into the near future. I just hope for their sake that their decision to ignore PhysX for now is not causing harm to any consumers in Europe. :laugh: (that's a JOKE people, put down the pitchforks and torches...)