
AMD's take on GPU physics


cmdrdredd

Lifer
Dec 12, 2001
Originally posted by: BlueAcolyte
Originally posted by: shangshang
hey Keysplayr2003,

I wanna say that since I've been actively posting on AT (lurker before), you have impressed me with your ability to
1) stay on topic
2) not quickly retort back at a caustic post
3) support what you say with data, whatever the data may show
4) present your answers in logical steps
5) don't get into a pissing match

anyway, back to topic...

Of course AMD will dumb down GPU physics, because NV currently owns this space. I don't even know why people would overanalyze a company (ATI) dumbing down a competitor's (NV) product or feature.

And PhysX is not to be compared with DX 10, 10.1, 11, etc. PhysX is an add-on, without which games will still play on Windows. DirectX, on the other hand, is a complete API, without which most games will not play on Windows (gone are the days of OpenGL games on Windows). Since it's an add-on feature, buy it if you like it, don't buy it if you don't care. It's that simple!

I remember the early days of Creative's EAX. EAX started out as an "add-on". You could still play games with sound without EAX, but with EAX you got all the "ear candy" (bullets bouncing off the wall). And in the beginning there were plenty of doubters and haters of Creative... saying no developer was going to support EAX (because Creative is evil), but as we can see today, many serious games support it. How long will EAX last? Nobody knows, probably not forever... but PhysX could become for NV what EAX is for Creative. I mean, today no serious gamer would be caught playing with onboard Realtek or Sigmatel.
Ahem!

Anyway, so far I have not seen good use of physx. It has potential, yes. But I am waiting for something big to happen. When Physx or whatever makes a big impact I will factor it into my next purchase. I am pretty neutral between ATI and NV and as long as either one comes out with a good product or feature that is the one I will support.

Anyway, DX11, physx, stream, dx10.1... Many people have not even heard of this stuff. :Q
Well, there are a few games you can use Physx in, but remember when ATI came out with a DX10.1 part and everyone went "OOOHHH, AHHHH!" and then we looked again and said "hey wait...where's all the DX10 games? Why do we need 10.1? *scratch head*"

That's what's going to happen with DX11. The developers can't keep up and still provide well-optimized code for their games. As far as I know, PhysX is sort of a drop-in for many titles. It doesn't take much effort to add it, especially not if you partner up with NV for TWIMTBP. Now, I'll admit that current usage of PhysX is a joke. There's no benefit to the gameplay. However, there is some potential, but it will take a willing developer for that potential to be realized.
 

VERTIGGO

Senior member
Apr 29, 2005
What do you think of AMD's response to Nvidia's recent success with PhysX?
What success?

There is no single title that makes me wish I had PhysX. With Crossfire, I get almost a 100% performance increase, sometimes more, sometimes less. But PhysX will always be a gimmick unless it becomes an open standard that can seamlessly load onto whatever is available, be it an Intel CPU or an Nvidia or ATi GPU.
 

shangshang

Senior member
May 17, 2008
Originally posted by: VERTIGGO
What do you think of AMD's response to Nvidia's recent success with PhysX?
What success?

There is no single title that makes me wish I had PhysX. With Crossfire, I get almost a 100% performance increase, sometimes more, sometimes less. But PhysX will always be a gimmick unless it becomes an open standard that can seamlessly load onto whatever is available, be it an Intel CPU or an Nvidia or ATi GPU.
100% increase? More than 100% increase??? There isn't anything in the Crossfire architecture suggesting such synergy.

PhysX isn't a gimmick. It has technical validity, just not a mass user base right now, but this could change soon. EAX from Creative didn't start life where it is today. Time will tell. On a side note, I remember hearing about Linux in 1990. I also remember hearing that Linux would take over the MS desktop in 2000, when the tech boom was at its height. The Linux fanboys swore it would happen. And now here we are, almost two decades after Linux first ran on the desktop, and Windows is still the desktop queen. Was Linux a desktop gimmick?
 

Keysplayr

Elite Member
Jan 16, 2003
So far, the best implementation I have seen in a game is in GRAW2. Not just the effects of breakable environments, but the AI involved. The wooden picket fence, for example. Shoot a hole in the fence, and then you could shoot through the hole at enemies while using the remaining fence for cover until it too is broken away by enemy fire. And the sniper towers. Shoot out the wooden beams that support the tower roof over the sniper's head, and try to make it take out the sniper. Sometimes it would, and sometimes it wouldn't, depending on whether the sniper gets out of the way in time. Another thing was, if you threw a grenade into a wooden hut or shack, the walls would fly outward on detonation, and if a piece of wall hit an enemy, it would wound or kill him. These are the things that intrigue me most about physics. To me, this shows what the potential of PhysX can be; given some time and close work with developers, it will be a real boon for the gaming world.

Warmonger, while not the best game on the block, did demonstrate some nice PhysX interaction.

And UT3. The twisters that pass over buildings and rip the tin roofs off and pick up and toss wood, stone bricks and other debris in a deadly fashion. If you get hit with the debris, you're a goner.

All these things do show the potential, whether some folks do not wish to recognize these things or not. It is there to be experienced and appreciated.

The first major new title is out in January, so it's really starting to happen now. Now EA and 2K Games are in the fray.
So, PhysX is gaining the momentum many thought would never take place. In fact, many thought it would be dead by now.
So, as far as AMD's comments are concerned, it sure doesn't look like PhysX is dying to me. Quite the opposite.
But we can't really blame AMD for these comments. What else could they have said about PhysX? Not a thing.
 

s44

Diamond Member
Oct 13, 2006
Originally posted by: keysplayr2003
What else could they have said about PhysX?
"We will support the project to port CUDA (and PhysX) to our products."

Not likely, but they had the choice.
 

SSChevy2001

Senior member
Jul 9, 2008
So I was checking out Cryostasis, which adds some PhysX water effects. Personally, I like the Havok water better, even if it isn't as complex.

Cryostasis PhysX benchmark
http://www.vimeo.com/2378621

BioShock Water Trailer
http://www.gametrailers.com/player/19039.html

Personally, my only issue is how developers handle non-GPU-PhysX cards. For every GPU PhysX item in a map there should be a non-PhysX equivalent. For example, the missing banners in Mirror's Edge.
 

thilanliyan

Lifer
Jun 21, 2005
Originally posted by: keysplayr2003
So far, the best implementation I have seen in a game is in GRAW2. Not just the effects of breakable environments, but the AI involved. The wooden picket fence for example. Shoot a hole in the fence, and then you could shoot through the hole at enemies while using the remaining fence for cover until it too is broken away by enemy fire.
You don't need PhysX to do that. The same thing happens in Brothers in Arms: Hell's Highway. Even Crysis has fully destructible environments.
 

cmdrdredd

Lifer
Dec 12, 2001
Originally posted by: thilan29
Originally posted by: keysplayr2003
So far, the best implementation I have seen in a game is in GRAW2. Not just the effects of breakable environments, but the AI involved. The wooden picket fence for example. Shoot a hole in the fence, and then you could shoot through the hole at enemies while using the remaining fence for cover until it too is broken away by enemy fire.
You don't need PhysX to do that. The same thing happens in Brothers in Arms: Hell's Highway. Even Crysis has fully destructible environments.
BIA installs PhysX, BTW... it uses the UT3 engine.
 

cmdrdredd

Lifer
Dec 12, 2001
Originally posted by: SSChevy2001
So I was checking out Cryostasis, which adds some PhysX water effects. Personally, I like the Havok water better, even if it isn't as complex.

Cryostasis PhysX benchmark
http://www.vimeo.com/2378621

BioShock Water Trailer
http://www.gametrailers.com/player/19039.html

Personally, my only issue is how developers handle non-GPU-PhysX cards. For every GPU PhysX item in a map there should be a non-PhysX equivalent. For example, the missing banners in Mirror's Edge.
What you're missing is that PhysX offloads all that processing to the GPU today. Havok doesn't and probably won't. Further, that Bioshock trailer is showing mostly pre-rendered scenes that are not changing dynamically. The Cryostasis demo had water that changed in real time depending on what you, the player, did to the environment.

I think you got stuck on how it looked instead of what it did, which was the point of the Cryostasis demo. It wasn't meant to look better than everything else. If it were based on looks, then I'd say Half-Life 2 had good-looking water, but it didn't necessarily react to the environment in the same way as the PhysX demonstration in Cryostasis.
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: thilan29
Originally posted by: keysplayr2003
So far, the best implementation I have seen in a game is in GRAW2. Not just the effects of breakable environments, but the AI involved. The wooden picket fence for example. Shoot a hole in the fence, and then you could shoot through the hole at enemies while using the remaining fence for cover until it too is broken away by enemy fire.
You don't need PhysX to do that. The same thing happens in Brothers in Arms: Hell's Highway. Even Crysis has fully destructible environments.
Don't need, or don't want? If PhysX can do it, why not let it?
 

cmdrdredd

Lifer
Dec 12, 2001
I agree with Keys here; there's really no reason not to like the idea of PhysX. It lets developers add these details while offloading the work from the CPU onto the GPU, so the gameplay stays good and you don't tank the FPS.

Try software PhysX sometime and see how slowly these complex calculations really run. Most of the PhysX demonstrations and usage thus far haven't been really spectacular, but it doesn't hurt anything, and I'd rather my GPU do it than the CPU.
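The "try software PhysX" point is easy to demonstrate with a toy sketch (hypothetical code, not the PhysX API): a naive single-threaded particle update in Python. Every particle is updated independently, which is exactly the kind of embarrassingly parallel work a GPU's stream processors handle well and a CPU struggles to fit into a frame budget.

```python
import time

def step_particles(particles, dt=1.0 / 60.0, gravity=-9.8):
    """One frame of naive Euler integration. Each particle is independent,
    which is why this kind of loop maps so well onto a GPU."""
    for p in particles:
        p["vy"] += gravity * dt   # apply gravity to vertical velocity
        p["x"] += p["vx"] * dt    # integrate position
        p["y"] += p["vy"] * dt
        if p["y"] < 0.0:          # crude ground collision
            p["y"] = 0.0
            p["vy"] *= -0.5       # lossy bounce

# 50,000 debris particles, updated serially on the CPU
particles = [{"x": 0.0, "y": 10.0, "vx": 1.0, "vy": 0.0} for _ in range(50_000)]
start = time.perf_counter()
step_particles(particles)
frame_ms = (time.perf_counter() - start) * 1000.0
print(f"one physics frame for {len(particles)} particles: {frame_ms:.1f} ms")
```

Doubling the particle count roughly doubles the frame time on a CPU, while a GPU can update thousands of particles concurrently; that, not visual quality, is the argument for offloading.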
 

thilanliyan

Lifer
Jun 21, 2005
Originally posted by: cmdrdredd
BIA installs PhysX, BTW... it uses the UT3 engine.
I didn't know that. BIA was pretty good, and as I mentioned, Crysis also had fairly impressive destructible environments without using PhysX, AFAIK, so it isn't needed.

Originally posted by: keysplayr2003
Originally posted by: thilan29
Originally posted by: keysplayr2003
So far, the best implementation I have seen in a game is in GRAW2. Not just the effects of breakable environments, but the AI involved. The wooden picket fence for example. Shoot a hole in the fence, and then you could shoot through the hole at enemies while using the remaining fence for cover until it too is broken away by enemy fire.
You don't need PhysX to do that. The same thing happens in Brothers in Arms: Hell's Highway. Even Crysis has fully destructible environments.
Don't need, or don't want? If PhysX can do it, why not let it?
It's best for us and the developers to make sure that games can be played to their fullest on all hardware. If it leaves out a lot of gamers, that's not good. Would you like to see nVidia with 100% of the discrete market just so that PhysX benefits everyone? It may not make much difference to you, but regular consumers will suffer if 100% of the market is dominated by nV.

I want a choice of which GPU I want in my computer without having to worry about not being able to play some games as intended. At certain times I prefer nV hardware (the 640 GTS and 8800GT I have are great cards) and sometimes I prefer ATI hardware (loved the X800XL, X800GTO2, and X1800XL I've had). I buy GPUs based on price/performance for the most part, not on any checkbox features like PhysX or DX10.1.
 

cmdrdredd

Lifer
Dec 12, 2001
Originally posted by: thilan29
Originally posted by: keysplayr2003
Originally posted by: thilan29
Originally posted by: keysplayr2003
So far, the best implementation I have seen in a game is in GRAW2. Not just the effects of breakable environments, but the AI involved. The wooden picket fence for example. Shoot a hole in the fence, and then you could shoot through the hole at enemies while using the remaining fence for cover until it too is broken away by enemy fire.
You don't need PhysX to do that. The same thing happens in Brothers in Arms: Hell's Highway. Even Crysis has fully destructible environments.
Don't need, or don't want? If PhysX can do it, why not let it?
It's best for us and the developers to make sure that games can be played to their fullest on all hardware. If it leaves out a lot of gamers, that's not good. Would you like to see nVidia with 100% of the discrete market just so that PhysX benefits everyone? It may not make much difference to you, but regular consumers will suffer if 100% of the market is dominated by nV.
That's not even the point. The point is Physx is a good thing, albeit not very useful, but good. It offloads something that traditionally has been done by the CPU and puts the GPU to work doing it. We need more of this because a CPU can only do so much and a GPU is much more specialized and can be much faster for a variety of tasks.
 

thilanliyan

Lifer
Jun 21, 2005
Originally posted by: cmdrdredd
That's not even the point. The point is Physx is a good thing, albeit not very useful, but good. It offloads something that traditionally has been done by the CPU and puts the GPU to work doing it. We need more of this because a CPU can only do so much and a GPU is much more specialized and can be much faster for a variety of tasks.
For sure it's useful and I'd be all for it IF it was a standard. I don't like the idea that it is owned by one company. It needs to be a sort of universal API like Direct3D is (correct me if I'm wrong on that).
 

nosfe

Senior member
Aug 8, 2007
The CPU can only do so much, but in games the CPU isn't fully utilized while the GPU is. Why not make some physics effects simple enough that a dual core can handle them? It's not like the physics has to be 100% accurate.
 

SSChevy2001

Senior member
Jul 9, 2008
Originally posted by: cmdrdredd
What you're missing is that PhysX offloads all that processing to the GPU today. Havok doesn't and probably won't. Further, that Bioshock trailer is showing mostly pre-rendered scenes that are not changing dynamically. The Cryostasis demo had water that changed in real time depending on what you, the player, did to the environment.

I think you got stuck on how it looked instead of what it did, which was the point of the Cryostasis demo. It wasn't meant to look better than everything else. If it were based on looks, then I'd say Half-Life 2 had good-looking water, but it didn't necessarily react to the environment in the same way as the PhysX demonstration in Cryostasis.
I know how slow physx runs without a PPU or GPU support. I still have my 8800GTS 512 which I've used before for UT3 PhysX maps.

The point was Bioshock was able to produce some nice water effects that might not be complex, but were enough to do the job at hand.

Now with the Cryostasis demo we've seen what PhysX cards can do, but what about non-PhysX card owners? All of a sudden you'll see reviews that take a B game and turn it into a C game for users who can't run PhysX, because of the missing content.

Developers need to be very careful how they implement GPU physx right now.
 

JACKDRUID

Senior member
Nov 28, 2007
Originally posted by: SSChevy2001

2009 is on the way; what triple-A games would I want GPU PhysX for? Sure, it adds some extra eye candy, but can it make a B- or C-rated game into an A game?
Would you update your driver if ATI came out with a PhysX driver today? If your answer is yes, then you should be angry, because it would have been real if ATI had licensed PhysX many months ago.
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: thilan29
Originally posted by: cmdrdredd
That's not even the point. The point is Physx is a good thing, albeit not very useful, but good. It offloads something that traditionally has been done by the CPU and puts the GPU to work doing it. We need more of this because a CPU can only do so much and a GPU is much more specialized and can be much faster for a variety of tasks.
For sure it's useful and I'd be all for it IF it was a standard. I don't like the idea that it is owned by one company. It needs to be a sort of universal API like Direct3D is (correct me if I'm wrong on that).
A standard has to start as an idea before it can become a standard.
 

SSChevy2001

Senior member
Jul 9, 2008
Originally posted by: JACKDRUID
Would you update your driver if ATI came out with a PhysX driver today? If your answer is yes, then you should be angry, because it would have been real if ATI had licensed PhysX many months ago.
I update my drivers whenever there's an update.

What kind of message would ATi be sending to their customers by having an Nvidia PhysX logo on their box?

Even if ATi did have PhysX, what's to stop Nvidia from making sure it doesn't run better on ATi's hardware?

Personally, I think ATi needs to focus more on its own GITG program right now.
 

SickBeast

Lifer
Jul 21, 2000
IMO people should be thankful that NV purchased PhysX; the technology was way more marginal back when it was relegated to specialized cards. If they had not done this, chances are we would not be seeing physics in DX11, and AMD would not be moving toward GPU physics (or it would have been delayed).

People moan that it doesn't alter gameplay, but how can it when not everyone has a GeForce? Not only that, but things like fully destructible environments have already been done. It's going to take some creativity on the part of the developers to come up with something that isn't simply offloading work from the CPU.
 

cmdrdredd

Lifer
Dec 12, 2001
Originally posted by: thilan29
Originally posted by: cmdrdredd
That's not even the point. The point is Physx is a good thing, albeit not very useful, but good. It offloads something that traditionally has been done by the CPU and puts the GPU to work doing it. We need more of this because a CPU can only do so much and a GPU is much more specialized and can be much faster for a variety of tasks.
For sure it's useful and I'd be all for it IF it was a standard. I don't like the idea that it is owned by one company. It needs to be a sort of universal API like Direct3D is (correct me if I'm wrong on that).
Well, I don't care who owns it. Intel bought up Havok, remember.

If games use PhysX and developers license it from Nvidia, then fine; ATI will have to jump in, and no, it won't cost ATI a penny. The developers will be paying NV for the license.
 

cmdrdredd

Lifer
Dec 12, 2001
Originally posted by: SSChevy2001
Originally posted by: cmdrdredd
What you're missing is that PhysX offloads all that processing to the GPU today. Havok doesn't and probably won't. Further, that Bioshock trailer is showing mostly pre-rendered scenes that are not changing dynamically. The Cryostasis demo had water that changed in real time depending on what you, the player, did to the environment.

I think you got stuck on how it looked instead of what it did, which was the point of the Cryostasis demo. It wasn't meant to look better than everything else. If it were based on looks, then I'd say Half-Life 2 had good-looking water, but it didn't necessarily react to the environment in the same way as the PhysX demonstration in Cryostasis.
I know how slow physx runs without a PPU or GPU support. I still have my 8800GTS 512 which I've used before for UT3 PhysX maps.

The point was Bioshock was able to produce some nice water effects that might not be complex, but were enough to do the job at hand.

Now with the Cryostasis demo we've seen what PhysX cards can do, but what about non-PhysX card owners? All of a sudden you'll see reviews that take a B game and turn it into a C game for users who can't run PhysX, because of the missing content.

Developers need to be very careful how they implement GPU physx right now.
Those features won't be missing; the CPU will do them and your FPS will drop some. Or you can turn them off, your choice.
 

cmdrdredd

Lifer
Dec 12, 2001
Originally posted by: nosfe
The CPU can only do so much, but in games the CPU isn't fully utilized while the GPU is. Why not make some physics effects simple enough that a dual core can handle them? It's not like the physics has to be 100% accurate.
Many games don't use 100% of the GPU either. There are many threads available inside a GPU, and stream processors etc. that can do specialized work. It just needs the code to tell them what to do.
 

thilanliyan

Lifer
Jun 21, 2005
Originally posted by: keysplayr2003
A standard has to be an idea, before it could be a standard.
Okay I really don't know the answer to this:
What would ATI have to pay nV to be able to use PhysX?

If the answer is something other than nothing, then I see a problem with it (i.e. I'm not sure, but does MS charge ATI or nV to use Direct3D?).

Originally posted by: cmdrdredd
If games use physx and they license from Nvidia to use it then fine, ATI will have to jump in and no it won't cost ATI a penny. The developers will be paying NV for the license.
Are you sure it won't cost ATI anything? To me it doesn't make a whole lot of business sense for nV to charge absolutely nothing... PhysX is one of their selling points, and they lose that if they give it away to ATI for free.

Also, Intel has some sway in this as well. I'm sure Intel has more cash to spread around for adoption of whatever method they choose so until they've picked a side I doubt we'll see a "winner".
 

aka1nas

Diamond Member
Aug 30, 2001
Nvidia is competing with Intel with PhysX significantly more than with AMD/ATI. It behooves Nvidia to make PhysX as widespread as possible, because it keeps the GPU at center stage and potentially lets them sell you extra hardware if you want extra PhysX performance.

AMD/ATI is probably somewhat conflicted about what to do with physics. If they push any GPU-accelerated solution very hard, it's likely to hurt them long term in the consumer CPU space.

Waiting for DX11 isn't really a solution either. The compute shader is really more like a generic version of CUDA than a middleware layer like PhysX or Havok. You might as well use PhysX as that middleware, since it will likely get DX11 CS support either directly or through CUDA.
 
