AMD says game devs only use PhysX for the cash


GodisanAtheist

Diamond Member
Nov 16, 2006
8,325
9,706
136
It would go a long way toward explaining why PhysX titles tend to be B-level titles. Get paid, add features with no extra effort, and get a tick mark on the box?! Fuck yeah!

Good for indie devs/low budget projects.
 

VashHT

Diamond Member
Feb 1, 2007
3,351
1,431
136
Batman with/without PhysX effects was like night and day for ambiance and immersion. In that title it's a much bigger difference than AA (IMHO).

Could it have been done on the CPU? Sure.

I really don't care who "wins" the physics API battle. I want gameplay that's affected by realistic physics that costs the least in terms of overhead, no matter how it's provided.

I think this is most people's problem with how nvidia does physx. They could have added those effects without forcing people to use gpu physics, but they didn't. It's just an arbitrary line they drew, and stuff like that pisses people off.
 

shaolin95

Senior member
Jul 8, 2005
624
1
91
They could have added those effects without forcing people to use gpu physics, but they didn't.
And how do you know this?
You really do not have any proof that it could be done on the CPU; you are assuming here without evidence.
I do not have proof that it cannot be done, but it's obvious that there are some calculations that GPUs are a TON faster at doing than the best CPUs out there.
 

Xarick

Golden Member
May 17, 2006
1,199
1
76
Everything PhysX can do, the CPU can do with the proper code. So developers are getting paid not to utilize the CPU to achieve the same effects.
 

zerocool84

Lifer
Nov 11, 2004
36,041
472
126
And how do you know this?
You really do not have any proof that it could be done on the CPU; you are assuming here without evidence.
I do not have proof that it cannot be done, but it's obvious that there are some calculations that GPUs are a TON faster at doing than the best CPUs out there.

Umm, because things like that have been done in other games. Like I've said many times, when a good game comes out and uses PhysX in a meaningful way, then come talk to me. There is nothing out yet, and there doesn't seem to be anything on the horizon. Also, it's not us that said PhysX would change the way we played our games. Nvidia did.
 

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
And how do you know this?
You really do not have any proof that it could be done on the CPU; you are assuming here without evidence.
I do not have proof that it cannot be done, but it's obvious that there are some calculations that GPUs are a TON faster at doing than the best CPUs out there.

You're right, there's no way the CPU can handle all the calculations involved in the GPU PhysX effects we've seen, but at the same time most of the GPU PhysX effects we've seen have been gross abuse of the PPU, WAY overcalculating the physics of pieces of paper floating around (just to make sure there are too many calculations for the CPU to handle?).

Yes, they probably got very realistic results, but gaming is all about using approximate algorithms that give you the same impression of realism at 1/10th the cost. We don't need to calculate the actual dynamics of each floating piece of paper to make it look like pieces of paper are floating around.

Once GPU physics really catches on it'll be inherent in every part of the physics process, and then we will see cool stuff (eye candy or otherwise) instead of 2 or 3 effects that are overcalculated for the sake of calculating.
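
To illustrate the kind of cheap approximation I mean (a made-up sketch, not how any real engine handles its paper), you can fake a convincing flutter with a constant fall speed, a sine-wave sway and a bit of randomness instead of integrating any actual aerodynamics:

#include <cmath>
#include <cstdlib>
#include <vector>

struct Vec3 { float x, y, z; };

// Hypothetical "cheap paper" particle: no aerodynamics, no collisions,
// just a steady descent plus a sinusoidal sway that *looks* like fluttering.
struct PaperScrap {
    Vec3  pos;
    float phase;     // where this scrap is in its sway cycle
    float swayFreq;  // how fast it rocks back and forth
    float fallSpeed; // constant, terminal-velocity-style descent
};

static float frand(float lo, float hi) {
    return lo + (hi - lo) * (std::rand() / float(RAND_MAX));
}

// One update costs a handful of flops per scrap -- trivially CPU-friendly.
void updatePaper(std::vector<PaperScrap>& scraps, float t, float dt) {
    for (auto& s : scraps) {
        s.pos.y -= s.fallSpeed * dt;                               // drift down
        s.pos.x += 0.3f * std::sin(s.swayFreq * t + s.phase) * dt; // fake sway
        s.pos.z += 0.3f * std::cos(s.swayFreq * t + s.phase) * dt;
        if (s.pos.y < 0.0f) {                                      // hit the floor: recycle
            s.pos.y = frand(2.0f, 4.0f);
            s.phase = frand(0.0f, 6.28f);
        }
    }
}

int main() {
    std::vector<PaperScrap> scraps(200, PaperScrap{{0.0f, 3.0f, 0.0f}, 0.0f, 4.0f, 0.5f});
    for (float t = 0.0f; t < 10.0f; t += 0.016f)
        updatePaper(scraps, t, 0.016f);
    return 0;
}

A few multiplies per scrap per frame; any CPU from the last decade shrugs at thousands of these, and on screen it still reads as "paper blowing around" just fine.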
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
You're right, there's no way the CPU can handle all the calculations involved in the GPU PhysX effects we've seen, but at the same time most of the GPU PhysX effects we've seen have been gross abuse of the PPU, WAY overcalculating the physics of pieces of paper floating around (just to make sure there are too many calculations for the CPU to handle?).

Yes, they probably got very realistic results, but gaming is all about using approximate algorithms that give you the same impression of realism at 1/10th the cost. We don't need to calculate the actual dynamics of each floating piece of paper to make it look like pieces of paper are floating around.

Once GPU physics really catches on it'll be inherent in every part of the physics process, and then we will see cool stuff (eye candy or otherwise) instead of 2 or 3 effects that are overcalculated for the sake of calculating.

Also, if you can get the effect by using crappy inefficient code rather than efficient well executed code, programmers will do it every single time. Not sure why that is, but it's a basic truism, it seems. :D
 

Madcatatlas

Golden Member
Feb 22, 2010
1,155
0
0
You're right, there's no way the CPU can handle all the calculations involved in the GPU PhysX effects we've seen, but at the same time most of the GPU PhysX effects we've seen have been gross abuse of the PPU, WAY overcalculating the physics of pieces of paper floating around (just to make sure there are too many calculations for the CPU to handle?).

Yes, they probably got very realistic results, but gaming is all about using approximate algorithms that give you the same impression of realism at 1/10th the cost. We don't need to calculate the actual dynamics of each floating piece of paper to make it look like pieces of paper are floating around.

Once GPU physics really catches on it'll be inherent in every part of the physics process, and then we will see cool stuff (eye candy or otherwise) instead of 2 or 3 effects that are overcalculated for the sake of calculating.


+1, thumbs up for some very good points.
 

Patrick Wolf

Platinum Member
Jan 5, 2005
2,443
0
0
Yeah sure, nvidia is ‘open to talking with any GPU vendor about support for their architecture (PhysX).’

Even if it's free to developers, "open to talking" probably means if ATI wants it they have to pay NV for it.

As long as ATI's physics platform can be used on both ATI and NV hardware, PhysX won't survive unless NV keeps throwing money at it.

Furthermore, since the bulk of profit comes from console games, which have no GPU-accelerated physics, game developers won't bother adding it to the PC port unless they're paid to, which won't bode well for ATI's physics platform either.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,700
406
126
I love it when people still play the "it's ONLY eye candy" card... so what are AA or AF or many of the features you guys love so much about video cards?
Do tessellation, AA or AO change gameplay? NO, yet I see ATI guys hyping tessellation as the second coming...

Let's pick AA.

Let's say without AA your frame rate is 50 fps.

Then you turn on AA and your frame rate is 20.

Would you turn it on? I wouldn't.

Now let's say your frame rate would only drop to 45.

I would certainly turn it on.

Now let's say you have a game with a scene where you bash an enemy and the enemy flies off in a certain direction.

One way of doing it will always send the body in the right direction. Another one sometimes sends the body flying in not exactly the right direction.

The cost of the first option is lower frame rates and/or needing an extra GPU, plus being restricted in your buying options to a specific vendor.

Now let's say you are killing a horde of enemies and you are so focused on the fight that you can't really remember exactly how you hit the enemy, and so don't really care if the guy flies backwards when he should have been thrown to the left.

In that case, do you really think the trade-off is worthwhile?

Then you have games like Painkiller, or even Titan Quest with loot being spit out of chests and rolling around. Am I going to be more impressed throwing a massive stake at some dude and seeing him fly and get nailed to a wall, or by some papers being kicked around in a realistic way? I can kick papers in a realistic way myself, but I can't exactly go around nailing people to walls (or at least not and be socially accepted).
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
You've got to separate GPU-accelerated PhysX and CPU-accelerated PhysX.

CPU-accelerated PhysX works just as well as any other physics library (e.g. Havok): it can run on as many cores as the developer wants, and it is efficient. It is also free and runs on everything, hence it has become the most used physics library on both PC and all the consoles. Huddy is dead wrong here - devs use PhysX because it's free and well supported.

GPU-accelerated physics is not free and is only supported by GeForce PC cards (although I suppose you could argue the SPEs on the PS3's Cell processor aren't really a traditional CPU, and it runs on them). As I understand it, it is true that whenever you see those effects Nvidia paid to add them - well, by "paid" they didn't actually give the devs money; they gave them developers to do the work for them as part of TWIMTBP. Those Nvidia devs will also add the 3D stuff and other things the game company's devs haven't got time to do (e.g. AA support).

In this case Huddy is right. The problem is no game dev would add any of these things if Nvidia didn't provide the manpower to do it, so all we'd get is straight console ports. Hence, if we lived in Huddy's world (and there was no Nvidia) there would be no special extras for PC gamers unless the devs actively added them, which they don't bother doing, so it essentially boils down to straight console ports.

At least Nvidia is trying to make PC gaming stand out (if in a fairly selfish way); AMD whines a lot about it but seems incapable of actually doing anything. As they say, "talk is cheap".

In reference to "game changing" hardware physics effects: that won't happen till most of the market can support hardware physics. This has very little to do with PCs - it will happen when the consoles support hardware physics, i.e. after the PS4 and Xbox 720 arrive. Until then hardware physics will just be there to make stuff look pretty, which is still better than not being there at all.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
It's interesting that in the bit-tech article the author linked to, Nvidia says it is providing PhysX as an open platform.

"Nvidia’s director of product PR for EMEA and India, Luciano Alibrandi, told Custom PC that ‘We are committed to an open PhysX platform that encourages innovation and participation,’ and added that Nvidia would be ‘open to talking with any GPU vendor about support for their architecture.’"

Guess we'll see what comes of it later...


http://www.bit-tech.net/custompc/news/602205/nvidia-offers-physx-support-to-amd--ati.html

AMD has always been the barrier for AMD GPUs delivering PhysX on the GPU. That is old news.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
This is the last thing AMD wants - to be dependent on NVIDIA. While NVIDIA provides PhysX free to developers (actually it even pays some to use it), providing it free to AMD is a completely different story.

AMD probably will have to take a more aggressive stance to interest devs in open-standard physics.

Or maybe stuff will only change with next console gen.

And yet they had no problem signing onto their biggest competitor Intel's Havok.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
I think our gripe is that GPU PhysX is not being used for essential parts of the game that affect strategy. Instead it always seems to get relegated to the eye candy role.

Tessellation and anti-aliasing are different from PhysX because they were never meant to be involved in gameplay strategy.

They can't add in true object interaction considering how small the player base that can actually run something like that is. How would you handle an online FPS where one player sees a crumbling building and the other can't? It will take time before we get a baseline of performance where something like that can actually happen. It may take years. One thing that would help this along greatly is for Microsoft to get off their ass and implement it in DirectX.
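
A rough sketch of the split I mean (purely my own illustration, not taken from any shipping engine): anything gameplay-relevant lives in the server-authoritative simulation and gets replicated to everyone, while eye-candy debris stays client-local, so a player whose machine can't run it just sees less clutter and the match outcome never diverges:

#include <cstdint>
#include <vector>

// Hypothetical split between authoritative and cosmetic physics objects.
enum class SimDomain : std::uint8_t {
    ServerAuthoritative, // affects gameplay: simulated on the server, replicated to every client
    ClientCosmetic       // eye candy only: simulated locally, may be skipped entirely
};

struct PhysicsObject {
    SimDomain domain;
    float     pos[3];
    float     vel[3];
};

// Only authoritative objects go over the wire; cosmetic debris never does,
// so clients without GPU physics simply render fewer particles.
std::vector<const PhysicsObject*>
buildReplicationList(const std::vector<PhysicsObject>& objects) {
    std::vector<const PhysicsObject*> out;
    for (const auto& o : objects)
        if (o.domain == SimDomain::ServerAuthoritative)
            out.push_back(&o);
    return out;
}

int main() {
    std::vector<PhysicsObject> world = {
        { SimDomain::ServerAuthoritative, {0.0f, 0.0f, 0.0f}, {1.0f, 0.0f, 0.0f} }, // e.g. a physics-driven door
        { SimDomain::ClientCosmetic,      {0.0f, 2.0f, 0.0f}, {0.0f, -1.0f, 0.0f} } // e.g. a paper scrap
    };
    return buildReplicationList(world).size() == 1 ? 0 : 1; // only the door is replicated
}

Until the baseline hardware can run the authoritative bucket fast enough for everyone, that bucket stays small and the GPU effects stay cosmetic.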
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Everything PhysX can do, the CPU can do with the proper code. So developers are getting paid not to utilize the CPU to achieve the same effects.

Don't you mean a CPU could do everything a GPU can do with PhysX, if coded for it? Albeit a hell of a lot slower.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
They can't add in true object interaction considering how small the player base that can actually run something like that is. How would you handle an online FPS where one player sees a crumbling building and the other can't? It will take time before we get a baseline of performance where something like that can actually happen. It may take years. One thing that would help this along greatly is for Microsoft to get off their ass and implement it in DirectX.

http://www.youtube.com/watch?v=V_NJLMwZN1c&feature=related
Particles DirectCompute

http://www.youtube.com/watch?v=uhTuJZiAG64
N-body DirectCompute

http://www.youtube.com/watch?v=K1I4kts5mqc&feature=related
Ocean/waves DirectCompute

http://www.youtube.com/watch?v=dXy_ssSGuy0&feature=related
N-body OpenCL

http://www.youtube.com/watch?v=-7yTRxJhVps
Smoke particles OpenCL

http://www.youtube.com/watch?v=RoIyRoANAy4
Destruction OpenCL


Microsoft shouldn't be implementing anything, developers should.
The problem is that there's no real impetus to do so. Why bother adding OpenCL or DirectCompute paths to the game code when only PCs can take advantage of it?
The main thing that needs to happen is that Havok needs to sort out an OpenCL/DirectCompute link, NV needs to do the same, and Bullet/DMM need to start pushing their solutions.

Plus AMD needs to get off their asses and make sure their OpenCL/DirectCompute support is there, and we need the cards to become widespread (which they are), baseline performance to be sufficient, and eventually consoles to start supporting the sorts of technologies which might allow better physics (which can't really happen until the next gen comes round).
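
(For anyone wondering what those N-body clips are actually crunching: it's roughly the all-pairs gravity sum below, written here as plain single-threaded C++ purely for illustration. The DirectCompute/OpenCL demos run the same per-body loop across thousands of GPU threads at once, which is the whole trick.)

#include <cmath>
#include <vector>

struct Body { float x, y, z, mass, vx, vy, vz; };

// One naive O(N^2) N-body step: every body feels gravity from every other body.
// This inner loop is what the GPU demos parallelise -- one thread per body.
void stepNBody(std::vector<Body>& bodies, float dt, float G = 6.674e-11f) {
    const float soft = 1e-3f; // softening term so coincident bodies don't divide by zero
    for (auto& bi : bodies) {
        float ax = 0.0f, ay = 0.0f, az = 0.0f;
        for (const auto& bj : bodies) {
            float dx = bj.x - bi.x, dy = bj.y - bi.y, dz = bj.z - bi.z;
            float r2 = dx * dx + dy * dy + dz * dz + soft;
            float invR3 = 1.0f / (std::sqrt(r2) * r2);
            ax += G * bj.mass * dx * invR3;
            ay += G * bj.mass * dy * invR3;
            az += G * bj.mass * dz * invR3;
        }
        bi.vx += ax * dt; bi.vy += ay * dt; bi.vz += az * dt;
    }
    for (auto& b : bodies) { b.x += b.vx * dt; b.y += b.vy * dt; b.z += b.vz * dt; }
}

int main() {
    std::vector<Body> bodies(256, Body{0, 0, 0, 1.0f, 0, 0, 0});
    float xo = 0.0f;
    for (auto& b : bodies) { b.x = xo; xo += 1.0f; } // spread them out a bit
    for (int step = 0; step < 100; ++step) stepNBody(bodies, 0.01f);
    return 0;
}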
 

formulav8

Diamond Member
Sep 18, 2000
7,004
523
126
The only thing I've seen AMD whine about is that nVidia is making it a 'closed' standard.

From my point of view, AMD has a valid argument in wanting an open standard for physics. They want it to keep from having to pay licensing fees or from being hurt by being locked out of PhysX, and I want an open standard so I don't have to deal with yet another standard being locked to a single product or manufacturer.



Jason
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
I find it very funny. Every time Richard Huddy makes a speech, fans simply believe every single word he says, as if he were a god, without a single question.

Seriously, who are the developers he spoke with? Names? Quotes? What about a short YouTube clip? How much are they being paid? What exactly are they being paid for? Who exactly is paying them?

Sorry, but physics isn't what ATI is strong at. At best they claimed to support something that is open source, meaning they don't need to pay a dime for it and don't hold any responsibility.

Havok physics isn't bad, but the problem is there aren't a lot of templates or APIs to make things easier. Seriously, what does physics have to do with Pac-Man? Oh wait, Pac-Man isn't good anymore. Wait, what makes a game age? What are the differences between Doom and Doom 3? StarCraft and StarCraft II?

If you ever look up SC2 on YouTube, you will see that the gameplay is identical to SC, so what makes it new? Well, SC2's graphics are by far the best compared to older real-time strategy games.

So are programmers being paid to make SC2? Of course. It also fully supports 3D and has lots of physics. Guess what, it uses Havok. Why not PhysX? Well, they don't want the potential of their graphics limited by vendor, which is okay. In fact, it runs on OpenGL, meaning it isn't platform specific. The cool thing is, it runs on DX9. While all of that is good, how long did it take to develop the game? Lol, 5+ years. StarCraft was released in March 1998.

So what are DX10, DX11 and tessellation? Well, all the function points were there in DX9, or at least they were possible in DX9. Anything that can be done with DX11 can be done with OpenCL/GL. So why the DirectX label? What about Mac users?

Of course most people don't care, because MS dominates the OS market. From time to time there are free OSes like Unix, Linux, Red Hat, and Ubuntu. They are free, but it's no secret how difficult they are to use. Bugs and limitations are something users understand and tolerate in something that is free. Of course there are programmers who can fix things for themselves, and then there is google-fu.

If you ever try to make your own game, then you will know the pain. No, a game like Pac-Man won't sell. A game like Doom won't sell. Anything that doesn't require a mid-to-high-end PC simply won't sell. So how can I create a game that does require one?

Game engines save the day. Unfortunately, other than PhysX there are no other premade tools for physics. Yes, Havok is free, but how do you use it? Call ATI and ask? PhysX, on the other hand, is easier: there are premade "interactive papers and objects," and Nvidia actually sends guys/gals to assist with that and other matters. Guess what, I can borrow some of the pre-existing code, add new stuff to it, and bam, a new game.

People may believe that PhysX is nothing but flying paper and that TWIMTBP is used to make ATI look bad. The fact is, however, that without it many games would have run out of resources before ever coming to life. Using those game engines requires you to pay up at publishing time, but TWIMTBP doesn't.

Ownership means control and responsibility. MS owns and controls DX. MS can alter whatever they want, give it a new name, and not share it with other OSes, but at the same time MS needs to make sure that it works, and works better than the alternatives, to keep it alive. Same thing with PhysX.

If Nvidia never had something proprietary to them, then there probably wouldn't be programs like TWIMTBP. Without TWIMTBP, many games would never come to life. In fact, if Nvidia's hardware isn't needed, then there won't be an Nvidia. Who loses here? US, THE ONES WHO LIKE TO PLAY GAMES!!!

Just like Nvidia, ATI put a hardware tessellation unit on their cards to counter PhysX, had "Get in the Game" to counter TWIMTBP, and Stream to counter CUDA. There were games that used tessellation many years ago - ATI did the same thing. The difference is, ATI failed at the time and tessellation went under the table.

Richard Huddy misled a lot of people. An open standard is good, but that doesn't mean free. Stream doesn't run on Nvidia hardware, just as CUDA doesn't run on ATI. Stream and CUDA do more or less the same thing but with different standards, and only one will survive. Was ATI as good as you think? If ATI were willing to follow the standard made by Nvidia, then ATI users could enjoy PhysX and probably CUDA too. Nvidia bought Ageia for PhysX; ATI bought nothing. Freeware doesn't mean ATI. Open source doesn't mean ATI. ATI is cheap on this stuff so they can make cheaper video cards. This really isn't rocket science.
 

Schmide

Diamond Member
Mar 7, 2002
5,745
1,036
126
Richard Huddy misled a lot of people. An open standard is good, but that doesn't mean free. Stream doesn't run on Nvidia hardware, just as CUDA doesn't run on ATI. Stream and CUDA do more or less the same thing but with different standards, and only one will survive. Was ATI as good as you think? If ATI were willing to follow the standard made by Nvidia, then ATI users could enjoy PhysX and probably CUDA too. Nvidia bought Ageia for PhysX; ATI bought nothing. Freeware doesn't mean ATI. Open source doesn't mean ATI. ATI is cheap on this stuff so they can make cheaper video cards. This really isn't rocket science.

Did you miss this thread?

There were two major players in the physics platform space: nVidia bought Ageia, and Intel bought Havok.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Richard Huddy misled a lot of people. An open standard is good, but that doesn't mean free. Stream doesn't run on Nvidia hardware, just as CUDA doesn't run on ATI. Stream and CUDA do more or less the same thing but with different standards, and only one will survive. Was ATI as good as you think? If ATI were willing to follow the standard made by Nvidia, then ATI users could enjoy PhysX and probably CUDA too. Nvidia bought Ageia for PhysX; ATI bought nothing. Freeware doesn't mean ATI. Open source doesn't mean ATI. ATI is cheap on this stuff so they can make cheaper video cards. This really isn't rocket science.

Will Intel follow AMD and get a license for SSE4A? Will Intel license x86 to nVidia? Will Apple license their platform to HP? Business is business, and licensing from your competitor rarely makes sense, especially with GPU PhysX, whose adoption isn't very broad. I think that Stream is only a temporary solution until OpenCL/DirectCompute are widely adopted, and CUDA for sure will take off in the GPGPU market, not in the consumer market - how many applications currently exist for the regular consumer that take advantage of such APIs?

Making cheaper video cards has nothing to do with trimming features like GPGPU computing. ATi's position is video cards for gaming with GPGPU functionality; nVidia's position is GPGPU cards with gaming functionality - vastly different approaches. Neither approach is perfect, but a moderate die size with excellent performance per mm² looks better than a huge die with huge costs and a high price.

You're right, there's no way the CPU can handle all the calculations involved in the GPU PhysX effects we've seen, but at the same time most of the GPU PhysX effects we've seen have been gross abuse of the PPU, WAY overcalculating the physics of pieces of paper floating around (just to make sure there are too many calculations for the CPU to handle?).

Yes, they probably got very realistic results, but gaming is all about using approximate algorithms that give you the same impression of realism at 1/10th the cost. We don't need to calculate the actual dynamics of each floating piece of paper to make it look like pieces of paper are floating around.

Once GPU physics really catches on it'll be inherent in every part of the physics process, and then we will see cool stuff (eye candy or otherwise) instead of 2 or 3 effects that are overcalculated for the sake of calculating.

Battlefield: Bad Company 2 uses mouth-watering physics effects and they run on the CPU; Cryostasis's physics effects look like crap and run like crap. Your points are very valid.

Also, if you can get the effect by using crappy inefficient code rather than efficient well executed code, programmers will do it every single time. Not sure why that is, but it's a basic truism, it seems. :D

I couldn't agree more with you. Most developers won't multi-thread the PhysX code to run on the CPU, hence the weak performance of PhysX when it's run on the CPU - most titles use only one core. It might be intentional, I guess; something has to justify GPU PhysX usage.
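
Just to show how unexotic spreading the easy parts across cores is, here's a bare-bones sketch (my own illustration - it has nothing to do with how the PhysX SDK is actually structured) that chunks a plain integration pass over however many hardware threads are available:

#include <algorithm>
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

struct RigidBody { float pos[3]; float vel[3]; };

// Integrate one chunk of bodies. Each body is independent here, so chunks
// can run on separate cores with no locking at all.
static void integrateRange(std::vector<RigidBody>& bodies,
                           std::size_t begin, std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i)
        for (int k = 0; k < 3; ++k)
            bodies[i].pos[k] += bodies[i].vel[k] * dt;
}

// Split the update across all available hardware threads.
void integrateAll(std::vector<RigidBody>& bodies, float dt) {
    unsigned n = std::max(1u, std::thread::hardware_concurrency());
    std::size_t chunk = (bodies.size() + n - 1) / n;
    std::vector<std::thread> workers;
    for (unsigned t = 0; t < n; ++t) {
        std::size_t begin = t * chunk;
        std::size_t end = std::min(bodies.size(), begin + chunk);
        if (begin >= end) break;
        workers.emplace_back(integrateRange, std::ref(bodies), begin, end, dt);
    }
    for (auto& w : workers) w.join();
}

int main() {
    std::vector<RigidBody> bodies(100000, RigidBody{{0.0f, 10.0f, 0.0f}, {0.0f, -1.0f, 0.0f}});
    for (int step = 0; step < 60; ++step) integrateAll(bodies, 1.0f / 60.0f);
    return 0;
}

Contact solving is harder to split than this, but straight integration and broad-phase style work parallelises about this easily, which is why the single-core behaviour looks so suspicious.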
 

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76

First, as far as I know TruForm was never IP, just a trademarked name for tessellation, which is simply a rendering method - much like AA, AF, etc. are all methods of improving graphics rather than specific IP. So anyone would be able to do it just by setting up the vertex buffers properly (which NV has demos of doing back on G80 hardware as well).

This is similar to Eyefinity today: nothing is stopping nVidia from doing surround monitors (and they will soon); it's just ATI's trademarked name. This differs from PhysX in that PhysX is IP, with licensing fees and restrictions involved. TruForm isn't an example of ATI vendor-locking a feature.

So, while I don't disagree that part of why ATI goes for open standards (not the same thing as open source) is to save money, it's not something they've necessarily been hypocritical about. It's also good business practice and better for the industry as a whole (re: Glide vs DirectX). Also, I don't believe having 3rd parties do the work stymies innovation.

For example, 3d glasses existed on PC for both Nvidia and ATI long before Nvidia's own:
http://www.techspot.com/reviews/hardware/ed_3dglasses/
Still exist for ATI:
http://www.xbitlabs.com/articles/monitors/display/iz3d.html
And soon newer, better ones than we've even seen yet:
http://www.youtube.com/watch?v=1DU4u5Z133k
The real hurdle has been waiting for 120Hz LCD monitors to come along, since we lost that capability when we switched off CRTs - not because AMD hasn't been pushing it or because nVidia has been. Now that 120Hz monitors exist, a 3rd party will make some money off making 3D glasses again, and there was no need for a proprietary ATI solution. Market forces themselves can provide one.

As for accelerated physics, AGEIA made PhysX. Not Nvidia. 3rd party innovation again. The only reason PhysX is vendor locked to Nvidia is because Nvidia bought it.

If AGEIA still controlled it, we don't know what the situation would be. Most likely the PPU device would have failed to make money and they would have evolved PhysX into an open standard running on OpenCL with licensing fees for developers, or a proprietary standard with licensing fees for nVidia and ATI. But by being a neutral party, it's a much safer business proposition for either company to enter into (and no, AMD isn't so cheap as to not license tech - they're already paying licensing fees for every HDMI port).

Either way we'd possibly already have a ubiquitous physics platform by now, at least for PC.
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
As opposed to AMD's PR wing coming out and saying they absolutely LOVE physx?
 

Kakkoii

Senior member
Jun 5, 2009
379
0
0
what are some good examples of the use of PhysX?


I see Batman brought up. I haven't played it, but I've seen some YouTube videos of PhysX vs. non-PhysX. In one part Batman is kicking up a lot of papers. That was stupid. Another part shows some flags that can be torn vs. the non-PhysX version. That looked kinda cool. Don't know if it really added to the game, but it did look nice.

Why the kicking of the papers though? Was he fighting the villain from Office Space who throws around files all out of order?

Trine's gameplay was mainly based around using physics to traverse through the zones. It used PhysX.

http://www.youtube.com/watch?v=bhSx5zZy9OQ


And a notable mention for this topic is Metro 2033:
http://www.pcgameshardware.com/aid,706182/Exclusive-tech-interview-on-Metro-2033/News/

They said they supported it before it was even Ageia's. Because it is a good physics engine.

And a list of games with varying degrees of PhysX use:
http://en.wikipedia.org/wiki/PhysX#Games


Mirror's Edge:
http://www.youtube.com/watch?v=uGvNpYUyMcY&feature=fvw
 