AMD talks GPU Gaming Physics

akugami

Diamond Member
Feb 14, 2005
6,210
2,551
136
For those that didn't RTFA: Manju Hegde was a co-founder of Ageia and worked at nVidia for a bit after PhysX was snapped up by nVidia. He currently works for AMD, so he's not exactly an impartial observer. Still, I think some of what he says makes a lot of sense and somewhat confirms what a lot of us thought about PhysX.

Manju Hegde said:
My whole point in starting Ageia was to make physics mainstream, so Nvidia has a few PhysX games - I was at Nvidia for a couple of years, and we did get a few games - but I can tell you that it's still not easy to get a game developer to use physics in a meaningful way.

Pretty much a confirmation that nVidia has had to bend over backwards to get anyone to use PhysX, and that game physics has not really been used in any way that advances gameplay. Without nVidia's support, I doubt any developer would voluntarily use PhysX in its current form.

AMD would be foolish to license that because it would just be an engineering nightmare. I'm just talking in the abstract here, but to me it doesn't make sense, and I think Nvidia's being disingenuous by making a claim like that. If it was a standard and open system, like Khronos does, then we would have a lobby so we could make changes in the API, but that's not the same with a proprietary API.
The "that" Hegde was referring to was PhysX. I know there were a lot of nVidia fanboys swearing up and down that AMD was holding back gaming by not licensing PhysX. Well, the guy who helped create PhysX is saying it would have been a pain in the rear for AMD to support PhysX in its current form, and that until it gets ported to something like OpenCL, don't expect AMD to support it.


I agree with thilanliyan that it's another article about AMD/ATI talking about their future direction, which they're still talking about years later. We've seen it with their GPU-accelerated encoding/transcoding, their GPGPU efforts, and their physics efforts. The recent hires Hegde mentions do give you hope of something concrete finally coming out, but at the same time we've heard the same song over and over from AMD.
 

wuliheron

Diamond Member
Feb 8, 2011
3,536
0
0
Yeah, it's a decent article but it doesn't even really try to put the issue in context.

I see the current video game industry as similar to the old movie industry. For a long time, special effects movies were for the most part low-budget horror, fantasy, and sci-fi films, with the occasional higher-budget production like "2001: A Space Odyssey". Then along came another low-budget film called "Star Wars" that showed just how profitable special effects movies could be with the right technology and lots of seamless special effects.

Without a doubt, the current generation of graphics cards and soon-to-be-released 8-core processors are capable of some serious physics and AI, but it will be years before such things become commonplace. Nor would I hold my breath waiting for the next generation of consoles, with video game developers producing maybe a handful of really successful titles a year and watching cheap indie games eat into the sales of the rest. It looks like we'll just have to wait for another George Lucas to put the pieces together on a shoestring budget and show them how it's done.
 

sandorski

No Lifer
Oct 10, 1999
70,677
6,250
126
Yeah, it's a decent article but it doesn't even really try to put the issue in context.

I see the current video game industry as similar to the old movie industry. For a long time, special effects movies were for the most part low-budget horror, fantasy, and sci-fi films, with the occasional higher-budget production like "2001: A Space Odyssey". Then along came another low-budget film called "Star Wars" that showed just how profitable special effects movies could be with the right technology and lots of seamless special effects.

Without a doubt, the current generation of graphics cards and soon-to-be-released 8-core processors are capable of some serious physics and AI, but it will be years before such things become commonplace. Nor would I hold my breath waiting for the next generation of consoles, with video game developers producing maybe a handful of really successful titles a year and watching cheap indie games eat into the sales of the rest. It looks like we'll just have to wait for another George Lucas to put the pieces together on a shoestring budget and show them how it's done.

Star Wars was not "low" budget. IIRC, it was the largest-budget movie of all time up until then.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Your mainstream games are all about return on investment. If you can just take an FPS game, change the models, add some updated graphics, and rework the weapons/damage charts, and it will sell as much as a ground-up production, then why bother to do more? Especially when you have some devs with a one-patch-only policy (Activision). If you were to design something all new, just the QA and beta testing would increase the cost tenfold. Why bother, if you can sell a painted-up pig for the same amount of money?
 

wuliheron

Diamond Member
Feb 8, 2011
3,536
0
0
I stand corrected: it cost $11 million, which was $2 million more than the other big special effects hit of the time, "Jaws". Nonetheless, that's quite low budget by the modern standard of $150-300 million for a special effects movie. In contrast, a AAA video game costs a mere $50-70 million to produce while still pulling in similar revenues. In other words, games are still in the relatively low-budget category despite their success and the obvious improvements that can be made to physics and AI. Sooner or later someone will step in to show the bean counters how it's done.
 
Last edited:

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Am I the only one who isn't excited about GPU physics at the moment? I mean, GPUs already have their hands full just doing the graphics. Unless adding a second GPU becomes mandatory or GPUs become much more powerful, I think physics is better off on the CPU.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Consoles.
They don't have hardware physics, so it's difficult to add it at a base level for games.
That's what's been holding proper game-changing physics back, and it's why PhysX on the GPU has come to nothing relative to the potential of hardware physics.

Given the decline in PC-only games, and the fact that the games which are PC-only are often lower budget or mass market, it becomes difficult to really do anything.
PhysX so far has been nothing close to what it could have been if the environment were more supportive (not just NV vs. AMD, but PC vs. consoles as well). The best hope is for some form of OpenCL support on next-generation games consoles, and lots of GPU power.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
Co-founded by Manju Hegde, a new start-up called Ageia ambitiously launched its dedicated PhysX accelerator; a technology that's now a part of Nvidia's family
However, last year Hegde was poached by AMD to join the Fusion team, prompting all sorts of speculation about AMD's plans for competing with PhysX.
Ahhh yes... hire the guy who made PhysX :) The GPGPU side of the APU has to be used for something if you throw in a discrete GPU with it, right?

In short, it looks as though AMD is now putting some serious money behind gaming physics, and with a developer-friendly business model, not to mention wide-ranging hardware support, Bullet Physics has the potential to take over from CUDA-accelerated PhysX. Whether this will translate into fully fledged game-changing physics remains to be seen, but if future consoles use OpenCL-compatible GPU hardware (and they probably will), and GPU-accelerated physics on PCs indeed opens up to multiple hardware platforms, then it looks as though gaming physics might actually start to take off in future.
They mention like 4 games that use Bullet Physics, and how all the big movie companies use Bullet Physics for their films... and there's a lot of talk about the OpenCL physics, which is Bullet Physics.

I just want to see it in a lot of games now ^-^ Enough talk, get it in games.



Am I the only one who isn't excited about GPU physics at the moment? I mean, GPUs already have their hands full just doing the graphics. Unless adding a second GPU becomes mandatory or GPUs become much more powerful, I think physics is better off on the CPU.

Mainstream setups will have the APU's GPU do the Bullet physics and the discrete card do the actual game graphics, so you could run Bullet physics in games without a performance hit. ^-^
 
Last edited:

wuliheron

Diamond Member
Feb 8, 2011
3,536
0
0
Am I the only one who isn't excited about GPU physics at the moment? I mean, GPUs already have their hands full just doing the graphics. Unless adding a second GPU becomes mandatory or GPUs become much more powerful, I think physics is better off on the CPU.


Physics is better off on both the CPU and the GPU. There's only so much even an 8-core CPU can do with physics; past a certain point you need something like 80 cores or more, which only a GPU can provide. That graphics cards will become seriously more powerful just in the next four years goes without saying. Four years ago it would have taken a $10,000 rig to play Crysis maxed out, while now a $1,500 rig can do the trick.

I think Lonyo hit the nail on the head. The next generation of consoles should have quad core processors and graphics cards capable of physics and AI. Then maybe you'll understand just what you've been missing all these years. It isn't just a marketing gimmick by any stretch of the imagination.
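The reason this kind of workload scales from 8 CPU cores to 80+ GPU cores is that, within a timestep, each body's update is independent of the others, so the work is data-parallel. A minimal sketch in Python (the particle model, constants, and `step` function are all illustrative assumptions, not any engine's actual API):

```python
# Toy data-parallel physics step: point masses under gravity, explicit
# Euler integration. Each particle's update depends only on its own state,
# so the loop is embarrassingly parallel -- the kind of workload that maps
# onto hundreds of GPU cores rather than a handful of CPU cores.

GRAVITY = -9.81   # m/s^2, acting on the y axis
DT = 1.0 / 60.0   # one 60 Hz frame

def step(particles):
    """Advance every (y, vy) particle one frame; updates are independent."""
    return [(y + vy * DT, vy + GRAVITY * DT) for (y, vy) in particles]

particles = [(10.0, 0.0), (5.0, 2.0)]  # (height, vertical velocity)
particles = step(particles)
```

The only thing that breaks this perfect independence in real engines is collision resolution, which is exactly why physics middleware splits work between CPU (broad-phase, constraints) and GPU (massively parallel per-body math).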
 
Last edited:

deimos3428

Senior member
Mar 6, 2009
697
0
0
I stand corrected: it cost $11 million, which was $2 million more than the other big special effects hit of the time, "Jaws". Nonetheless, that's quite low budget by the modern standard of $150-300 million for a special effects movie. In contrast, a AAA video game costs a mere $50-70 million to produce while still pulling in similar revenues. In other words, games are still in the relatively low-budget category despite their success and the obvious improvements that can be made to physics and AI. Sooner or later someone will step in to show the bean counters how it's done.

That $11M in 1977 is worth about $40M in today's dollars according to the CPI. The cost of going to the movies has increased significantly more than other goods/services, so it's not unreasonable to assume the cost of producing movies has gone up more than average as well. It's not easy to calculate precisely, but the difference isn't nearly as stark as it seems. I peg it at about double the average rate, which works out to $80M.
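For what it's worth, the arithmetic checks out roughly. A quick sketch (the annual-average CPI-U figures of ~60.6 for 1977 and ~224.9 for 2011 are approximate, from memory; treat them as illustrative, not authoritative):

```python
# Rough inflation adjustment of Star Wars' reported $11M budget.
# Assumption: CPI-U annual averages of ~60.6 (1977) and ~224.9 (2011).

CPI_1977 = 60.6
CPI_2011 = 224.9

budget_1977 = 11e6
cpi_adjusted = budget_1977 * (CPI_2011 / CPI_1977)  # ~$41M, the CPI figure
film_rate_adjusted = cpi_adjusted * 2               # ~$82M if film costs rose at twice CPI

print(round(cpi_adjusted / 1e6), round(film_rate_adjusted / 1e6))
```

The "double the average" multiplier is of course a judgment call, not a measured number; the point is only that a 1977 budget looks a lot less "low" once adjusted.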
 
Last edited:

Genx87

Lifer
Apr 8, 2002
41,091
513
126
I'll believe it when I see it. ATI/AMD has been a string of disappointments on the dev-relations side for the past 6-7 years.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
Interesting, will have to wait to see if it pans out.

The big problem GPU physics shows me as a gamer today is that it's not impressive, it brings little to the game, and the performance hit is ridiculous for what little it does. That's speaking of PhysX; perhaps if AMD actually does this, they will implement it in a way that delivers more game-changing elements while using fewer resources.

Clearly GPU PhysX has been an absolute failure. In five years there are 16 games that use it, four of which are actually decent. The best example is Mafia II, and it does not bring much, if anything, not seen on the CPU, and it is a huge resource hog.

If AMD can get physics to the point where I can knock down and destroy all objects in the game world, and that manipulation then translates into a direct change in how I interact with the game world, it will be worthwhile. Plus, it can't require its own dedicated $200+ card just to be passably playable.
 

thilanliyan

Lifer
Jun 21, 2005
12,039
2,251
126
If AMD can get physics to the point where I can knock down and destroy all objects in the game world, and that manipulation then translates into a direct change in how I interact with the game world, it will be worthwhile. Plus, it can't require its own dedicated $200+ card just to be passably playable.

Fully destructible environments were present in Red Faction: Guerrilla using Havok. It was pretty cool, actually... for example, to destroy a building, you could take it out strategically (at weak points) using very little explosive, or you could go on a rampage and destroy it from the top down. I don't know if it had a lot of fluid/particle simulation, though.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Interesting, will have to wait to see if it pans out.

The big problem GPU physics shows me as a gamer today is that it's not impressive, it brings little to the game, and the performance hit is ridiculous for what little it does. That's speaking of PhysX; perhaps if AMD actually does this, they will implement it in a way that delivers more game-changing elements while using fewer resources.

Clearly GPU PhysX has been an absolute failure. In five years there are 16 games that use it, four of which are actually decent. The best example is Mafia II, and it does not bring much, if anything, not seen on the CPU, and it is a huge resource hog.

If AMD can get physics to the point where I can knock down and destroy all objects in the game world, and that manipulation then translates into a direct change in how I interact with the game world, it will be worthwhile. Plus, it can't require its own dedicated $200+ card just to be passably playable.
Well said.

I think developers get lost in using GPU PhysX: they find something that GPU PhysX can compute and then throw it into the game, regardless of the performance-hit trade-off. They should be asking themselves, "is this the best way to render this effect?" Many times a simple texture will emulate the effect with 95% accuracy at 5% of the performance penalty.

However, this is the way it's going to be until a universal solution is developed.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Interesting, will have to wait to see if it pans out.

The big problem GPU physics shows me as a gamer today is that it's not impressive, it brings little to the game, and the performance hit is ridiculous for what little it does. That's speaking of PhysX; perhaps if AMD actually does this, they will implement it in a way that delivers more game-changing elements while using fewer resources.

Clearly GPU PhysX has been an absolute failure. In five years there are 16 games that use it, four of which are actually decent. The best example is Mafia II, and it does not bring much, if anything, not seen on the CPU, and it is a huge resource hog.

If AMD can get physics to the point where I can knock down and destroy all objects in the game world, and that manipulation then translates into a direct change in how I interact with the game world, it will be worthwhile. Plus, it can't require its own dedicated $200+ card just to be passably playable.

Imho,

Where do you get five years from? All of GPU PhysX, to me, is a way to get the ball rolling: trying to create foundations and tools, and to get content in there so there may be awareness. Innovative ways to try to improve immersion and place the workloads on GPU strengths.

As long as GPU physics matures, I'll be very pleased, but for GPU physics to truly shine, one may need to look at establishing open standards, so there is less division and titles can really enhance gameplay as well as fidelity. I understand the concept of differentiation, and it is important, but maybe it's time to port this to OpenCL. The key question is how much more mature CUDA is than OpenCL.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
However, this is the way it's going to be until a universal solution is developed.

This.

It'll be niche so long as it is not an industry standard, for all the obvious reasons.

Just look at hardware in the x86 world: for all the innovation that AMD and Intel do, it goes nowhere until they cross-license it and programmers use it. What happened to 3DNow!? What about SSE5, or SSE4.2?

Even with Intel's 80%+ marketshare they still have a hard time convincing programmers to take up their ISA extensions if the programmers aren't reasonably convinced that the ISA will be ubiquitous (licensed and implemented by AMD).

I really scratch my head at Nvidia's strategy with Physx. Of course I scratched my head at ATI with their tessellator/shader push in DX10.1 too.

If Physx doesn't become part of DX12 then I think we can safely write it off as a tech that has already peaked and is just taking its time in the inevitable life-cycle of becoming obsolete. 3DNow all over again.

It doesn't need to become an open standard; DX11 is not an open standard. But it needs to become ubiquitous the way DX11 is ubiquitous, in my opinion.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
This.

It'll be niche so long as it is not an industry standard, for all the obvious reasons.

Just look at hardware in the x86 world: for all the innovation that AMD and Intel do, it goes nowhere until they cross-license it and programmers use it. What happened to 3DNow!? What about SSE5, or SSE4.2?

Even with Intel's 80%+ marketshare they still have a hard time convincing programmers to take up their ISA extensions if the programmers aren't reasonably convinced that the ISA will be ubiquitous (licensed and implemented by AMD).

I really scratch my head at Nvidia's strategy with Physx. Of course I scratched my head at ATI with their tessellator/shader push in DX10.1 too.

If Physx doesn't become part of DX12 then I think we can safely write it off as a tech that has already peaked and is just taking its time in the inevitable life-cycle of becoming obsolete. 3DNow all over again.

It doesn't need to become an open standard; DX11 is not an open standard. But it needs to become ubiquitous the way DX11 is ubiquitous, in my opinion.
Nope, it needs to become an open standard, or have all the completely necessary aspects implemented in an open standard of some sort (e.g. OpenGL vs DX).

The main concern for most mass market games, and big games, is consoles. You can't do anything with physics if you can't run it on consoles.
DX11 will never run on all consoles, and CUDA cannot be guaranteed to run on future consoles (obviously it could if everything were NV-powered). Therefore you need something that can, in some form, run on all consoles. If you can't make all the platforms run the game the same (excluding any weak consoles which get their own version of a game), you can't reasonably expect many (any) games to have game-changing physics that requires some sort of beyond-CPU acceleration, when any such version would be PC-only.
 

wuliheron

Diamond Member
Feb 8, 2011
3,536
0
0
I agree, once both the hardware and open standards become widely available then we will see significant progress. Physics can be a resource hog, but like any other effect programmers often figure out new ways to simplify the equations to produce good results without the same resource drain. Without the hardware and open standards being commonplace the whole process would bog down again and progress would slow to a crawl.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
This.

It'll be niche so long as it is not an industry standard, for all the obvious reasons.

Just look at hardware in the x86 world: for all the innovation that AMD and Intel do, it goes nowhere until they cross-license it and programmers use it. What happened to 3DNow!? What about SSE5, or SSE4.2?

Even with Intel's 80%+ marketshare they still have a hard time convincing programmers to take up their ISA extensions if the programmers aren't reasonably convinced that the ISA will be ubiquitous (licensed and implemented by AMD).

I really scratch my head at Nvidia's strategy with Physx. Of course I scratched my head at ATI with their tessellator/shader push in DX10.1 too.

If Physx doesn't become part of DX12 then I think we can safely write it off as a tech that has already peaked and is just taking its time in the inevitable life-cycle of becoming obsolete. 3DNow all over again.

It doesn't need to become an open standard; DX11 is not an open standard. But it needs to become ubiquitous the way DX11 is ubiquitous, in my opinion.

Nvidia's strategy is pretty clear: push the technology with their proprietary standard to sell video cards, then, once AMD gets off their ass, move to OpenCL. Unless, of course, Microsoft gets off their ass and incorporates physics into DX. Right now Nvidia enjoys a monopoly over GPU-accelerated physics, and until somebody else makes a move, they will continue to enjoy it.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Nope, it needs to become an open standard, or have all the completely necessary aspects implemented in an open standard of some sort (e.g. OpenGL vs DX).

The main concern for most mass market games, and big games, is consoles. You can't do anything with physics if you can't run it on consoles.
DX11 will never run on all consoles, and CUDA cannot be guaranteed to run on future consoles (obviously it could if everything were NV-powered). Therefore you need something that can, in some form, run on all consoles. If you can't make all the platforms run the game the same (excluding any weak consoles which get their own version of a game), you can't reasonably expect many (any) games to have game-changing physics that requires some sort of beyond-CPU acceleration, when any such version would be PC-only.

I really don't see why it has to be an "open" standard. I would actually prefer physics be implemented into DX. Overnight we would have a standard that ships on millions of PCs a month.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
I really don't see why it has to be an "open" standard. I would actually prefer physics be implemented into DX. Overnight we would have a standard that ships on millions of PCs a month.

A million PCs a month doesn't mean much when most games, especially big games, are multi-platform...
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
Hmm, are you guys going to play a game under OpenWhatever rather than DirectX? Dynamic tessellation has been on ATI cards for years, but people are only now so high on it because of DX11...

Even now, I don't see dynamic tessellation being used in any meaningful way.