Plea of a Lurker: Don't let Physics Cards Fail

Chocolate Pi

Senior member
Jan 11, 2005
245
0
0
I typically read an obscene number of topics on this board without posting. From the AEG to CRTs, from NV40 to R600, from shimmering to jaggy shadows, I lurk and enjoy the proceedings.

But this latest issue of GPU-accelerated physics I am unable to stay quiet about. The majority attitude, it seems, is that Havok and nVidia have arrived to save us from the impending doom of big mean AGEIA forcing "yet another $300 component" down everyone's throats. Numerous clarifications need to be made.

Primarily, people seem to have the notion that this is Havok + nVidia vs. ATI + AGEIA. This is absolutely not the case. ATI was the first to publicly promise GPU-assisted physics. (Who did what first is insignificant and will not be discussed in this topic, BTW.) Havok might as well not be in the situation at all; it is ATI vs. nVidia vs. AGEIA. Keep in mind that Havok even admits their hardware-accelerated design SHOULD work on AGEIA's card if designed properly. It comes down to the fact that these are the companies who want you to buy something to do "the physics thing".

The unavoidable fact of the matter is, a dedicated piece of hardware such as a physics card should be VASTLY superior to a makeshift adaptation of hardware designed for a completely different purpose. GPUs are apparently superior to CPUs at this task, but from what we have seen a PPU ought to blow both completely away. GPU makers can't have that, though, not when they see a golden opportunity to encourage new product adoption at higher rates than ever before. In fact, this idea of scalable, managed resources fits perfectly with ATI and nVidia's ultimate goal of everyone buying multiple GPUs.

The GPU makers will insist that GPUs are a better investment, as multiple GPUs can also be applied to increased frame-generating horsepower or superior AA. This is marketing at its worst, a desperate attempt to cash in on the physics hype by making everyone overlook the fact that additional GPUs are a much more costly and much less powerful solution. This isn't to say that the technology behind it is a waste, however. Quite the opposite: the concept of GPU-accelerated physics is quite possibly what will make the otherwise unfeasible PPU successful. See, the PPU concept wants the industry to jump straight from 100% software to the high end of specialized, dedicated acceleration hardware. GPU-accelerated physics allows for the badly needed middle ground between these two, the bridge that will force developers to start coding hardware-assisted physics so that the PPU can flourish. More specifically, as dual-core chips and dual-GPU cards increase in numbers, we will have multiple tiers: Shared CPU -> Dedicated CPU -> Shared GPU -> Dedicated GPU -> Dedicated PPU. This will establish a development structure that allows for scalable physics, a development structure that the PPU needs to exist. (See the toy sketch below for what I mean by tiers.)
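Here's a rough toy sketch of what I'm picturing; nothing real, every name below is made up by me, just to show how a game launcher could pick whichever physics tier the machine happens to have and scale its simulation to match:

```cpp
// Toy sketch only: hypothetical names, nothing from any real SDK or driver.
#include <iostream>

enum class PhysicsTier {
    SharedCPU,     // physics shares a core with game logic
    DedicatedCPU,  // a spare core runs physics exclusively
    SharedGPU,     // physics steals cycles from the rendering GPU
    DedicatedGPU,  // a second GPU does nothing but physics
    DedicatedPPU   // a discrete physics card
};

// Hypothetical capability report a launcher might build at startup.
struct HardwareCaps {
    int  cpuCores   = 1;
    int  gpuCount   = 1;
    bool gpuPhysics = false;  // driver exposes a physics offload path
    bool hasPPU     = false;
};

PhysicsTier pickTier(const HardwareCaps& hw) {
    if (hw.hasPPU)                        return PhysicsTier::DedicatedPPU;
    if (hw.gpuPhysics && hw.gpuCount > 1) return PhysicsTier::DedicatedGPU;
    if (hw.gpuPhysics)                    return PhysicsTier::SharedGPU;
    if (hw.cpuCores > 1)                  return PhysicsTier::DedicatedCPU;
    return PhysicsTier::SharedCPU;
}

int main() {
    HardwareCaps hw{2, 1, true, false};  // e.g. dual-core, one GPU with physics offload
    std::cout << "picked tier " << static_cast<int>(pickTier(hw)) << '\n';
}
```

The point isn't the code; it's that once developers target a ladder like this at all, the top rung (a dedicated PPU) becomes just another detection case instead of an all-or-nothing gamble.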

Before I close, I would like to address anyone who still opposes a dedicated piece of hardware for physics: no possible outcome will generate a market where physics cards are required to play games. Furthermore, why do you oppose hardware functions being independent? Go buy a laptop where everything is one piece and be happy. Upgrading is for nerds, after all, right?

-An annoyed lurker
 

Chocolate Pi

Senior member
Jan 11, 2005
245
0
0
I don't think it is FUD; again, the technology provides the important middle ground. However, it is being marketed and talked about as the competitor to the PPU: "Sorry AGEIA, go home, we don't need you now!" The concept of GPU-assisted physics competes with the PPU about as much as the X1600XT competes with the 7900GTX. You can kinda compare them if you need to, but there is no point, as they are vastly different in price and power.
 

Cooler

Diamond Member
Mar 31, 2005
3,835
0
0
Originally posted by: Chocolate Pi
I don't think it is FUD; again, the technology provides the important middle ground. However, it is being marketed and talked about as the competitor to the PPU: "Sorry AGEIA, go home, we don't need you now!" The concept of GPU-assisted physics competes with the PPU about as much as the X1600XT competes with the 7900GTX. You can kinda compare them if you need to, but there is no point, as they are vastly different in price and power.

That's what I mean by FUD: they are trying to trick people into thinking they don't need AGEIA, but it would be comparing low end to high end.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: Chocolate Pi
Right, sorry for misunderstanding. That's one reason I don't post much! :p

That and the fact that you are always eating chocolate pie.
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,551
136
Now, this is only a what-if, but what if ATI and nVidia add a dedicated physics coprocessor, either on die or as an added chip on their GPUs? This is where I see ATI and nVidia headed. I am not sure AGEIA will survive with another expensive card to add to gaming. I'm of the opinion that ATI and nVidia will add instructions to their GPU cores, which are already massively parallel, to handle physics routines. It won't be as good as the AGEIA chips initially, but it'll boost things a bit, and that's what most developers will be using. Eventually both ATI and nVidia's PPU coprocessors will become more and more robust.

It's not that I think the AGEIA PPU will be a dog performance-wise, but with the cost, I don't think anyone but the hardcore will buy it. I think the incremental PPU boost from ATI and nVidia will be what most developers eventually program for, and it will become more powerful with time; a couple of years from now, it will be as powerful as AGEIA's PPU is now.
 

Chocolate Pi

Senior member
Jan 11, 2005
245
0
0
I'm not pro-AGEIA, I'm pro-dedicated hardware. If ATI and/or nVidia make their own dedicated PPU, on their own boards or bundled as co-processors, I see no difference from AGEIA's product. It's the makeshift adoption of GPUs as a replacement, however surprisingly adaptable they are, that I have a problem with.
 

BrownTown

Diamond Member
Dec 1, 2005
5,314
1
0
IMO this is a great idea; it just doesn't make any sense at all to have a dedicated physics processor. GPUs have the same type of ALUs needed to do the work that a physics processor would do, so it makes better sense to put it all on the same card. NVIDIA and ATI can both add additional hardware to their GPUs to optimize their use as physics processors.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: Chocolate Pi
I'm not pro-AGEIA, I'm pro-dedicated hardware. If ATI and/or nVidia make their own dedicated PPU, on their own boards or bundled as co-processors, I see no difference from AGEIA's product. It's the makeshift adoption of GPUs as a replacement, however surprisingly adaptable they are, that I have a problem with.

Well, I have concerns about locking people into dedicated physics hardware that could keep game designers from trying to explore other options for doing physics. So... maybe you should consider that your viewpoint might not be the only viable one.
 

Chocolate Pi

Senior member
Jan 11, 2005
245
0
0
Explore other options? Like what? Again, no market could possibly develop in the short term where PPUs are anything but a high-end solution. Developers will never be "locked in" to PPUs any more than they are "locked in" to high end GPUs.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: Chocolate Pi
Explore other options? Like what?

Like using a GPU to do physics calculations, for one. :p

Again, no market could possibly develop in the short term where PPUs are anything but a high-end solution. Developers will never be "locked in" to PPUs any more than they are "locked in" to high end GPUs.

My concern is that if the "PPU" basically ties you to a specific API and set of capabilities, it could mean that every game would have to use that same physics API to have access to the extra hardware. Depending on what that API is capable of doing, it could limit the way that such processing capabilities are actually used in games.

We've gone through a lot of effort over the last few years going from fixed-function GPUs to programmable ones; I don't want to see games locked into 'fixed-function PPUs' either if it can be avoided. Plus, it's another (potentially expensive) piece of hardware; if you can get a chunk of the benefit using spare GPU cycles (and/or taking a hit in framerate) but at no other extra cost, I think that's worth looking into.
 

the Chase

Golden Member
Sep 22, 2005
1,403
0
0
Well, I'm glad you posted this, as I didn't want to thread-crap in the other "yeah, Nvidia is saving us from having to buy more hardware" threads. This is just Nvidia and ATI trying to weasel in on the innovation that AGEIA already started. THEY ARE NOWHERE CLOSE TO BEING ABLE TO DO WHAT THEY ARE SAYING THEY WILL SOON BE ABLE TO DO. But they're telling people to wait: don't throw your money away on a PPU, buy into SLI or Crossfire instead and get a half-baked, lame excuse for physics. Marketing gobbledygook.

Now MAYBE in 2-3 years' time they could get the 2 or 4 or 6 graphics cards to process physics WELL and push pixels and do all of this at good frame rates. But this year, if I buy into the physics scene at all, I would rather go with a real piece of dedicated hardware and not a second graphics card that gets watered-down results... But people who already have SLI or Crossfire might feel differently and wait it out.

Being that games are still a ways off from really using either solution to its full extent, waiting to see how both develop might be the best bet.
 

Chocolate Pi

Senior member
Jan 11, 2005
245
0
0
I entirely agree that we should look into GPU acceleration of physics, as we need that to serve as an established mid-range solution. This is exactly what the vast majority needs, including many of the members of this forum and possibly myself. However, my post (which is admittedly not as clear as it could be) is meant to emphasize that you shouldn't throw out the 7900 because the 7600 is cheaper and more accessible, so to speak.

As far as APIs go, the possibility of Havok running on AGEIA's own PPU is very promising, and we can only hope AGEIA doesn't shoot themselves in the foot by eliminating that possibility. Again, I am not pro-AGEIA; a "non-partisan" PPU or co-processor by ATI or nVidia would provide an equally beneficial high-end solution.

 

Golgatha

Lifer
Jul 18, 2003
12,396
1,068
126
Originally posted by: ArchAngel777
Originally posted by: Chocolate Pi
Right, sorry for misunderstanding. That's one reason I don't post much! :p

That and the fact that you are always eating chocolate pie.

Pi are squared? No, idiot, pie are round!
 
Mar 19, 2003
18,289
2
71
Good post... I agree with you for the most part. I like that they're exploring the option of doing physics calculations on the GPU, but almost everything is already GPU-limited if you're not still playing at 1024x768 or lower resolutions... I'm not sure that I'd want to take an FPS hit for nicer physics. I can't possibly see how they could do this on GPUs without impacting graphics performance. It might be nice to have the option if you didn't have to spend any extra money, but I still think there should be a dedicated hardware solution...
 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
Ultimately, MS needs to add a physics API of some sort to a future version of DirectX. Game developers could then just code to that, and Havok or Ageia or Nvidia or whoever could write drivers to offload that code to the GPU or a specialized card.
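Something along these lines is what I'm picturing; purely hypothetical, none of these names exist anywhere, just the rough shape of a vendor-neutral interface that the game codes against while each vendor ships its own backend in the driver:

```cpp
// Purely hypothetical sketch of a vendor-neutral physics interface.
// Nothing like this exists in DirectX; all names here are made up.
#include <memory>
#include <vector>

struct RigidBody { float pos[3]; float vel[3]; float mass; };

// The API the game would code against.
class IPhysicsDevice {
public:
    virtual ~IPhysicsDevice() = default;
    virtual int  addBody(const RigidBody& b) = 0;
    virtual void step(float dt) = 0;               // advance the simulation
    virtual RigidBody readBack(int id) const = 0;  // results for gameplay code
};

// One possible backend: a plain CPU fallback. A GPU or PPU vendor would
// ship its own implementation behind the same interface in its driver.
class CpuPhysicsDevice : public IPhysicsDevice {
    std::vector<RigidBody> bodies_;
public:
    int addBody(const RigidBody& b) override {
        bodies_.push_back(b);
        return static_cast<int>(bodies_.size()) - 1;
    }
    void step(float dt) override {
        for (auto& b : bodies_) {                  // trivial gravity integrator
            b.vel[1] -= 9.81f * dt;
            for (int i = 0; i < 3; ++i) b.pos[i] += b.vel[i] * dt;
        }
    }
    RigidBody readBack(int id) const override { return bodies_[id]; }
};

// The game just asks the runtime for "a physics device" and never cares
// whether it lands on a CPU, a GPU, or a dedicated card.
std::unique_ptr<IPhysicsDevice> createPhysicsDevice() {
    return std::make_unique<CpuPhysicsDevice>();
}
```

That way the "which hardware wins" fight happens in the drivers, not in every game's source code.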
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,551
136
Originally posted by: the Chase
MMmmmm..CORNBREAD.....

I used to love the school cornbread when I was in elementary school. They made it pretty good. I could see the Hollywood sign every morning from my school building. Then I moved to the east coast and the school food tasted like cardboard.
 

lifeguard1999

Platinum Member
Jul 3, 2000
2,323
1
0
First of all, doing physics on a GPU has been around for a while. The name for this broad class of problems is GPGPU (http://www.gpgpu.org).
Secondly, it is all just research and development right now. Even AGEIA is nothing more than R&D. Until it is available on the market for use with an API to write against, it is all R&D.
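To give a feel for why this maps to a GPU at all: a per-particle update like the toy loop below is embarrassingly parallel, so the usual GPGPU trick is to store positions and velocities in floating-point textures and have a fragment shader run the loop body once per texel. The CPU version is shown only for illustration; nothing here is from a real engine.

```cpp
// Toy CPU version of the kind of per-particle update GPGPU work offloads to a GPU.
// On the GPU, positions/velocities live in floating-point textures and a
// fragment shader performs the body of this loop once per texel.
#include <vector>

struct Particle { float px, py, pz, vx, vy, vz; };

void integrate(std::vector<Particle>& ps, float dt) {
    for (auto& p : ps) {      // each iteration is independent -> trivially parallel
        p.vy -= 9.81f * dt;   // gravity
        p.px += p.vx * dt;
        p.py += p.vy * dt;
        p.pz += p.vz * dt;
        if (p.py < 0.0f) {    // bounce off the ground plane
            p.py = 0.0f;
            p.vy = -0.5f * p.vy;
        }
    }
}
```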
 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
Originally posted by: lifeguard1999
First of all, doing physics on a GPU has been around for a while. The name for this broad class of problems is GPGPU (http://www.gpgpu.org).
Secondly, it is all just research and development right now. Even AGEIA is nothing more than R&D. Until it is available on the market for use with an API to write against, it is all R&D.

You can already program with NovodeX, Ageia's API. It does have a software mode to fall back on, which apparently is now multithreaded. It's just the hardware that is vaporware atm.
 

Cooler

Diamond Member
Mar 31, 2005
3,835
0
0
The Inq just called BS on Nvidia SLI physics.
Splitting time on the card between physics and graphics will hurt both video and physics performance.
http://www.theinquirer.net/?article=30434
"So, the SLI physics engine can make water fall, ripples move, and rocks from the exploding cliff wall bounce off each other in a way that would be damn near impossible to do on a CPU. It can't however make those things interact with your player, the ripples may look like they are washing over your legs, and the rocks bounce off your shins, but it won't cause in game collisions.

The Ageia way of doing things is the 'real deal', it can make things bounce, cars crash, and have the game work with it. The gravity of the asteroid can pull your ship off course at a critical time just before the warp jump in ways that SLI physics, in V1.0 at least, can't.

So, the Nvidia approach is smoke and mirrors: pretty, shiny, and ineffectual. The kind of thing you want to do in demos, but not in games. Ageia claims to do it 'right' out of the box, and I think this will pay off in a much greater way for the average gamer. SLI physics is not by any means useless, but I would categorise this release a special effect, not a simulator. Perhaps V2.0 will fix all of this, most likely by the time of the Game Developers Conference 2007."
 

the Chase

Golden Member
Sep 22, 2005
1,403
0
0
Wow that was the most eloquent story the Inq has ever posted!! You could actually understand all of it! What happened??