
Nvidia Quantum effects physics processing

Sylvanas

Diamond Member
Okay, so on closer inspection of the box my 8800GTX came in, it touts 'Nvidia Quantum Effects physics processing technology'. From that I assume my card can act as a dedicated PPU, right? What is the situation with this? Nowhere have I seen any mention of using an 8800 as a PPU, so perhaps it is a matter of driver implementation, in which case I wonder when we will see it, if at all. With the release of UT3 soon, which Epic has said will benefit from a PPU (namely Ageia cards, but why not our 8800s?), perhaps Nvidia is waiting until the game's release to implement physics processing in its 8-series cards? There are a few motherboards with 3 PCI-E 16x slots, so the ability *must* be there....

Perhaps someone more clued in than I can give some insight into 8 series Physics processing?
 
yep Nvidia's site still says:

NVIDIA® Quantum Effects™ Technology:
Advanced shader processors architected for physics computation enable a new level of physics effects to be simulated and rendered on the GPU, all the while freeing the CPU to run the game engine and AI.

But it makes me wonder, as I don't remember any reviews ever testing this functionality. I believe GRAW 1 and 2 use Ageia, so it isn't like there are no games using PPUs. But more than likely those games would have to be coded for Nvidia GPU/PPU support; as far as I know, no one has made a standard physics API. There was a rumor MS was going to do it with DX10, but they denied it, last I heard.

Personally, if I were a game developer today, I would start talking with the OpenGL community or MS and create a standard. Today's games already cost millions to make; let's not make them cost more because of a lack of standards. Without a standard, developers would need to write different physics engine calls using separate APIs for Nvidia, ATi, Ageia, and Havok. But I also remember hearing that either Nvidia's or ATi's GPU/PPU can use the Havok API, and Havok seems to be the most used today.
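To illustrate the point about a standard: the usual way to avoid writing separate physics calls per vendor is to code against one common interface and let each vendor supply a backend. A rough sketch in Python (all names here are hypothetical, not any real API):

```python
from abc import ABC, abstractmethod

class PhysicsBackend(ABC):
    """Hypothetical common interface a standard physics API might define."""
    @abstractmethod
    def step(self, positions, velocities, dt):
        """Advance the simulation one timestep; return new positions."""

class CpuBackend(PhysicsBackend):
    """Reference CPU implementation; a GPU/PPU vendor would ship its own."""
    def step(self, positions, velocities, dt):
        # Simple Euler update: x' = x + v * dt
        return [p + v * dt for p, v in zip(positions, velocities)]

# The game engine codes against PhysicsBackend once; the same calls
# would then run on whichever backend (Nvidia, ATi, Ageia, Havok) is present.
backend = CpuBackend()
print(backend.step([0.0, 1.0], [2.0, -1.0], 0.5))  # [1.0, 0.5]
```

This is exactly the role a DX10-style standard (or an OpenGL extension) would have played: one set of calls, many hardware targets.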
 
The ability is there because it runs physics simulations as Shader Model 3/4 programs, so yes it's a driver implementation matter. For the time being Nvidia hasn't put it in their drivers, and no one has licensed the Havok FX SDK to use it.
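For a concrete picture of what "physics as shader programs" means: a rough sketch (plain Python, purely illustrative, not anything from Nvidia's drivers) of the data-parallel per-particle update a GPU shader would run. On the GPU, each particle maps to one shader thread executing the same kernel:

```python
def integrate_particle(pos, vel, gravity, dt):
    """Euler integration for one particle; on a GPU this body would be
    the shader kernel, executed once per particle in parallel."""
    vel = vel + gravity * dt   # accelerate
    pos = pos + vel * dt       # move
    return pos, vel

# Toy 1-D particle system stepped on the CPU for illustration.
positions = [10.0, 20.0, 30.0]
velocities = [0.0, 0.0, 0.0]
dt, gravity = 0.1, -9.8
results = [integrate_particle(p, v, gravity, dt)
           for p, v in zip(positions, velocities)]
```

Because every particle runs the same arithmetic independently, the work maps cleanly onto Shader Model 3/4 hardware, which is why exposing it is "just" a driver and SDK matter.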
 
Originally posted by: Zenoth
In other words, that's nVidia's TruForm.

Wasn't TruForm ATI's method of smoothing circular objects? Sort of like turning an "octagon" into a true circle? What does this have to do with physics processing?
 
"Quantum"...what a load 😛

What are they going to call it next year when they have the ability to truly do quantum physics? They better start saving up their names...
 
Wow, some of you couldn't catch the meaning of my sentence? Well, yeah, not a lot of people know what TruForm was and that it failed. But that was indeed my meaning; it's just a matter of reading between the lines. Saying that our current and even next-gen cards will process physics regularly, to me, is like saying "we have this great feature that has the potential to change the future of PC gaming with TruForm", and then the next thing you know it won't happen, due either to developers lacking the skills to implement the changes properly, a lack of manpower/resources/money, and/or mere laziness.

And I'm not sure that Intel and AMD would appreciate it if their oh-so-powerful next-gen CPUs' efficiency at processing physics, especially on multiple cores, got stolen by GPUs. Let GPUs do what they're best at: processing graphics.
 
The funny thing about TruForm is that ATI is trying again with the HD2000-series cards. :Q
 