PhysX F.A.Q. Compilation of questions I am finding throughout the forum.


boatillo

Senior member
Dec 14, 2004
368
0
0
One question here - how does the dedicated PhysX card behave when not in use? Does it turn completely off, saving power, until the 3D drivers wake it up? Or is it running, fan and all, constantly?

I have yet to put in an old 8800GT because the games I play now don't use PhysX, but if it won't generate extra heat/power then I will do so ASAP.
 

zagood

Diamond Member
Mar 28, 2005
4,102
0
71
Originally posted by: boatillo
One question here - how does the dedicated PhysX card behave when not in use? Does it turn completely off, saving power, until the 3D drivers wake it up? Or is it running, fan and all, constantly?

I have yet to put in an old 8800GT because the games I play now don't use PhysX, but if it won't generate extra heat/power then I will do so ASAP.

Interesting. I'd like to know this too. I'm sure it doesn't power down completely, but what is its "idle" draw as compared to its 2D idle specs?

A quick and dirty way to test, I guess, would be to plug a Kill-A-Watt into the wall and measure with the second card out and then in. If someone could do that and note any wattage difference, it would be appreciated.
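If anyone would rather grab a software-side number instead of (or alongside) the Kill-A-Watt, something like this rough sketch against NVIDIA's NVML library would list each card's reported power draw. Big caveats: it assumes a driver stack new enough to actually expose NVML, and plenty of consumer GeForce boards don't report power at all, so it's a best-effort reading, not a substitute for the wall meter.

[code]
// Minimal sketch (assumption: a driver/toolkit new enough to ship NVML).
// Enumerates GPUs and prints power draw where the board exposes a sensor.
#include <cstdio>
#include <nvml.h>

int main()
{
    if (nvmlInit() != NVML_SUCCESS) { std::printf("NVML not available\n"); return 1; }

    unsigned int count = 0;
    nvmlDeviceGetCount(&count);

    for (unsigned int i = 0; i < count; ++i)
    {
        nvmlDevice_t dev;
        char name[64];
        unsigned int milliwatts = 0;

        nvmlDeviceGetHandleByIndex(i, &dev);
        nvmlDeviceGetName(dev, name, sizeof(name));

        // Check the return code: power readings are not supported on every board.
        if (nvmlDeviceGetPowerUsage(dev, &milliwatts) == NVML_SUCCESS)
            std::printf("GPU %u (%s): %.1f W\n", i, name, milliwatts / 1000.0);
        else
            std::printf("GPU %u (%s): power reading not supported\n", i, name);
    }

    nvmlShutdown();
    return 0;
}
[/code]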
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
If the secondary card is set to run PhysX, and the game being played has no PhysX content, the second card should remain idle. Probably just like if you had an Ageia PPU in your system alongside a GPU. A good way to check this is to monitor the temperatures of both GPUs when firing up a non-PhysX game. You can change the settings for controlled results as to which card runs PhysX (whether it's a PhysX game or not).
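If you want actual numbers instead of eyeballing a temperature monitor, a rough sketch along these lines (assuming a driver stack new enough to expose NVIDIA's NVML library) would print each GPU's core temperature. Run it while a non-PhysX game is going and the card assigned to PhysX should stay near its idle temp.

[code]
// Minimal sketch of the "watch both GPU temperatures" idea, assuming NVML is available.
#include <cstdio>
#include <nvml.h>

int main()
{
    if (nvmlInit() != NVML_SUCCESS) { std::printf("NVML not available\n"); return 1; }

    unsigned int count = 0;
    nvmlDeviceGetCount(&count);

    for (unsigned int i = 0; i < count; ++i)
    {
        nvmlDevice_t dev;
        char name[64];
        unsigned int tempC = 0;

        nvmlDeviceGetHandleByIndex(i, &dev);
        nvmlDeviceGetName(dev, name, sizeof(name));
        nvmlDeviceGetTemperature(dev, NVML_TEMPERATURE_GPU, &tempC);

        // The card doing nothing but (unused) PhysX should sit near idle temperature.
        std::printf("GPU %u (%s): %u C\n", i, name, tempC);
    }

    nvmlShutdown();
    return 0;
}
[/code]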
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
If you use / have access to 3D Studio Max or Maya, there are plugins on the Nvidia site that will let you play with PhysX inside either application. I use them in Max and they are really easy to use. Make a plane and set it to cloth with gravity, make a sphere set to static, then click the button to simulate and it will show the cloth falling over the sphere in real time, conforming to the sphere and reacting to collisions with itself.

You can do these sorts of things normally in Max with CPU simulations, but this is in almost real time.

http://developer.nvidia.com/ob...physx_dcc_plugins.html


Before:
http://i41.tinypic.com/huhu3a.jpg

After:
http://i41.tinypic.com/cosr7.jpg
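For anyone curious what the plugin is driving under the hood, the standalone SDK follows the same build-a-scene-then-step-it pattern. The sketch below is only illustrative: it assumes the newer PhysX 3.x-style C++ API rather than the DCC plugin, and it uses rigid bodies (a box dropping onto a static sphere) instead of cloth, since cloth setup is a lot more involved.

[code]
// Rough sketch only: PhysX 3.x-style C++ API, rigid bodies instead of cloth.
// Pattern: create the SDK, build a scene, add actors, then step the simulation.
#include <cstdio>
#include <PxPhysicsAPI.h>

using namespace physx;

static PxDefaultAllocator     gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main()
{
    // SDK bring-up
    PxFoundation* foundation = PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics*    physics    = PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    // Scene with gravity and a small CPU dispatcher
    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(2);
    sceneDesc.filterShader  = PxDefaultSimulationFilterShader;
    PxScene* scene = physics->createScene(sceneDesc);

    PxMaterial* material = physics->createMaterial(0.5f, 0.5f, 0.1f); // static/dynamic friction, restitution

    // Static sphere for things to land on (the "sphere set to static" above)
    PxRigidStatic* sphere = PxCreateStatic(*physics, PxTransform(PxVec3(0.0f, 0.0f, 0.0f)),
                                           PxSphereGeometry(2.0f), *material);
    scene->addActor(*sphere);

    // Dynamic box dropped from above (stand-in for the simulated cloth)
    PxRigidDynamic* box = PxCreateDynamic(*physics, PxTransform(PxVec3(0.0f, 10.0f, 0.0f)),
                                          PxBoxGeometry(0.5f, 0.5f, 0.5f), *material, 1.0f);
    scene->addActor(*box);

    // Step ~5 seconds at 60 Hz and print the box height once per simulated second
    for (int i = 0; i < 300; ++i)
    {
        scene->simulate(1.0f / 60.0f);
        scene->fetchResults(true);
        if (i % 60 == 0)
            std::printf("t=%ds  box height = %.2f\n", i / 60, box->getGlobalPose().p.y);
    }

    scene->release();
    physics->release();
    foundation->release();
    return 0;
}
[/code]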
 

nobodyknows

Diamond Member
Sep 28, 2008
5,474
0
0
Originally posted by: nobodyknows
Originally posted by: keysplayr2003
Originally posted by: SunnyD
Originally posted by: keysplayr2003
The Ageia PPU is the fastest PPU out there. ;)

The Ageia PPU is the ONLY PPU out there.

I know Sunny.

So does it matter if the PPU card is PCI-E or PCI?

BFG PhysX PCI

Asus PhysX PCI-E x1

Can anybody answer this?

Will a PCI PhysX card work in a system that uses a PCI-E video card? I have a chance to buy one and thought I'd play around with PhysX on my HD4870 card.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: nobodyknows
Originally posted by: nobodyknows
Originally posted by: keysplayr2003
Originally posted by: SunnyD
Originally posted by: keysplayr2003
The Ageia PPU is the fastest PPU out there. ;)

The Ageia PPU is the ONLY PPU out there.

I know Sunny.

So does it matter if the PPU card is PCI-E or PCI?

BFG PhysX PCI

Asus PhysX PCI-E x1

Can anybody answer this?

Will a PCI PhysX card work in a system that uses a PCI-E video card? I have a chance to buy one and thought I'd play around with PhysX on my HD4870 card.

The PPUs are the same. They just use different interfaces to be useful in a wider range of systems. Some systems do not have a PCI-e x1 slot, so the PCI card can be used.
However, I am uncertain whether there will be any performance hit going with the lower-bandwidth PCI version over the PCI-e x1 version. I am unsure how much bandwidth a PPU needs. So, YMMV.
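For a rough sense of scale, the theoretical peak numbers are easy to work out: legacy PCI is a 32-bit bus at 33 MHz shared by everything on it, while PCIe 1.x moves 2.5 GT/s per lane with 8b/10b encoding, per direction. Back-of-the-envelope sketch below (assumed spec-sheet figures, not measurements of what a PPU actually needs):

[code]
// Theoretical peak bus bandwidth, from spec-sheet figures only.
#include <cstdio>

int main()
{
    // Conventional PCI: 32-bit bus at ~33.33 MHz, shared by every PCI device
    const double pci_MBps = 33.33e6 * 4.0 / 1.0e6;                  // ~133 MB/s total

    // PCIe 1.x: 2.5 GT/s per lane, 8b/10b encoding -> 250 MB/s per lane, per direction
    const double pcie1_lane_MBps = 2.5e9 * (8.0 / 10.0) / 8.0 / 1.0e6;

    std::printf("PCI (shared bus): %6.0f MB/s\n", pci_MBps);
    std::printf("PCIe 1.x x1     : %6.0f MB/s per direction\n", pcie1_lane_MBps);
    std::printf("PCIe 1.x x4     : %6.0f MB/s per direction\n", pcie1_lane_MBps * 4);
    std::printf("PCIe 1.x x16    : %6.0f MB/s per direction\n", pcie1_lane_MBps * 16);
    return 0;
}
[/code]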
 

neilganon

Junior Member
Jan 31, 2008
16
0
0
When I try using PhysX in Mirror's Edge, it says it's unsupported on my card (8800 GTS 512). I have the 181.22 drivers. I tried installing the PhysX System Software (which I thought was included with the drivers), but nothing works and I'm confused. I thought PhysX worked on all 8-series or higher cards... any input?
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Control Panel > Installed Programs (Vista) or Add/Remove Programs (XP). Uninstall any Nvidia display drivers and any installed PhysX drivers.

Install the latest drivers for your card and OS from Nvidia's site. Also, make sure you have all motherboard drivers installed and up to date.

Do you have integrated video on your mobo?
 

SunnyD

Belgian Waffler
Jan 2, 2001
32,675
146
106
www.neftastic.com
Hey keys... can you tell me how well PhysX "acceleration" works on a video card in a PCIe 1x slot for a dedicated PhysX GPU? Is there any performance advantage in terms of having a faster slot with PhysX?
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: SunnyD
Hey keys... can you tell me how well PhysX "acceleration" works on a video card in a PCIe 1x slot for a dedicated PhysX GPU? Is there any performance advantage in terms of having a faster slot with PhysX?

Hey Sunny..

Do you mean the small (1 inch) PCI-e x1 slots available on mobos? Or a PCI-e x16 size slot?
If it's the former, then you will be restricted to an Ageia PPU for PhysX acceleration. AFAIK, there are no GeForce GPUs that utilize the "shorty" PCI-e x1 slots.
If not, what motherboard has a full-size PCI-e slot at only x1 speed? An older mobo?

What is your mobo make and model? I need a little more info.
 

SunnyD

Belgian Waffler
Jan 2, 2001
32,675
146
106
www.neftastic.com
Originally posted by: keysplayr2003
Originally posted by: SunnyD
Hey keys... can you tell me how well PhysX "acceleration" works on a video card in a PCIe 1x slot for a dedicated PhysX GPU? Is there any performance advantage in terms of having a faster slot with PhysX?

Hey Sunny..

Do you mean the small (1 inch) PCI-e x1 slots available on mobos? Or a PCI-e x16 size slot?
If it's the former, then you will be restricted to an Ageia PPU for PhysX acceleration. AFAIK, there are no GeForce GPUs that utilize the "shorty" PCI-e x1 slots.
If not, what motherboard has a full-size PCI-e slot at only x1 speed? An older mobo?

What is your mobo make and model? I need a little more info.

I'll have to look it up. I know that some current mobos (particularly the new i7 motherboards) allow variable configurations on their PCIe 16x slots... like 16x/8x, 16x/4x/4x, or some silly combination.

I'm really just wondering if a "GPU" used for dedicated PhysX is impacted at all by bus bandwidth.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Well, an Ageia PPU seems to have enough juice with a PCI-e x1 slot, so I'd have to say no. But then again, a GPU is not a PPU, so my answer is just a guess. Most multi PCI-e motherboards these days come with at least one x16 and one x4 slot. And if all the secondary card is doing is running PhysX calculations, then I'd say it'd be a cinch, because x4 slots have only been known to show a little performance degradation when using CF or SLI with higher-end cards. x8 is the sweet spot for full-blown chaos.

Keep in mind that I have not done any extensive research on this aspect of dedicated PhysX cards. If you wish to try it out and let us know, that would be very cool, as I do not have the time right now to do any testing at all. Barely enough time to post. Very busy at work.
 

SunnyD

Belgian Waffler
Jan 2, 2001
32,675
146
106
www.neftastic.com
Originally posted by: keysplayr2003
Well, an Ageia PPU seems to have enough juice with a PCI-e x1 slot, so I'd have to say no. But then again, a GPU is not a PPU, so my answer is just a guess. Most multi PCI-e motherboards these days come with at least one x16 and one x4 slot. And if all the secondary card is doing is running PhysX calculations, then I'd say it'd be a cinch, because x4 slots have only been known to show a little performance degradation when using CF or SLI with higher-end cards. x8 is the sweet spot for full-blown chaos.

Keep in mind that I have not done any extensive research on this aspect of dedicated PhysX cards. If you wish to try it out and let us know, that would be very cool, as I do not have the time right now to do any testing at all. Barely enough time to post. Very busy at work.

Unfortunately I'm not privy to the kind of hardware most people here have. My system isn't qualified to do such a "review", nor is my wallet. Unless you're willing to put me in touch with some nvidia marketing folks... (there's a nvidia office right down the street from where I work... hmm...)
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Um, Ok. So you were just wondering about a bandwidth limitation just for knowledge and not an implementation? No prob. Since I don't know of any motherboard that has a full size PCI-e x1 slot, I don't think anyone has anything to worry about when it comes to bandwidth limitations for a dedicated PhysX GPU in slot 2.

 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Originally posted by: keysplayr2003
Um, Ok. So you were just wondering about a bandwidth limitation just for knowledge and not an implementation? No prob. Since I don't know of any motherboard that has a full size PCI-e x1 slot, I don't think anyone has anything to worry about when it comes to bandwidth limitations for a dedicated PhysX GPU in slot 2.

I think there were some pseudo-SLI motherboards early on that had two physical x16 slots, but with two cards plugged in, one stayed at x16 and the other ran at x1.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
In an interesting development, Nvidia has announced that PS3 and Wii developers now have access to PhysX.


NVIDIA licenses PhysX technology for Sony's PlayStation 3
by Darren Murph, posted Mar 17th 2009 at 3:22PM

After being on the tip of gamers' tongues last summer, NVIDIA's PhysX technology has cooled a bit in terms of sheer popularity. That said, we've no doubt that the buzz will be back in force after this one clears the airwaves. NVIDIA has just announced that it has nailed down a tools and middleware license agreement for Sony's PlayStation 3, effectively bringing the aforesaid physics tech to what's arguably the most potent game console on the market today. As a result of the deal, a PhysX software development kit (SDK) is now available to registered PS3 developers as a free download for use on the SCEI Developer Network. What exactly this means for future PS3 games remains to be seen, but one thing's for sure: it's only up from here.
http://www.engadget.com/2009/0...r-sonys-playstation-3/


NVIDIA's PhysX coming to Wii
Sean Ridgeley - Thursday, March 19th, 2009 | 10:54AM (PT)


Scalable physics, here Wii come

Following up on Tuesday's news of PS3 developers being licensed NVIDIA's PhysX technology, it was announced today that game creators working with the Wii will (officially) receive the same benefits if they wish, bringing us one step closer to a PhysX-standard game industry.

PhysX, for those who aren't aware, is a toolset for developers who wish to create more realistic physics in games for things like fog, glass shattering, human movement, etc.; it incorporates a fully featured API and physics engine allowing creators to make scalable physics in real time. A good idea of what improvements it offers can be seen with this Mirror's Edge trailer.

[/i]"Nintendo has reshaped the home entertainment and video game market with the success of the Wii console," said Tony Tamasi, senior vice president of content and technology at NVIDIA. "Adding a PhysX SDK for Wii is key to our cross-platform strategy and integral to the business model for our licensed game developers and publishers."

"With NVIDIA PhysX technology, developers can easily author more realistic game environments for the evolving demands of a broad class of Wii gamers."

Are the handhelds next?
http://www.neoseeker.com/news/...s-physx-coming-to-wii/
 

coinz

Senior member
Oct 1, 2004
482
0
0
I got my 8800GT working with my 4870 1GB in Win XP, and I wanted to find out if anyone else had the same settings. I have to have the 8800's desktop extended for it to run; it doesn't work by just switching to the main display, setting it to on, and then switching back. It works perfectly fine, but I just wanted to know if there was another way for it to be on. If I turn the extended desktop off, it turns off the PhysX acceleration.