
Nvidia confirms PhysX hardware dies

I wonder how this will affect benchmarking. It seems like the drivers would probably cap frame rates and use the freed resources to improve physics detail. So faster cards may have similar frame rates but better physics calculations - so how will that be measured? We could have adaptive image quality now (extra GPU cycles used when available to improve IQ), but since that would be difficult to express and since higher frame rates sell more costly graphics cards, it's not provided. If the GPU physics component doesn't produce a bragging number, it will never have serious support.
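Something like this toy frame loop (a minimal C++ sketch; render_frame() and physics_substep() are made-up stand-ins, not any real driver or PhysX API) shows why a capped-fps scheme would be invisible to a frame-rate benchmark:

```cpp
#include <chrono>

void render_frame() {}       // stand-in for the GPU's normal rendering work
void physics_substep() {}    // stand-in for one extra slice of physics detail

// Cap the frame rate, then burn the leftover frame time on physics.
void run_frame(double target_ms = 16.7) {    // ~60 fps cap
    using namespace std::chrono;
    auto start = steady_clock::now();
    render_frame();
    // A faster card finishes rendering sooner, so it fits in more
    // substeps -- the physics gets richer but the fps stays pinned
    // at the cap, which is exactly what a benchmark can't see.
    while (duration<double, std::milli>(steady_clock::now() - start).count()
           < target_ms)
        physics_substep();
}

int main() { run_frame(); }
```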
 
Originally posted by: Nanobaud
I wonder how this will affect benchmarking. It seems like the drivers would probably cap frame rates and use the freed resources to improve physics detail. So faster cards may have similar frame rates but better physics calculations - so how will that be measured? We could have adaptive image quality now (extra GPU cycles used when available to improve IQ), but since that would be difficult to express and since higher frame rates sell more costly graphics cards, it's not provided. If the GPU physics component doesn't produce a bragging number, it will never have serious support.

Doesn't ATI do adaptive IQ with crossfire for older games? Or am I thinking of something else...
 
So the GPU will offload its spare cycles to calculate physics... so what happens when the GPU has few or no spare cycles... no physics???
 
Originally posted by: BTRY B 529th FA BN
I love my PhysX card - for certain systems it helps. I've seen improvements, in no particular order, in COD4, UT3, DODS, COD2, & Red Orchestra. I know what you are gonna say, '...those games weren't programmed for it...' ... in my case, it helps, period.

De Nile isn't just the name of a river in Africa, you know.
 
Originally posted by: Drayvn
So the GPU will offload its spare cycles to calculate physics... so what happens when the GPU has few or no spare cycles... no physics???


Currently, CUDA (which is what Nvidia is going to use to implement PhysX) does not always work reliably when you run it on the primary display adapter. This is supposed to change soon. I'd imagine that you would then just lose some GPU performance by offloading PhysX processing to the same card.

The whole thing seems like it would be better suited to the WDDM driver model in Vista, which already supports GPU time slicing.
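For what it's worth, the workaround today looks something like this (a minimal C++ sketch against the CUDA runtime API; only an illustration of picking a non-display card, not how Nvidia will actually wire up PhysX). The kernelExecTimeoutEnabled property is set when the OS watchdog can kill long-running kernels, which usually means that card is driving a display:

```cpp
#include <cuda_runtime.h>
#include <cstdio>

// Prefer a CUDA device that is NOT driving a display for long-running
// physics work. A device with the kernel-execution watchdog enabled is
// typically the primary display adapter -- the unreliable case above.
int pickPhysicsDevice() {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0)
        return -1;                        // no CUDA-capable device at all
    for (int d = 0; d < count; ++d) {
        cudaDeviceProp prop;
        if (cudaGetDeviceProperties(&prop, d) != cudaSuccess)
            continue;
        if (!prop.kernelExecTimeoutEnabled)
            return d;                     // no watchdog: likely a headless card
    }
    return 0;                             // every card drives a display; share the primary
}

int main() {
    int dev = pickPhysicsDevice();
    if (dev < 0) { std::puts("no CUDA device found"); return 1; }
    cudaSetDevice(dev);                   // route physics kernels to this card
    std::printf("running physics on device %d\n", dev);
    return 0;
}
```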
 
Originally posted by: aka1nas
Originally posted by: bfdd
Originally posted by: Schadenfroh
Maybe nvidia will buy Bigfoot Networks next?

That'd be pretty cool actually to have an onboard NIC like that ;P

Also, I think Intel buying Havok will do more for games than GPUs having PhysX code onboard.

Intel buying Havok was mostly to kill off its GPU-based physics products. I somehow don't think that Ageia was Nvidia's first choice for acquisition. 😛

That's what I was trying to get at.
 
Next from Nvidia: Octo-hybrid-full-hd-av-SLI for 4-dimensional mind-bending physics, full HD sound via HDMI and quad-pumped graphics processing with support for 16 displays via DisplayPort. Only requirements are a 3 kW PSU, 8 PCIe slots and 16 PCIe connectors!!!
 
Originally posted by: Chosonman
Originally posted by: BTRY B 529th FA BN
I love my PhysX card - for certain systems it helps. I've seen improvements, in no particular order, in COD4, UT3, DODS, COD2, & Red Orchestra. I know what you are gonna say, '...those games weren't programmed for it...' ... in my case, it helps, period.

De Nile isn't just the name of a river in Africa, you know.

Yuk Yuk Yuk :laugh:

 
Originally posted by: Drayvn
So the GPU will offload its spare cycles to calculate physics... so what happens when the GPU has few or no spare cycles... no physics???

What happens to *everything* when it runs out of cycles ... anyway

Chug .. a . . . chug .. a..chug .. chhChug-chug-chug!
😕
 
So the GPU will offload its spare cycles to calculate physics... so what happens when the GPU has few or no spare cycles... no physics???
nVidia will tell you to buy another GPU. 😉
 
another *identical* GPU 😛

we still can't mix and match... to get by with "last year's" card in the extra slot? 🙁
---the performance leader is kinda expensive - and a little "behind" in some "features"... maybe by GT200?

that will be a real reason to upgrade... or perhaps they've got it now
 