PhysX previews

MarcVenice

Moderator Emeritus
Apr 2, 2007
No minimum FPS figures, shame on all of them. Only Elite Bastards got it right, to some extent.

"There's no doubt that, at the end of the day, accelerating PhysX effects to any real degree via the GPU is far, far more effective than using the CPU, and indeed far better that AGEIA's own Physics Processing Unit which often got a lot of flak for its own lacklustre performance. Of course, that isn't to say that the situation is perfect, and even the high-end parts we used for our testing struggled at times, in Unreal Tournament 3 in particular. Indeed, in scenarios where a lot of physics effects and debris are being thrown around, you could argue a kind of chicken and egg effect - Physics processing on the GPU takes resources away from 3D rendering, but those physics effects create debris and the like and thus a lot more objects on screen, which requires more 3D rendering horsepower, which is being taken up by physics processing, and so on."

I foresee that during PhysX-intensive moments you will get horrible minimum framerates when using a single video card for both 3D acceleration AND PhysX. Look at keysplayer's benchmark, for example: using a 9800GTX+, his minimum framerates with edge detect and 8xAF @ 1280x1024 still suck when using PhysX. I bet those minimum framerates occur during the PhysX-intensive moments.
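To make that concrete, here's a quick back-of-envelope sketch. Every number is invented purely for illustration (nothing here is measured from the previews); it just shows why the average FPS can look fine while burst frames tank:

```cpp
#include <cstdio>

int main() {
    // Assumed per-frame costs in milliseconds (invented for illustration).
    const double render_ms        = 10.0; // baseline scene rendering
    const double physics_idle_ms  =  1.0; // physics load in a quiet moment
    const double physics_burst_ms = 15.0; // physics load when debris flies
    const double debris_render_ms =  8.0; // extra rendering for that debris

    // Quiet frames dominate the average...
    printf("quiet frame: %.0f fps\n",
           1000.0 / (render_ms + physics_idle_ms));                     // ~91 fps
    // ...but a burst stacks physics AND the extra debris rendering
    // on the same GPU, which is exactly where the minimums come from.
    printf("burst frame: %.0f fps\n",
           1000.0 / (render_ms + physics_burst_ms + debris_render_ms)); // ~30 fps
    return 0;
}
```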

Elite Bastards also noted that if Radeon cards can't run them, it's a big gamble to make games that actually REQUIRE special hardware (Nvidia GPUs, in this case) to run properly, because you lose all the AMD card owners who can't play those games.

Things I did like: a new form of SLI where you can run your old Nvidia card alongside your new Nvidia card and have the old card do the PhysX part. You can see in one site's benchmarks that the 8800GTS 640MB still got some pretty good framerates, and it might actually be sufficient for most PhysX games. No SLI board is needed, BIG plus; the fact that you do need a second monitor in Vista kind of sucks, though... And of course it eats power, so you will need an SLI-capable PSU. BIG QUESTION IS: can you run an Nvidia video card alongside an AMD video card, and have the AMD card do the 3D acceleration and the Nvidia card do the PhysX!?
 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
Originally posted by: MarcVenice

Elite Bastards also noted that if Radeon cards can't run them, it's a big gamble to make games that actually REQUIRE special hardware (Nvidia GPUs, in this case) to run properly, because you lose all the AMD card owners who can't play those games.

Things I did like: a new form of SLI where you can run your old Nvidia card alongside your new Nvidia card and have the old card do the PhysX part. You can see in one site's benchmarks that the 8800GTS 640MB still got some pretty good framerates, and it might actually be sufficient for most PhysX games. No SLI board is needed, BIG plus; the fact that you do need a second monitor in Vista kind of sucks, though... And of course it eats power, so you will need an SLI-capable PSU. BIG QUESTION IS: can you run an Nvidia video card alongside an AMD video card, and have the AMD card do the 3D acceleration and the Nvidia card do the PhysX!?

Not in Vista.
 

Wreckage

Banned
Jul 1, 2005
I think the downloadable PhysX software pack is a nice bonus. It comes with a full game. Now I just need to find a cheap copy of UT3.
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
Not in Vista? Explain? I think you meant not in XP? Most 'hardcore' gamers are on 64-bit Vista nowadays, though. In fact, 64-bit Vista sales have shot up big time compared to 32-bit Vista. Nvidia HAS to make it work.
 

aka1nas

Diamond Member
Aug 30, 2001
Originally posted by: MarcVenice
Not in Vista? Explain? I think you meant not in XP? Most 'hardcore' gamers are on 64-bit Vista nowadays, though. In fact, 64-bit Vista sales have shot up big time compared to 32-bit Vista. Nvidia HAS to make it work.

CUDA still requires Forceware to be installed (even on Tesla cards that do not have display outputs), and you can't run multiple WDDM drivers on Vista.
 

Raider1284

Senior member
Aug 17, 2006
Originally posted by: deerhunter716
Software PhysX is worthless, this shows, as it takes so MUCH FPS away, lol. What a waste.

wtf are you talking about?! The framerates almost always tripled when PhysX was enabled. How is getting three times the framerate a waste?! Did you even read the articles?

How about this real-world test: enabling PhysX gives over a 400% INCREASE in fps: http://www.elitebastards.com/h...charts/ut3-tornado.png
 

OCGuy

Lifer
Jul 12, 2000
Originally posted by: deerhunter716
Software PhysX is worthless, this shows, as it takes so MUCH FPS away, lol. What a waste.


LOL...take another look at those articles.

I think that's just a canned response, maybe?
 

SSChevy2001

Senior member
Jul 9, 2008
The only good reviews are Driverheaven & Techreport.

FiringSquad's UT3 Heat Ray PhysX score looks wrong. My 8800GTS gets about 40 FPS @ 1080p, maxed settings, with 31 bots.

Elite Bastards' UT3 scores also look wrong. I get about 28 fps @ 1080p with 4xAA enabled. I doubt the difference in pixels is going to drop the FPS much.

Downloading Warmonger now to see how that plays, but still, PhysX adds some nice effects. The question is: when more demanding games come, will these cards be able to handle both PhysX and GPU rendering? The FPS now aren't that high, even with a GTX 280.
 

aka1nas

Diamond Member
Aug 30, 2001
Originally posted by: deerhunter716
Software PhysX is worthless, this shows, as it takes so MUCH FPS away, lol. What a waste.

:roll:

I don't think you actually understand what the graphs show. All of the games reviewed in the articles linked above ALWAYS use software PhysX (including UT3). The "PhysX" maps just have more effects in them than the standard ones.
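For anyone curious what that split looks like on the game side, here's a rough sketch against the PhysX 2.x SDK as I understand it; treat the fallback pattern and the exact flags as my assumption, not actual UT3 code. The point is that hardware and software simulation sit behind the same API, so "uses PhysX" says nothing about where it runs:

```cpp
#include <NxPhysics.h> // PhysX 2.x SDK header

// Sketch: ask for an accelerated (hardware) scene, fall back to software.
// The game-facing API is the same either way; the "PhysX" maps differ in
// effect count, not in the API they call.
NxScene* createPhysicsScene(NxPhysicsSDK* sdk)
{
    NxSceneDesc desc;
    desc.gravity = NxVec3(0.0f, -9.81f, 0.0f);
    desc.simType = NX_SIMULATION_HW;      // try PPU/GPU acceleration first
    NxScene* scene = sdk->createScene(desc);
    if (!scene)
    {
        desc.simType = NX_SIMULATION_SW;  // no accelerator: simulate on the CPU
        scene = sdk->createScene(desc);
    }
    return scene;
}
```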
 

BFG10K

Lifer
Aug 14, 2000
The problem with most (all?) of these reviews is that they stick to low resolutions and low AA levels, where the GPU has plenty of power to spare for physics, so of course things will get faster in those situations.

However for those of us that run high resolutions with high AA levels and are hence GPU bound, GPU physics are going to slow things down because the GPU will be loaded even more.
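A toy model makes the two regimes obvious. Every number below is my own assumption, picked purely for illustration; the model just says CPU and GPU work can overlap (so the frame costs roughly the slower of the two), while physics moved onto the rendering GPU adds straight into its frame time:

```cpp
#include <algorithm>
#include <cstdio>

int main() {
    // Assumed per-frame costs, in milliseconds (illustrative only).
    const double cpu_physics_ms = 25.0;          // software PhysX on the CPU
    const double gpu_physics_ms = 5.0;           // same work done on the GPU
    const double render_ms[2]   = { 6.0, 24.0 }; // low res vs high res + AA
    const char*  label[2]       = { "low res ", "high res" };

    for (int i = 0; i < 2; ++i) {
        // CPU physics overlaps GPU rendering: the slower of the two wins.
        double fps_cpu = 1000.0 / std::max(cpu_physics_ms, render_ms[i]);
        // GPU physics shares the GPU with rendering: the costs simply add.
        double fps_gpu = 1000.0 / (render_ms[i] + gpu_physics_ms);
        printf("%s: CPU PhysX %.0f fps, GPU PhysX %.0f fps\n",
               label[i], fps_cpu, fps_gpu);
    }
    // low res : CPU PhysX 40 fps, GPU PhysX 91 fps  (huge win)
    // high res: CPU PhysX 40 fps, GPU PhysX 34 fps  (GPU-bound: now a loss)
    return 0;
}
```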
 

aka1nas

Diamond Member
Aug 30, 2001
Originally posted by: deerhunter716
http://www.techreport.com/articles.x/15261/3

So a drop from 66 to 11 is not huge, huh? ROFL.

Sorry, but I need to see more of these - it just doesn't make a lot of sense when some benches show the exact opposite.

The "No PhysX" bar in the graph is actually a different map that comes standard with UT3. It still uses software PhysX, it just has substantially less effects going on. The other two bars are from the "PhysX" version of the map, which really just has a vastly greater number of effects going on(i.e. more things can break, more realistic debris, the Tornado in the one map). Comparing the first map to the second one really isn't an Apples-to-Apples comparison.
 

shangshang

Senior member
May 17, 2008
Originally posted by: BFG10K
The problem with most (all?) of these reviews is that they stick to low resolutions and low AA levels, where the GPU has plenty of power to spare for physics, so of course things will get faster in those situations.

However for those of us that run high resolutions with high AA levels and are hence GPU bound, GPU physics are going to slow things down because the GPU will be loaded even more.

They stick to lower resolutions and no AA or low AA because the variable they're testing is PhysX. They needed to isolate that variable to see how changing it alone can affect the outcome.

Now that we know that PhysX does contribute significantly to FPS, the goal now is to build more powerful GPUs that will allow PhysX to stretch its legs.

I say give PhysX another year and we will see it start to flex its wings as the next generation of GPUs hits the market. I don't understand why people would slight or belittle NV because PhysX hasn't delivered them a "miracle pill" in gaming.

It's rare that a technology acquisition can be integrated into a main product line so quickly, and even rarer that the acquired technology is then made BETTER in the final integrated product. Synergy is what all engineers and management aim for. I think if NV can deliver PhysX and live up to all the PhysX hype, then management and engineers get an A in my book.

Bring it on!
 

dadach

Senior member
Nov 27, 2005
So will a standalone PCI PhysX card still work, and will it be usable in the future?
 

Synomenon

Lifer
Dec 25, 2004
OK, so I have a 9800GX2 in my single PCIe x16 slot. I do have PCIe x1 slots. Can I use a PCIe x1 card to do the PhysX stuff?
 

bunnyfubbles

Lifer
Sep 3, 2001
Originally posted by: shangshang
Originally posted by: BFG10K
The problem with most (all?) of these reviews is that they stick to low resolutions and low AA levels, where the GPU has plenty of power to spare for physics, so of course things will get faster in those situations.

However for those of us that run high resolutions with high AA levels and are hence GPU bound, GPU physics are going to slow things down because the GPU will be loaded even more.

They stick to lower resolutions and no AA or low AA because the variable they're testing is PhysX. They needed to isolate that variable to see how changing it alone can affect the outcome.

Now that we know that PhysX does contribute significantly to FPS, the goal now is to build more powerful GPUs that will allow PhysX to stretch its legs.

I say give PhysX another year and we will see it start to flex its wings as the next generation of GPUs hits the market. I don't understand why people would slight or belittle NV because PhysX hasn't delivered them a "miracle pill" in gaming.

It's rare that a technology acquisition can be integrated into a main product line so quickly, and even rarer that the acquired technology is then made BETTER in the final integrated product. Synergy is what all engineers and management aim for. I think if NV can deliver PhysX and live up to all the PhysX hype, then management and engineers get an A in my book.

Bring it on!

No, the problem is that they're pimping PhysX as a major selling point when for most users it isn't going to be something practical. What we really need is the possibility of offloading physics processing to a secondary card, much like what AGEIA had in mind from the beginning and what ATI had in mind with Havok FX. If I can upgrade my video card to do most of its work processing video and keep my old video card to handle physics processing (a plan that dates back a few years; it just has yet to be put into practice), then I'll be interested. Until then, BFG is correct: games are going to keep increasing in complexity and processing demand, and I'd even wager that standard resolutions will increase with the advent of OLED. GPU technology certainly isn't going to outpace software development enough to have the luxury of do-it-all single GPUs... not when it's already been playing catch-up with titles like Crysis.

While it is impressive how much faster the GPU can be than software physics, it just isn't practical at the moment.
 

chizow

Diamond Member
Jun 26, 2001
The most mind-blowing tidbit I picked up from those articles was this from Tech Report:

Nvidia counts 70 million GeForce 8 and 9 users so far, which is probably quite a bit more than the installed base for PhysX cards.

That's an insane number of DX10-capable unified shader architecture parts out there. To put that into perspective, the PS3 and Xbox 360 installed bases are something like 12 and 18 million, respectively.

Otherwise, the previews did a great job showcasing how much PhysX could help improve gameplay and how GPU acceleration could drastically improve performance. The idea of being able to use older cards as PhysX GPUs is particularly intriguing, especially this comparison that showed the impact of various combinations, including mixed-and-matched pairs:

FiringSquad's PhysX SLI comparison
 

aka1nas

Diamond Member
Aug 30, 2001
Originally posted by: bunnyfubbles


No, the problem is that they're pimping PhysX as a major selling point when for most users it isn't going to be something practical. What we really need is the possibility of offloading physics processing to a secondary card, much like what AGEIA had in mind from the beginning and what ATI had in mind with Havok FX. If I can upgrade my video card to do most of its work processing video and keep my old video card to handle physics processing (a plan that dates back a few years; it just has yet to be put into practice), then I'll be interested.

You know that the new PhysX driver allows you to do this, right? You get to pick which display adapter will process PhysX. The Guru3D review even tested it with a GTX 260 doing both rendering and PhysX vs. adding a 9600GT dedicated to PhysX.
 

aka1nas

Diamond Member
Aug 30, 2001
Originally posted by: chizow
The most mind-blowing tidbit I picked up from those articles was this from Tech Report:

Nvidia counts 70 million GeForce 8 and 9 users so far, which is probably quite a bit more than the installed base for PhysX cards.

That's an insane number of DX10-capable unified shader architecture parts out there. To put that into perspective, the PS3 and Xbox 360 installed bases are something like 12 and 18 million, respectively.

Otherwise, the previews did a great job showcasing how much PhysX could help improve gameplay and how GPU acceleration could drastically improve performance. The idea of being able to use older cards as PhysX GPUs is particularly intriguing, especially this comparison that showed the impact of various combinations, including mixed-and-matched pairs:

FiringSquad's PhysX SLI comparison

I wonder how many out of that number are 8400s and 8500s in OEM boxes? Those only have 16 shaders, so I doubt they will handle PhysX that well.