Originally posted by: SSChevy2001
25 games before the end of the year is BS. List them like I did in the previous post. You can't, because there aren't 25 games that will use GPU PhysX. We're talking games that are developed with GPU PhysX in mind.

Where's your list? Of course I don't have a list of all 25 games as I'm only quoting the article, but I'll take the pic of Fud standing next to NV's VP of Content Relations at Nvision as more definitive proof than your claim of BS.....
Originally posted by: SSChevy2001
It does matter if they don't add anything new, not even added FPS. How many more FPS do you get in a regular UT3 map by using PhysX? Or Gears of War, or Mass Effect, or......? Why list games for the hell of it, just because they had software PhysX and offer GPU acceleration that isn't needed?

When comparing CPU performance with hardware PhysX options enabled, the difference in FPS is plainly obvious.....something like 40 FPS compared to <10, wasn't it? You're not going to see a gain in FPS by enabling hardware PhysX compared to no PhysX at all, but the point is you're getting more visuals and better physics in exchange for lower FPS. That's no different than accepting lower FPS by increasing detail or AA levels in-game.
Originally posted by: SSChevy2001
Having to compare screenshot vs screenshot means it's not that obvious, and it doesn't warrant a 20% FPS drop. Also, DX9 was faster in every other mode, while the workload should be the same.

It's actually very obvious, especially when you know what to look for and are actually looking at the game in full res and not a tiny screenshot. Crysis isn't a great representation because it looks great even at Medium settings, so the differences at higher settings aren't as pronounced as in other games. DX9 actually ran slower on my 8800GTX up until the 174-series drivers, but even now I'm going to run DX10 over DX9 despite the slight drop in FPS. Also, the workload is not the same; it's plainly obvious just from running the dev console that DX9 uses much more RAM than DX10, probably because it uses more static textures.
Originally posted by: SSChevy2001
Personally I feel that Nvidia offering multi-GPU PhysX on Intel chipsets is insulting, while it's great for SLI MB owners. Not that it matters, because right now a single GTX280 is mostly CPU limited with PhysX.

Not sure why you'd feel insulted, or why it's better for SLI MB owners. Nvidia is adding functionality and value where there was none before.
Originally posted by: SSChevy2001
The point is CUDA still does a lot of work on the CPU. Currently Badaboom eats about 30% of my quad core during an encode, which clearly shows CUDA still requires a good amount of CPU usage. PhysX will also eat more CPU because of CUDA, and it clearly shows in-game. The point is that the PPU handled PhysX with less of a CPU hit.

Yes, that's obvious; the CPU is still going to be important. But here's what it comes down to:
1) You get more effects and visuals with hardware PhysX, even if it comes with a performance hit compared to no PhysX or software-only PhysX.
2) GPU-accelerated hardware PhysX performance absolutely destroys CPU-accelerated hardware PhysX. Simply put, a CPU is not adequate for accelerating hardware PhysX effects.
Using your Badaboom example, you'd be looking at 100% quad core usage and 10x longer encode times if you tried to do the same workload on the CPU only. And of course both of these will be slower than not doing any encoding at all......
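Just to illustrate why that is (my own toy sketch, nothing to do with Badaboom's or the PhysX SDK's actual code): even a "pure GPU" CUDA job keeps the CPU busy allocating buffers, shuffling data over the bus, launching kernels and waiting on results. Roughly:

// Hypothetical sketch: the heavy math runs on the GPU, but the CPU still
// prepares the data, issues the driver calls, and waits for the results.
#include <cuda_runtime.h>
#include <stdlib.h>

__global__ void scale(float *data, float factor, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        data[i] *= factor;                      // GPU-side work
}

int main(void)
{
    const int n = 1 << 20;
    size_t bytes = n * sizeof(float);

    float *host = (float *)malloc(bytes);       // CPU: prepare the input
    for (int i = 0; i < n; ++i)
        host[i] = (float)i;

    float *dev;
    cudaMalloc((void **)&dev, bytes);                       // CPU: driver call
    cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);   // CPU: feed the GPU

    scale<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);          // CPU: launch the kernel
    cudaDeviceSynchronize();                                // CPU: wait on the GPU

    cudaMemcpy(host, dev, bytes, cudaMemcpyDeviceToHost);   // CPU: read results back

    cudaFree(dev);
    free(host);
    return 0;
}

All the actual math runs in the kernel, but every one of those alloc/copy/launch calls is CPU-side driver work, and a real transcoder also has to read and parse the source file on the CPU to keep the GPU fed. That's why you still see some CPU load with GPU encoding, just nowhere near what a CPU-only encode would cost.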
Originally posted by: SSChevy2001
I guess it's just that the GPU being used for something more than games is more attractive to me right now than some extra debris and smoke.

I'm guessing you haven't even bothered to load up any of the demos, or don't own either of the two games that use hardware PhysX. Even that extra debris and smoke is more impressive and immersive than any recent development in the GPU industry.