AnandThenMan
Nvidia claims PhysX is not proprietary because it runs on multiple platforms. Given that, I would not take anything NV says about PhysX seriously.
PhysX isn't entirely limited to nVidia hardware -- with titles like Arma 3 and BioShock Infinite, it's much more than just the GPU component!
It isn't limited to NV hardware. It can be licensed by competitors. IMHO AMD is intentionally holding the industry back. PhysX would be much further along right now if AMD licensed PhysX from Nvidia, which they have every ability to do, and cost constraints have been shown to be non-existent.
What it boils down to:
Nvidia users get PhysX.
AMD users do not. Why is that?
IMHO Nvidia is intentionally holding the industry back. Why won't they submit PhysX to a standards body, or better yet, open-source it? I don't think anybody can complain about AMD until Nvidia makes PhysX available to everyone, no strings attached.
NV just needs to make their GPU-accelerated PhysX software work with the standards the industry is using, so any Intel, Nvidia, or AMD hardware can run it. I think the game developers adopting PhysX should be asking Nvidia to make that work.
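To make the "industry standards" point concrete: the obvious candidate is OpenCL (which comes up later in this thread), since it runs on Intel, Nvidia, and AMD hardware alike. Here's a minimal, purely illustrative sketch of what a vendor-neutral effects kernel looks like, assuming pyopencl is installed and a working OpenCL driver is present -- this is not PhysX code, just the shape of the standards-based alternative people are asking for:

[CODE]
import numpy as np
import pyopencl as cl

# Pick whatever OpenCL device is available (Intel, Nvidia, or AMD).
ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)

# Tiny "effects" kernel: gravity plus an Euler step for a particle cloud.
kernel_src = """
__kernel void integrate(__global float4 *pos,
                        __global float4 *vel,
                        const float dt)
{
    int i = get_global_id(0);
    vel[i].y -= 9.81f * dt;      // gravity
    pos[i]   += vel[i] * dt;     // Euler integration
}
"""
program = cl.Program(ctx, kernel_src).build()

n = 65536
pos = np.zeros((n, 4), dtype=np.float32)
vel = (np.random.rand(n, 4).astype(np.float32) - 0.5) * 5.0

mf = cl.mem_flags
pos_buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=pos)
vel_buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=vel)

# One 60 Hz frame worth of simulation on whatever device was picked above.
program.integrate(queue, (n,), None, pos_buf, vel_buf, np.float32(1.0 / 60.0))
cl.enqueue_copy(queue, pos, pos_buf)
[/CODE]

The same kernel runs unchanged on a Radeon, a GeForce, or an Intel iGPU, which is exactly the point about a standards-based path.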
It started with Ageia trying to sell those PCI cards.
The CUDA-optimized effects NVIDIA uses are a complete disaster in terms of performance when running on the CPU.
As for CPU execution of PhysX effects, the CPU can handle particle debris with surprisingly decent performance (40-50 fps).
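For reference, the kind of per-frame work "debris particles on the CPU" implies is roughly the following -- a toy sketch assuming plain Euler integration, a flat ground plane, and numpy; it's an illustration of the workload, not the actual PhysX CPU path:

[CODE]
import numpy as np

N  = 50_000          # debris particle count (assumed for illustration)
dt = 1.0 / 60.0      # one 60 Hz frame

pos = np.random.rand(N, 3).astype(np.float32) * 10.0
vel = (np.random.rand(N, 3).astype(np.float32) - 0.5) * 5.0
gravity = np.array([0.0, -9.81, 0.0], dtype=np.float32)

def step(pos, vel):
    vel += gravity * dt        # gravity
    pos += vel * dt            # Euler integration
    below = pos[:, 1] < 0.0    # crude ground collision
    pos[below, 1] = 0.0
    vel[below, 1] *= -0.3      # damped bounce
    return pos, vel

pos, vel = step(pos, vel)
[/CODE]

Vectorized like this (or with SSE and a few threads in native code), a modern quad core can push tens of thousands of simple debris particles per frame, which is roughly consistent with the 40-50 fps figure above.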
Not exactly the behavior of a company that wants to differentiate itself from the competition all that much more. Are you in the practice of creating technology so your competitors can benefit without paying royalties or licensing fees? Let me know, I'll be right over with a shopping cart.
I was very glad to see strong CPU performance, so many more gamers could potentially enjoy the debris particles.
Apple seems to be doing pretty well at it.
Is this for Call of Duty's cutting-edge, next-gen, must-have feature: Fish AI?
http://www.youtube.com/watch?v=yRriF6Pu1kk&feature=c4-overview&list=UUaupSIOToYMVsMygUA8lvwQ
http://www.youtube.com/watch?v=KJIIgdSW6O4
BF4 is going to dump all over this game. I think IW is resorting to bringing Nvidia in to try to stay relevant tech-wise against the Battlefield 4 juggernaut.
http://www.youtube.com/watch?v=3_xaIv7Wo1A
It's a little sad.
Indeed! At 299 dollars for this luxury -- and now added value to GeForce! How dare nVidia do this!
I don't see the humor or understand the point! Borderlands 2 -- DirectX 9, playable with a 5770 -- was harped on by nVidia, and this title had a very welcome dynamic addition with PhysX!
You can pay $1K for a Titan, but if you commit the sin of using something else for rendering in your game, Nvidia wants your dedicated PhysX card to act more like a... brick.
Yes, the Ageia card was expensive and suffered from a lack of games (still a big problem for Nvidia).
opening GPGPU PhysX more would help to improve this, but as I said... I'm not seeing GPU physics as something that amazing for the next few years...
as for the rest of your post:
Borderlands 2 running the GPU-optimized effects on the CPU was a disaster... and with heavy action (mainly in MP), even high-end NVidia cards had performance issues with PhysX at higher settings.
Apple seems to be doing pretty well at it.
Which one? Differentiating themselves or creating technology for others to benefit? Because AFAIK, Apple doesn't actually "create" anything.
PhysX in BL2 is currently screwed up because the game engine has issues with handling multiple threads, and not because of PhysX specifically... or so NVidia says.
And I'm likely to believe them, because I get stuttering and lag in certain areas in single player even with my dedicated PhysX card (a GTX 650 Ti), and my rig is powered by an overclocked 3930K.
Disabling HT and running on four cores supposedly offers the best performance.
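If anyone wants to try the "four cores, no HT" workaround without rebooting into the BIOS, here's a rough sketch using psutil to pin the game to every other logical core -- the executable name is a guess (check yours in Task Manager), and logical-core numbering varies by system:

[CODE]
import psutil

TARGET = "Borderlands2.exe"   # assumed process name; verify on your system
CORES  = [0, 2, 4, 6]         # every other logical core ~ "HT off, 4 cores"

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == TARGET:
        proc.cpu_affinity(CORES)            # restrict the game to these cores
        print("Pinned PID %d to cores %s" % (proc.pid, CORES))
[/CODE]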
Here's a partial list:
- WebKit
- OpenCL
- Darwin
- Bonjour
All available without licensing fees or royalties, and used by Apple's competitors. Why won't Nvidia do the same, I wonder?
it would render all seventy billion of your "examples" moot.
