For me, PhysX is kind of mandatory for any PC games nowadays.
Otherwise, games are just...textures painted on polygons. It really adds to the experience, and it's one big way that PC games go beyond being just higher-rez versions of console games.
I agree that having a physics engine is good. But I guess the issue I have is: are you saying you want a physics engine instead of no physics engine? Or are you saying that of all the physics engines that could be used, it needs to be PhysX specifically?
I just wonder, what if the game used, say, Intel's Havok physics engine instead of PhysX. Would you be just as pleased?
My impression is that some people are arguing the pros/cons of whether to have a physics engine at all, and the additional effects. Then other people are arguing whether a physics engine should be proprietary or not. I guess it seems like two separate issues, and the poll doesn't seem to be able to capture that, or maybe it just conflates both answers and muddies the results?
You are repeating nvidia's false dilemma fallacy... "either nvidia officially supports feature X, or they write special DRM code to prevent its use".
This is a really poor false dilemma for them to set up, because there are countless things they do NOT officially support that they haven't bothered DRMing against. By using this false dilemma, they imply that anything they haven't specifically DRMed against is fully supported by them, which is a very bad position.
By "increased adoption", do you mean in the context of a game adding it?
Also, NV sponsoring games != market acceptance. It's like saying gamers love AMD due to the number of GE titles increasing.
I just found it to be a double standard as I mentioned in the example.
I wouldn't have said anything but reading a few pages back he was telling others to prove subjective statements.
Realism and immersion are not the same, or else every movie would have been made in the same way that Skyfall was made (crashing tons and tons of cars for the takes). The trend is in the opposite direction, cheating realism to create a different kind of immersion.
Some scenes, like the famed Battlefield 3 "blow up the whole side of a hotel" animation, will probably always be more impressive (and easier to pull off) as a scripted scene than as a dynamic one, simply because the game relies on a very exact result of your actions to make the reaction of the accompanying NPCs more believable.
Be careful with that statement: The Witcher 3 may still use Havok after all and only use PhysX for effects. Havok isn't just a physics engine; they can also supply AI, a scripting engine, and other tools to complement your in-house engine development.
However, I think it's better from a strictly business perspective for NVidia to disallow hybrid PhysX setups, because it can be used as leverage to push people to abandon AMD and go all the way with NVidia, seeing as PhysX is the only physics middleware with GPU support.
All it is doing is driving people away from PhysX. You are describing the result of leveraging a monopoly, but you must have an actual monopoly before you can leverage it to drive out competition. Doing it too early merely sabotages your attempt to form a monopoly.
The ideal strategy would have been to allow it for now, and when PhysX is 90% of the market, find an excuse to disallow it and watch AMD crash and burn.
And what would you think would be the result of that happening?
Would there be much happiness and merrymaking? Nah. I'm leaning more towards lawsuits.
All it is doing is driving people away from physX.
Ok, it's ironic that people claiming effects were missing from PhysX games when only using PhysX get called out for proof, but claiming BL2's PhysX problems come from (insert random, likely false, reason), claiming PhysX is gaining market acceptance, etc., doesn't require proof.
I'm done with this, but if people want to make stuff up, at least don't use double standards.
For instance, take a look at the following screenshot. In that particular scene, our framerate was in the low 50s on both our dual-core and our quad-core systems (max details with high view distance and low PhysX). Our GPU usage was at 60%, suggesting that there was a CPU limitation. We did not witness any significant improvements when we lowered all of our settings (but kept view distance at High). When we lowered the view distance to medium, our framerate increased to the mid-50s, and when we lowered it to low, our framerate jumped to the 80s. This clearly shows that the view distance setting is the most stressful one.

It also shows that the game does not take advantage of four cores, as there weren't any significant differences between a dual-core and a quad-core. Moreover, it also proves that there is an optimization issue here that will affect most PC systems, unless of course they can overcome those issues with their additional raw power. Ironically, it seems that the view distance is more of a CPU than a GPU setting, suggesting that Gearbox has offloaded the view distance setting to the CPU.
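The diagnosis in that excerpt (GPU usage stuck around 60% while the framerate is capped usually points at a CPU bottleneck) is easy to sanity-check on your own rig. A minimal Python sketch; the function names are my own, and it assumes you feed it the text produced by `nvidia-smi --query-gpu=utilization.gpu --format=csv,noheader`:

```python
import re

def parse_gpu_util(nvidia_smi_csv: str) -> list[int]:
    """Parse nvidia-smi CSV lines like '60 %' into integer percentages,
    one entry per GPU."""
    utils = []
    for line in nvidia_smi_csv.strip().splitlines():
        m = re.match(r"\s*(\d+)\s*%", line)
        if m:
            utils.append(int(m.group(1)))
    return utils

def looks_cpu_bound(utils: list[int], threshold: int = 85) -> bool:
    """Heuristic: if no GPU gets anywhere near full utilization while the
    framerate is still low, the bottleneck is likely the CPU."""
    return bool(utils) and max(utils) < threshold

# Example with the ~60% utilization the article reported:
sample = "60 %\n"
print(looks_cpu_bound(parse_gpu_util(sample)))  # True -> likely CPU-limited
```

This is only a rough heuristic, of course; vsync, frame caps, and power limits can also hold GPU usage down.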
Is it? There is more content and nVidia has clear discrete leadership!
1. No, it does not have clear discrete leadership. And the vast majority of the titles it supposedly does have use CPU physics, not GPU physics.
2. More content? Did you peer into an alternate dimension where nvidia did not implement the DRM, or even openly supported PhysX on a secondary card, and observe fewer titles with GPU PhysX implemented in such dimensions than in our own dimension?
DSOgaming did an in-depth performance analysis on BL2.
Bolded select sentences. Those comments reflect my experiences with BL2. The game is poorly optimized for modern CPUs, and PhysX highlights that issue because the CPU is taxed even more by the extra draw calls.
On my 4.5ghz 3930K, one core is almost always at 100% utilization, and another bounces between 75 and 100%. The rest of them show very little activity, and I'm sure that's from the OS and not the game.
So PhysX isn't the problem. It's the game :hmm:
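That per-core imbalance (one core pegged, the rest near idle) is the classic signature of a mostly single-threaded engine. A rough sketch of how to compute per-core busy time from two `/proc/stat`-style samples; the Linux field order (user, nice, system, idle, iowait, irq, softirq) is assumed, and the snapshot numbers below are made up purely for illustration:

```python
def core_busy_pct(before: list[int], after: list[int]) -> float:
    """Busy percentage for one core between two /proc/stat samples.
    Fields: user nice system idle iowait irq softirq; idle time is
    counted as idle + iowait."""
    d = [a - b for a, b in zip(after, before)]
    total = sum(d)
    idle = d[3] + d[4]
    return 100.0 * (total - idle) / total if total else 0.0

# Hypothetical snapshots: core 0 saturated, core 1 mostly idle.
c0_before, c0_after = [100, 0, 50, 50, 0, 0, 0], [200, 0, 100, 50, 0, 0, 0]
c1_before, c1_after = [100, 0, 50, 50, 0, 0, 0], [110, 0, 55, 185, 0, 0, 0]
print(round(core_busy_pct(c0_before, c0_after)))  # 100 -> pegged core
print(round(core_busy_pct(c1_before, c1_after)))  # 10  -> near-idle core
```

Sampling like this while the game runs gives you the same picture Task Manager shows: one hot core and a lot of silicon doing nothing.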
1. If having 65% market share isn't leadership, I don't know what is.
1. show source that states 65% of video games are shipping with nVidia physX
2. CPU vs GPU physX, again, which you keep ignoring.
3. Still haven't clarified the alternate dimension observations
How is a source for your outlandish claim unimportant?

1. Unimportant, as you said they don't have the leadership. How many GE or TWIMTBP titles come out each year? 25-30 at max. Most of them have some exclusive technology by either vendor.

1. show source that states 65% of video games are shipping with nVidia physX

1. If having 65% market share isn't leadership I don't know what is.

CPU PhysX not CPU physics.

2. The CPU is not fast enough for PhysX, period. I find it funny that after so many years people still compare CPU vs GPU computing. There are certain tasks which will always run better on a GPU.
How is a source for your outlandish claim unimportant?
CPU PhysX not CPU physics.
PhysX, the nvidia brand name, has libraries that are FREE for any developer for performing non-intensive calculations EXCLUSIVELY on the CPU. Over 90% of the titles nvidia lists as using "PhysX" do NOT actually support GPU-accelerated PhysX at all; instead they exclusively run some simple, non-intensive PhysX on the CPU.
1. Which is outlandish, that they don't have 65% discrete share? I find it extremely funny. Google it.
Sure it's not multithreaded.
Talk about avoiding the direct issue: when you enable PhysX in 4-player battles it drops to slideshow FPS at points, yet with PhysX on low it runs fine.
The issue is PhysX, and whether the game is single- or multithreaded has nothing to do with it, since PhysX is running on the GPU.
This is precisely what I mean: you are trying to say you have some basis to claim that the PhysX problems in BL2 are the game engine's fault, but your proof doesn't demonstrate that at all.
I thought you were claiming nvidia PhysX has 65% of the market, i.e., that 65% of video GAMES ship with GPU PhysX. An outlandish claim.
Apparently you just posted the percentage of discrete video cards in existence that were made by nvidia (with some rounding; the first result on Google says it's 62%), and claimed this as the GPU PhysX adoption rate, even though it has absolutely nothing to do with it.
I did not notice that discrepancy, and for some reason, in the four posts we have each made since arguing the subject, you have not seen fit to clarify it.
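To make the conflation concrete: the share of discrete cards that happen to be NVIDIA and the share of games shipping GPU-accelerated PhysX are two independent numbers. A tiny sketch with purely illustrative figures (the title counts below are made up, not real data):

```python
# Two unrelated quantities (numbers are illustrative, not real data):
nvidia_discrete_share = 0.62   # share of discrete cards that are NVIDIA
gpu_physx_titles = 20          # hypothetical: games/year with GPU PhysX
total_pc_titles = 500          # hypothetical: PC games released that year

# The adoption rate depends only on the title counts, not the card share.
adoption_rate = gpu_physx_titles / total_pc_titles
print(f"discrete card share: {nvidia_discrete_share:.0%}")  # 62%
print(f"GPU PhysX adoption:  {adoption_rate:.0%}")          # 4%
```

Quoting the first number as if it were the second is exactly the mix-up being called out here.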