Why isn't PhysX in more games?


HeXen

Diamond Member
Dec 13, 2009
7,835
37
91
How come I have the option to use the CPU in games that use PhysX, if PhysX is GPU-only as some seem to be implying?
Frankly, I don't give a crap what type of physics engine is used, so long as it's used well.

The REAL question is... why don't we see more fully destructible environments in games???
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
The REAL question is... why don't we see more fully destructible environments in games???

Because something like that would need a GPU to do it, and Intel wants Havok on CPUs (they bought it when it was about to take off, and more or less left it to die).

And Nvidia only wants to do proprietary physics for that.
Developers don't want to do all that work for such a small % of users.

So basically it's Nvidia's and Intel's fault.


** If Bullet Physics takes off and gets adopted by developers, you might just see games like that in the future.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Nvidia Inspector. Control... funny. Nvidia lets you control much, much more than AMD probably ever will.

Regarding PhysX:
It's a closed system, so it doesn't pay off to program for it. I want GPU-accelerated physics as an open standard, and I want it now. Where is Bullet? AMD loudly touted that they would support it, and what happened? GPU PhysX has no competition, and that is just sad.

I own both 580 SLI and 6970 crossfire in 2 PCs, and I can tell you that there are many, many games where override settings are outright ignored with Nvidia drivers. I have to wonder if Nvidia gets inflated benchmark scores in reviews because of this. Games I personally tested recently and found this in were Dead Space 1, Dead Space 2, and Dead Island. Now AMD certainly has its faults, and I prefer the 580s overall, but the one good thing about AMD is that it ALWAYS obeys override settings in CCC. If you enable SSAA in CCC it will work; you can use it in Dead Space and the game looks -fantastic-. On Nvidia you can't even use SSAA -- and before you say it, SS transparency is a joke and is NOT the same thing as SSAA. Due to this, AMD has better image quality generally speaking, because override settings ALWAYS WORK. With Nvidia, it's a crapshoot: many games are forced to completely ignore "override" settings in 280+ drivers.

From what I've found, Nvidia drivers 280+ ignore "override" settings in roughly 65% of games, and of that 65%, you can fix maybe 10% in Nvidia Inspector. You can plainly see in Nvidia Inspector that a great number of games are forced to ignore override settings -- look for yourself. There is a setting flagged as "Treat override settings as use application preference", and even changing that in Nvidia Inspector does nothing.

On the flip side of the coin, Nvidia has a far better game profiling tool, and Nvidia has been quick to push out SLI profiles for new games recently, such as BF3 and Skyrim. I prefer playing BF3 on my 580 setup. It sure would be nice, though, if override settings worked without exception. And SSAA would be awfully nice to have... Dead Island has so many jaggies it makes my eyes bleed. Speaking of which, Dead Island looks better on the 6970 crossfire because you can override AA settings; with the 580s, override settings are ignored.
 
Feb 19, 2009
10,457
10
76
There were games with destructible terrain and objects long before GPU accelerated physics started.

The problem is devs are lazy or publishers don't want to waste extra $$; they will do the absolute minimum required for their game to be marketable. If NV wants PhysX in games, they have to support it themselves. Even then, few publishers would bother with extra features that are "fluff" but delay a game's release. Unless realistic physics is a core mechanic and affects gameplay, it's not necessary.

i.e. in BF3, Metro conquest, at the stairs choke point: I threw a grenade at a bunch of enemies, and just as I threw it, my noob teammate jumped right in front of me. The grenade hit his head and bounced backwards. I ran back, but the grenade bounced back further and exploded. We both died, and he punished me for a TK. You don't need fancy PhysX to have physics in games. :D
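The grenade anecdote is a good illustration of how cheap this kind of physics is on a CPU. A minimal sketch of a bouncing body, with made-up constants and no real engine API involved:

```python
# Minimal CPU-side "grenade bounce": semi-implicit Euler integration of a
# 1-D height under gravity, with a coefficient of restitution.
# All constants here are illustrative, not taken from any engine.
GRAVITY = -9.81      # m/s^2
RESTITUTION = 0.5    # fraction of speed kept after each bounce

def simulate_bounces(height, velocity, dt=0.01, steps=1000):
    """Step the body forward in time and count ground impacts."""
    bounces = 0
    for _ in range(steps):
        velocity += GRAVITY * dt   # integrate velocity first...
        height += velocity * dt    # ...then position (semi-implicit Euler)
        if height <= 0.0 and velocity < 0.0:
            height = 0.0
            velocity = -velocity * RESTITUTION
            bounces += 1           # a real engine would also put the body
                                   # to sleep below a velocity threshold
    return bounces

print(simulate_bounces(height=1.5, velocity=0.0))
```

A few dozen bodies like this per frame is negligible work, which is the point: basic in-game physics never needed GPU acceleration.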
 

Obsoleet

Platinum Member
Oct 2, 2007
2,181
1
0
This is a troll thread, but the real answer is that PhysX was something NV bought to add "exclusive features" to their products.

You pretty much got it for free, but since PhysX itself sucks, it failed. CPUs are grossly overpowered these days, and NV has this pipe dream of slaying Intel when they're a total joke of a company in comparison.

Being serious though, Intel wipes their butt with Nvidia.
Gimmicky garbage.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
This is a troll thread, but the real answer is that PhysX was something NV bought to add "exclusive features" to their products.

You pretty much got it for free, but since PhysX itself sucks, it failed. CPUs are grossly overpowered these days, and NV has this pipe dream of slaying Intel when they're a total joke of a company in comparison.

Being serious though, Intel wipes their butt with Nvidia.
Gimmicky garbage.

A troll, you say?
 

exdeath

Lifer
Jan 29, 2004
13,679
10
81
The problem is devs are lazy or publishers don't want to waste extra $$; they will do the absolute minimum required for their game to be marketable.

This.

This is why we get Call of Duty 19 1/2 Super Pink Ops Space Invaders Collectors Edition, and Madden 20384204 and One Third December Edition Special Release B with the blue case.

Why bother innovating anything when today's generation of ADHD gamers, who grew up on Halo and The Fast and the Furious, will keep paying $60 over and over again for reskinned games every 3 weeks?

I remember when games had ENDING sequences longer than it takes to beat every military sim / FPS clone that came out this year.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
I own both 580 SLI and 6970 crossfire in 2 PCs, and I can tell you that there are many, many games where override settings are outright ignored with Nvidia drivers. I have to wonder if Nvidia gets inflated benchmark scores in reviews because of this. Games I personally tested recently and found this in were Dead Space 1, Dead Space 2, and Dead Island. Now AMD certainly has its faults, and I prefer the 580s overall, but the one good thing about AMD is that it ALWAYS obeys override settings in CCC. If you enable SSAA in CCC it will work; you can use it in Dead Space and the game looks -fantastic-. On Nvidia you can't even use SSAA -- and before you say it, SS transparency is a joke and is NOT the same thing as SSAA. Due to this, AMD has better image quality generally speaking, because override settings ALWAYS WORK. With Nvidia, it's a crapshoot: many games are forced to completely ignore "override" settings in 280+ drivers.

From what I've found, Nvidia drivers 280+ ignore "override" settings in roughly 65% of games, and of that 65%, you can fix maybe 10% in Nvidia Inspector. You can plainly see in Nvidia Inspector that a great number of games are forced to ignore override settings -- look for yourself. There is a setting flagged as "Treat override settings as use application preference", and even changing that in Nvidia Inspector does nothing.

On the flip side of the coin, Nvidia has a far better game profiling tool, and Nvidia has been quick to push out SLI profiles for new games recently, such as BF3 and Skyrim. I prefer playing BF3 on my 580 setup. It sure would be nice, though, if override settings worked without exception. And SSAA would be awfully nice to have... Dead Island has so many jaggies it makes my eyes bleed. Speaking of which, Dead Island looks better on the 6970 crossfire because you can override AA settings; with the 580s, override settings are ignored.

Long story short:
This is what AA-bits are for. Many deferred engines don't support AA out of the box, or you cannot enhance it at all through the driver (DX10/11).

You can get SGSSAA in Dead Space 1/2 with a GeForce, no problems there. In Dead Island, SGSSAA can be forced via AA-bits, but you get graphical glitches, blur and heavy performance drops (also with AMD's SSAA, as far as I know). There is a good reason why override is ignored: it can cause problems.

And just to clarify: Nvidia's SGSSAA works in every API, including OpenGL (if you have MSAA, that is). So does TrSSAA (which you don't like :p). AMD's SSAA works only in DX9 and lower, and its AAA only in DX9 and OpenGL.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Long story short:
This is what AA-bits are for. Many deferred engines don't support AA out of the box, or you cannot enhance it at all through the driver (DX10/11).

You can get SGSSAA in Dead Space 1/2 with a GeForce, no problems there. In Dead Island, SGSSAA can be forced via AA-bits, but you get graphical glitches, blur and heavy performance drops (also with AMD's SSAA, as far as I know). There is a good reason why override is ignored: it can cause problems.

And just to clarify: Nvidia's SGSSAA works in every API, including OpenGL (if you have MSAA, that is). So does TrSSAA (which you don't like :p). AMD's SSAA works only in DX9 and lower, and its AAA only in DX9 and OpenGL.

Dude, this is Nvidia's excuse every time the topic of ignored overrides is brought up. They say it time and time again: "but games will break!" Nvidia's SSAA works in every API... that's great! But the option ISN'T in the driver. You get SS transparency AA, which, if you know the gory details, is inferior to supersampled AA. Is SSAA reserved for a future release? Or does Nvidia expect end users to rely on Nvidia Inspector so that it "maybe" works 20% of the time? Meanwhile in CCC you can simply click SSAA and bam, you get SSAA.

I own both sets of hardware, 6970 crossfire and 580 SLI, so I'm pretty neutral on the entire AMD vs Nvidia fanboy nonsense; I call it like I see it. Override settings work on AMD every time. Override settings DON'T WORK with the Nvidia CP over half the time. Lastly, look at your games list in Nvidia Inspector and you tell ME how many games are flagged to treat override settings as use application preference. If that isn't proof that Nvidia disables override for just about everything, I don't know what is. Nvidia Inspector works *sometimes*, but that is not common knowledge among users. Here's a good example: Dead Space 1 and 2. With 6970 crossfire you just click on SSAA and it *works*. The game looks unreal; jaggies are completely gone and the game looks very, very good. Now with Nvidia: rummage through Nvidia Inspector, input the AA compatibility string (this is definitely NOT common knowledge), change the setting so that override isn't ignored, and then enable AA. Great! Except anything past 8x AA screws up shadows, and you can't get SSAA. If you try to enable anything past 8x MSAA, you get a slideshow with messed-up shadows. Image quality is worse compared to SSAA on AMD cards -- which, again, works with no hassle.

This may sound harsh, but it isn't directed at you -- I'm just frustrated with Nvidia's drivers in this respect :( I prefer my 580s over the 6970s; Nvidia has really been on the ball with SLI drivers for BF3 and Skyrim, and performance is always fantastic, better than the 6970s in many games. But I want override to work like it does with CCC. And this ties in to your point that Nvidia has more image quality options than AMD in their driver -- that's really not true at all, because AMD's settings nearly always work, while the same can't be said of the Nvidia control panel. I really hope Nvidia does something to better address this, and doesn't cower behind the "but it breaks games!" excuse.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
Are you absolutely sure that AMD's SSAA works properly in the games you mentioned? Meaning no blur, AA applied everywhere with no edges left out, and no artifacts whatsoever? I would really like to see a side-by-side screenshot comparison (video is even better), just to be sure.

I read around the web, and it seems that AMD's SSAA in Dead Space incurs a terrible performance hit (220 fps vs. 10-20 fps was written somewhere), causes white lines around objects, and doesn't smooth everywhere.

You don't put something in the drivers that can cause problems; it's as simple as that. Otherwise you are liable when it doesn't work because the developer messed up and used some technique that is incompatible with proper antialiasing, giving poor performance, glitches, etc.

I think we are a bit off topic here - should I open a new thread? I would really like to discuss/investigate this further.
Btw, are you xoleras on the Nvidia forums?
 

HeXen

Diamond Member
Dec 13, 2009
7,835
37
91
Well, there's more than Havok and PhysX. CryEngine's physics, I assume, is GPU-based, but it has full environment destruction. I'm sure there may be others... what does BFBC2 use? And Far Cry 2?

No, I blame developers; they can use or create whatever they want.
 

FalseChristian

Diamond Member
Jan 7, 2002
3,322
0
71
Since Havok is CPU-accelerated, it will win out. PhysX is a sham of a disgrace of a terribleness.
 

greenhawk

Platinum Member
Feb 23, 2011
2,007
1
71
How come I have the option to use the CPU in games that use PhysX, if PhysX is GPU-only as some seem to be implying?

The REAL question is... why don't we see more fully destructible environments in games???

On the first: it's a fallback option provided by developers, as not everyone is expected to have a GTX 260 or better video card (IIRC, that was the card reviews pegged as the point where the GPU beat the CPU, 1-2 years ago).
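That fallback pattern boils down to picking a solver behind one shared interface; the names below are hypothetical, not the actual PhysX SDK API:

```python
# Hypothetical sketch of an engine's CPU-fallback dispatch. Both solvers
# expose the same step() method, so game code never notices which one runs.
class CpuSolver:
    def step(self, dt):
        return "cpu"          # simulate on the CPU

class GpuSolver:
    def step(self, dt):
        return "gpu"          # offload rigid bodies/particles to the GPU

def make_solver(gpu_model=None, supported=("GTX 260", "GTX 285", "GTX 480")):
    """Use the GPU path only when a capable card was detected."""
    if gpu_model in supported:
        return GpuSolver()
    return CpuSolver()

print(make_solver(gpu_model=None).step(1 / 60))   # prints "cpu"
```

The supported-card list is made up for illustration; the point is only that the CPU path is the same simulation API, just slower for large particle counts.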

As to the second, I suspect it has more to do with network/internet connections and the desire for multiplayer games. In a single-player game, having destructible environments is easy because everything is local, no matter how much data is involved. For the same to happen in a multiplayer game, the amount of extra information (i.e. "part #456789 has moved to <x,y,z> with orientation <x,y>") would kill most users' connections without trying. If LAN gaming were still popular it might be available now, but no one is interested in those games (and that is before developers put DRM into games that calls home anyway, preventing the game from running on a stand-alone network).
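To put rough numbers on that bandwidth point (the message layout below is made up for illustration: a 4-byte part ID plus single-precision position and quaternion orientation):

```python
# Back-of-envelope cost of replicating loose debris over the network,
# ignoring packet headers, delta compression and interest management.
BYTES_PER_FLOAT = 4
BYTES_PER_UPDATE = 4 + 3 * BYTES_PER_FLOAT + 4 * BYTES_PER_FLOAT  # id + pos + quat = 32

def debris_bandwidth_kbps(num_parts, updates_per_second):
    """Raw payload bandwidth in kilobits per second."""
    return num_parts * BYTES_PER_UPDATE * 8 * updates_per_second / 1000.0

# 1,000 loose fragments updated 20 times a second:
print(debris_bandwidth_kbps(1000, 20))  # 5120.0 kbit/s of payload alone
```

Even this naive estimate lands well beyond a typical early-2010s residential upload, which is why shooters of the era kept destruction scripted or client-side cosmetic.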