TressFX: A new frontier of realism in PC gaming


DiogoDX

Senior member
Oct 11, 2012
757
336
136
I like PhysX. It has nice effects. My only problem with PhysX is that graphics get cut out entirely when the effect is off, to inflate the sense of improved image quality.

Mirror's Edge:
PhysX ON: nice flags and plastic interaction
PhysX OFF: no flags, no plastic

Batman AA:
PhysX ON: nice volumetric smoke
PhysX OFF: no smoke


If Lara's hair were made with PhysX, she would probably be bald with PhysX off.
 

Dravonic

Member
Feb 26, 2013
84
0
0
I'm pretty sure if that was the case they would put some standard hair on her...

In fact I have no doubt she already has standard hair for people with medium to lower end systems. And don't forget the consoles.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Dravonic said:
I'm pretty sure if that was the case they would put some standard hair on her...

In fact I have no doubt she already has standard hair for people with medium to lower end systems. And don't forget the consoles.

Yes, the before-and-after pictures earlier in the thread and on AMD's blog show that.
 

BoFox

Senior member
May 10, 2008
689
0
0
DiogoDX said:
I like PhysX. It has nice effects. My only problem with PhysX is that graphics get cut out entirely when the effect is off, to inflate the sense of improved image quality.

Mirror's Edge:
PhysX ON: nice flags and plastic interaction
PhysX OFF: no flags, no plastic

Batman AA:
PhysX ON: nice volumetric smoke
PhysX OFF: no smoke


If Lara's hair were made with PhysX, she would probably be bald with PhysX off.
LOL, exactly! The Xbox 360 version of Batman: AA still had fog in it, but for the PC version, turning PhysX to "off" removed fog altogether. That was not nice. Did the developer really want to do that for 40-50% of PC gamers?

There is only 1 PhysX.
There is no CPU-PhysX...no GPU-PhysX.
It's all PhysX.
I think you're confusing this with performance caps implemented by the developer within the game itself.

Take dynamic fog.
It could be run via the CPU...at single-digit framerates as the result.
So the developers cap certain features in the game CP/config, simply because the CPU doesn't have the performance to do it in any useful way, and it would be a useless feature then.

It's all just PhysX...no difference in language/code/compiler.
No more PhysX. DirectCompute/OpenCL is going to do a better job. PhysX always sucked. It was always a bit laggy for me, with stuttering - especially in Mirror's Edge, Batman: AA, UT3, Warmonger, etc. PhysX was never a real part of the gameplay, actually built into the game engine itself upon which gameplay operated. It was always an add-on visual gimmick, and nothing more. Oftentimes, the performance didn't match. The game could be running very smoothly, then suddenly become a stuttering, jittery mess dropping to 20-ish fps (with the frame times making it feel much worse than that), and then once the billion pieces of broken glass shards magically disappeared, the frame rate went back to normal.
It was always poorly optimized code, often overdone to try to look impressive, but still hardly impressive. Ghostbusters, with the Infernal Engine's Velocity physics making good use of 4 CPU cores, had far better-looking physics than most hardware PhysX games out there at that time. Even Crysis and Far Cry 2 made all those PhysX games seem unimpressive at large. Usually, such PhysX effects would be limited to only a very small handful of add-on "effects" here and there.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
BoFox said:
LOL, exactly! The Xbox 360 version of Batman: AA still had fog in it, but for the PC version, turning PhysX to "off" removed fog altogether. That was not nice. Did the developer really want to do that for 40-50% of PC gamers?


No more PhysX. DirectCompute/OpenCL is going to do a better job. PhysX always sucked. It was always a bit laggy for me, with stuttering - especially in Mirror's Edge, Batman: AA, UT3, Warmonger, etc. PhysX was never a real part of the gameplay, actually built into the game engine itself upon which gameplay operated. It was always an add-on visual gimmick, and nothing more. Oftentimes, the performance didn't match. The game could be running very smoothly, then suddenly become a stuttering, jittery mess dropping to 20-ish fps (with the frame times making it feel much worse than that), and then once the billion pieces of broken glass shards magically disappeared, the frame rate went back to normal.
It was always poorly optimized code, often overdone to try to look impressive, but still hardly impressive. Ghostbusters, with the Infernal Engine's Velocity physics making good use of 4 CPU cores, had far better-looking physics than most hardware PhysX games out there at that time. Even Crysis and Far Cry 2 made all those PhysX games seem unimpressive at large. Usually, such PhysX effects would be limited to only a very small handful of add-on "effects" here and there.

I have heard people like you declare "PhysX is dead!" since 2006.
The "argument" ran out of steam YEARS ago...
 

BoFox

Senior member
May 10, 2008
689
0
0
imho,

Personally, I feel my views are irrelevant in the big picture that is the overall market. Instead of the silly bickering and personal aspects, allow the market to decide. Pretty simple, actually.

You don't like proprietary? Fine! Some don't!

It hasn't affected the marketplace negatively overall; and because of proprietary, nVidia is growing and so is the PC platform.

Because of proprietary? Far from it. Sure, businesses try to squeeze what they can out of "proprietary", but proprietary does not always = success.

Sometimes, proprietary = lack of progress.

Problem is - competing businesses do not always like to cooperate with each other. Nvidia didn't really want to support tessellation until DX11, when they felt ready and had a clear advantage over the competitor. ATI had TruForm, and NV had their own tessellation algorithm over 10 years ago, but neither company wanted to collaborate in any way at all. Of course ATI/AMD didn't want to adopt NV's PhysX, nor did NV want to support PhysX if there were a Radeon card running in the system, even with NV's own card running as a dedicated PhysX PPU.

Fact is - other than a very few games (like the original Unreal Tournament), we didn't have tessellation (UT's TruForm, which was in its infancy) for an entire decade. Of course, tessellation would've been nowhere near as nice as it was with DX11 GPUs, but at least very light to moderate tessellation could have been applied in games ever since 2001. It would have been very nice in severely polygonal games like Doom 3, where every single monster head looked like an octagonal stop sign. Especially the boobs! We had to put up with polygonal boobs for the entire decade, basically, when the GPUs were most certainly capable of smoothing them out to some degree, at least.
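(For context on what that decade-old tech actually did: TruForm was ATI's implementation of N-patches, also called PN triangles, which inflate each flat triangle into a curved cubic Bezier patch using only the vertex positions and normals the model already has. A minimal C++ sketch of the evaluation; the naming conventions here are mine, not ATI's code:)

```cpp
// Minimal PN-triangle ("TruForm"-style) evaluation sketch.
// Given a flat triangle with per-vertex normals, build a cubic Bezier
// patch and evaluate a smoothed point at barycentric coords (u, v, w).
struct Vec3 { float x, y, z; };

static Vec3 add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 mul(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Edge control point: take the point 1/3 along edge (p, q) and project it
// onto the tangent plane defined by p's normal n.
static Vec3 edgeCtrl(Vec3 p, Vec3 q, Vec3 n) {
    float w = dot(sub(q, p), n);              // deviation from tangent plane
    return mul(sub(add(mul(p, 2.0f), q), mul(n, w)), 1.0f / 3.0f);
}

// Curved position at barycentric coords (u, v, w), with u + v + w = 1.
// Vertex p0 corresponds to u, p1 to v, p2 to w.
Vec3 pnTriangle(Vec3 p0, Vec3 p1, Vec3 p2,
                Vec3 n0, Vec3 n1, Vec3 n2,
                float u, float v, float w) {
    // Six edge control points (two per edge).
    Vec3 b210 = edgeCtrl(p0, p1, n0), b120 = edgeCtrl(p1, p0, n1);
    Vec3 b021 = edgeCtrl(p1, p2, n1), b012 = edgeCtrl(p2, p1, n2);
    Vec3 b102 = edgeCtrl(p2, p0, n2), b201 = edgeCtrl(p0, p2, n0);
    // Center control point, lifted away from the flat centroid.
    Vec3 e = mul(add(add(add(b210, b120), add(b021, b012)),
                     add(b102, b201)), 1.0f / 6.0f);
    Vec3 c = mul(add(add(p0, p1), p2), 1.0f / 3.0f);
    Vec3 b111 = add(e, mul(sub(e, c), 0.5f));
    // Cubic Bezier triangle: trinomial expansion of (u + v + w)^3.
    Vec3 r = mul(p0, u * u * u);
    r = add(r, mul(p1, v * v * v));
    r = add(r, mul(p2, w * w * w));
    r = add(r, mul(b210, 3 * u * u * v));
    r = add(r, mul(b120, 3 * u * v * v));
    r = add(r, mul(b021, 3 * v * v * w));
    r = add(r, mul(b012, 3 * v * w * w));
    r = add(r, mul(b102, 3 * u * w * w));
    r = add(r, mul(b201, 3 * u * u * w));
    r = add(r, mul(b111, 6 * u * v * w));
    return r;
}
```

Since the patch needs no new art assets, this is exactly the kind of smoothing that could have rounded out those stop-sign monster heads on 2001-era hardware.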
 

Termie

Diamond Member
Aug 17, 2005
7,949
48
91
www.techbuyersguru.com
The screenshots look good, but I haven't seen any videos of gameplay made available by AMD. Isn't physics inherently difficult to show in a screenshot?

I'm tempted to say that this new technique will add much more to "graphics" than PhysX, which mostly just allowed for add-ons that did little for immersion. But without video, it's impossible to draw any conclusions.
 

Itchrelief

Golden Member
Dec 20, 2005
1,398
0
71
BoFox said:
Especially the boobs! We had to put up with polygonal boobs for the entire decade, basically, when the GPUs were most certainly capable of smoothing them out to some degree, at least.

:D

I think you've just found the must-have feature! Tessellation + physics... :sneaky:
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Yup...you need video to see stuff like that:
http://www.youtube.com/watch?v=WpW6kPBnyQ4

Oh crap...I posted a link to an actual game...some people are going to go nuts now ^^

You and some other posters seem to be either confused or not understanding that proprietary graphics/physics technology is not advancing the market. Every game that uses it feels like a new PhysX demo and nothing more. Physics effects via PhysX lock out 50-60% of the GPU market. For this reason, developers are reluctant to go to the next level and make a game where PhysX effects impact actual gameplay. This is why most games that use PhysX do a piss-poor job of it or end up with poorly exaggerated/inaccurate versions of what we call "physics" in real life. In other words, proprietary PhysX will not be very prevalent in games unless NV has 80-90% market share. Trying to make realistic physics or DX11 effects using an open-standard compute language is HOW to make games. This is because developers could use the GPU's compute shaders to make games better looking and let NV and AMD focus on making faster hardware with more advanced DirectCompute capability.

PhysX will continue to be a failure unless NV lets AMD cards use it. It's good to see that AMD is working closer with developers to make better looking games for everyone.
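(To make the compute-shader argument concrete, here is a toy C++ sketch of one simulation step for a single hair strand, assuming Verlet integration plus iterated distance constraints - broadly the approach TressFX-style per-strand simulation takes. All names are illustrative, not AMD's actual code:)

```cpp
// Toy per-strand hair step: Verlet integration followed by iterative
// distance constraints so segments keep their rest length.
#include <vector>
#include <cmath>

struct P { float x, y, z; };

void stepStrand(std::vector<P>& pos, std::vector<P>& prev,
                float restLen, float dt, int iters = 4) {
    const P g = {0.0f, -9.8f, 0.0f};
    // Verlet integration: new = pos + (pos - prev) + g*dt^2.
    for (size_t i = 1; i < pos.size(); ++i) {  // vertex 0 is rooted in the scalp
        P cur = pos[i];
        pos[i].x += (cur.x - prev[i].x) + g.x * dt * dt;
        pos[i].y += (cur.y - prev[i].y) + g.y * dt * dt;
        pos[i].z += (cur.z - prev[i].z) + g.z * dt * dt;
        prev[i] = cur;
    }
    // Enforce segment lengths so the strand moves like hair, not rubber.
    for (int k = 0; k < iters; ++k) {
        for (size_t i = 0; i + 1 < pos.size(); ++i) {
            P d = {pos[i+1].x - pos[i].x,
                   pos[i+1].y - pos[i].y,
                   pos[i+1].z - pos[i].z};
            float len = std::sqrt(d.x*d.x + d.y*d.y + d.z*d.z);
            if (len < 1e-6f) continue;
            float s = (len - restLen) / len;   // signed stretch fraction
            if (i == 0) {                      // root is pinned: move only the child
                pos[1].x -= d.x * s; pos[1].y -= d.y * s; pos[1].z -= d.z * s;
            } else {                           // split the correction between both ends
                pos[i].x   += d.x * s * 0.5f; pos[i].y   += d.y * s * 0.5f;
                pos[i].z   += d.z * s * 0.5f;
                pos[i+1].x -= d.x * s * 0.5f; pos[i+1].y -= d.y * s * 0.5f;
                pos[i+1].z -= d.z * s * 0.5f;
            }
        }
    }
}
```

In a shipping game the same loop would run as a DirectCompute kernel across thousands of strands in parallel, which is exactly why it works on any DX11-class GPU rather than one vendor's.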
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
RussianSensation said:
You and some other posters seem to be either confused or not understanding that proprietary graphics/physics technology is not advancing the market. Every game that uses it feels like a new PhysX demo and nothing more. Physics effects via PhysX lock out 50-60% of the GPU market. For this reason, developers are reluctant to go to the next level. This is why most games that use PhysX do a piss-poor job of it or end up with poorly exaggerated versions of what we call physics in real life. In other words, proprietary PhysX will not be very prevalent in games unless NV has 80-90% market share. Trying to make realistic physics or DX11 effects using an open-standard compute language is HOW to make games. This is because developers could use the GPU's compute shaders to make games better looking and let NV and AMD focus on making faster hardware with more advanced DirectCompute capability.

PhysX will continue to be a failure unless NV lets AMD cards use it.

Same old broken record eh?
http://www.pcgameshardware.com/aid,...Nvidia-responds-to-AMDs-attack-on-Physx/News/
PCGH: AMD claims that PhysX is proprietary. What's your reaction?
Nadeem Mohammed: PhysX is a complete physics solution which runs on all major platforms like PS3, XBOX 360, Wii, PC with Intel or AMD CPU, and on the PC with GeForce cards; it even runs on iPhone. It's available for use by any developer for inclusion in games for any platform - all free of license fees. There's nothing restrictive or proprietary about that. We have been told that some AMD spokespeople talk about PhysX being like 3DFX's GLIDE API - that's an even more inaccurate analogy; games written for GLIDE simply would not run on any system without a 3DFX card, whereas PhysX runs on more platforms than any other physics solution out there, and comes with tools and plug-ins, like APEX, which help developers create content which can actually scale between different solutions. So please try out some of the latest PC titles - and give feedback to the developers on what things gamers really want in games - let's keep on pushing the industry to make killer games together!


Combined with:

http://www.bit-tech.net/news/hardware/2008/12/11/amd-exec-says-physx-will-die/1

Godfrey Cheng, Director of Technical Marketing in AMD's Graphics Product Group, has said that PhysX will die if it remains a closed and proprietary standard.

"There is no plan for closed and proprietary standards like PhysX," said Cheng. "As we have emphasised with our support for OpenCL and DX11, closed and proprietary standards will die."

Sounds more like AMD trying to say no to PhysX...and then blaming NVIDIA because it doesn't run on AMD GPUs...cart before horse.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Lonbjerg said:
Same old broken record eh?

No, my post is how someone who is objective would approach PhysX vs. DirectCompute. It's pretty funny how you are so brand brainwashed that you can't even see how a particular piece of tech locked to a particular brand is alienating a certain portion of consumers. Now try to connect the dots between that and how developers and publishers think...

Lonbjerg said:
So the last 7 years of games with physics is nothing...but AMD making hair is "the shizzle"...gotcha...

GCN is the first true ground-up DirectCompute architecture from AMD. It's slightly more than a year old. At this stage, games that use DirectCompute to accelerate and advance graphical effects are popping up a lot quicker than PhysX games did. NV and AMD can continue improving their GPU architectures over the next 5-10 years to handle even more graphically demanding effects via DirectCompute. Compute shaders and geometry shaders (tessellation) can make games better for everyone. I realize you haven't been able to get this concept through your head for 7 years now. :hmm:

Lonbjerg said:
Sounds more like AMD trying to say no to PhysX...and then blaming NVIDIA because it doesn't run on AMD GPUs...cart before horse.

That entire comment is BS. You just linked some quote that states PhysX is an open standard and then in the same reply you acknowledge it doesn't run on AMD GPUs at all. That means PhysX is proprietary 100%. BL2's .ini hack is more proof than ever that NV blocks PhysX unless you have an NV GPU. Why can't PC gamers with AMD cards run PhysX in games like BL2, Batman AC, etc.?

Whatever the case is, the argument is just stupid to begin with. With more powerful DirectCompute GPUs, the entire gaming community benefits. With PhysX, you have to have an NV card. The more developers push for open standards like tessellation and compute shaders to make games more realistic, the better. PhysX hasn't moved at all in the last 7 years. Same unrealistic-looking effects.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
BoFox said:
ATI had TruForm, and NV had their own tessellation algorithm over 10 years ago, but neither company wanted to collaborate in any way at all.

AMD was evangelizing N-patches and nVidia was evangelizing RT-patches! The irony is that both were collaborating with Havok on HavokFX for GPU physics, and Intel swooped in!

BoFox said:
Of course ATI/AMD didn't want to adopt NV's PhysX.

AMD had quiet conversations about PhysX with nVidia:

because we've actually had quiet conversations with them

http://www.bit-tech.net/bits/interviews/2010/01/06/interview-amd-on-game-development-and-dx11/1


My point was proprietary has translated into growth for nVidia:

Jen Hsun Huang said:
It's been quite a journey and quite an investment. But now CUDA is impacting every aspect of our business. In Tesla, you could see the progress there we just talked about. CUDA is also making it possible for workstations and design applications to design and simulate at the same time. CUDA is also making it possible for our PC gaming GeForce to be able to use simulation for special effects and to materials and dynamics in the virtual world. And so CUDA is -- has proven to be a real lift for our entire GPU business. And I would go so far as to say that, because of CUDA, we have kept the GPU continuing to grow into more and more applications, and as a result, continue to grow the business.

http://seekingalpha.com/article/993...arnings-call-transcript?page=4&p=qanda&l=last
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
RussianSensation said:
No, my post is how someone who is objective would approach PhysX vs. DirectCompute.

This is a very good point! However, PhysX is middleware and DirectCompute is an API.

Ideally, open standards are key, but it may depend on how robust and flexible the API is - and the same point may be made for the tools for the physics middleware.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
RussianSensation said:
No, my post is how someone who is objective would approach PhysX vs. DirectCompute. It's pretty funny how you are so brand brainwashed that you can't even see how a particular piece of tech locked to a particular brand is alienating a certain portion of consumers. Now try to connect the dots between that and how developers and publishers think...

Nice fallacy...but it has nothing to do with the facts.
You sound like a Linux whiner now..."Open source!!!!"


RussianSensation said:
GCN is the first true ground-up DirectCompute architecture from AMD. It's slightly more than a year old. At this stage, games that use DirectCompute to accelerate and advance graphical effects are popping up a lot quicker than PhysX games did. NV and AMD can continue improving their GPU architectures over the next 5-10 years to handle even more graphically demanding effects via DirectCompute. Compute shaders and geometry shaders (tessellation) can make games better for everyone. I realize you haven't been able to get this concept through your head for 7 years now. :hmm:

I have enjoyed better-than-the-CPU-can-deliver physics since 2006; drop the lies.
You keep waiting; I'm sure in time AMD will throw you a bone...or not.



RussianSensation said:
That entire comment is BS. You just linked some quote that states PhysX is an open standard and then in the same reply you acknowledge it doesn't run on AMD GPUs at all. That means PhysX is proprietary 100%. BL2's .ini hack is more proof than ever that NV blocks PhysX unless you have an NV GPU. Why can't PC gamers with AMD cards run PhysX in games like BL2, Batman AC, etc.?

Don't call BS and then use lies and BS as a counter...I'm not impressed.

Does PhysX run on AMD x86 CPUs? Yes/No.
If yes...you just debunked your "100%".
And you really should ask AMD why their cards cannot run CUDA...not me ^^


RussianSensation said:
Whatever the case is, the argument is just stupid to begin with. With more powerful DirectCompute GPUs, the entire gaming community benefits. With PhysX, you have to have an NV card. The more developers push for open standards like tessellation and compute shaders to make games more realistic, the better. PhysX hasn't moved at all in the last 7 years. Same unrealistic-looking effects.

You say PhysX hasn't moved at all in 7 years...and you expect me to take you and your lies seriously?

Go ahead...post more irrelevant FUD, lies, and your typical fallacies...the red team will marvel...all the rest will see right through it.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
SirPauly said:
My point was proprietary has translated into growth for nVidia:

What would you rather have: (1) next-generation graphical effects available to everyone, or (2) NV and AMD each spending millions of dollars advancing proprietary graphical/physics tech in games that would not work on the competing brand's products?

Just answer 1 or 2. Don't say let the market decide.

The only way anyone thinks PhysX is great is if they exclusively use NV cards and plan on doing so forever and/or if they are a shareholder of NV and/or if they are an employee of NV or its affiliated partners/AIBs.

DirectCompute effects in games are open to any developer on any GPU capable of running them. So now we have geometry shaders, pixel shaders, and compute shaders. All of these are standard.

Lonbjerg said:
Don't call BS and then use lies and BS as a counter...I'm not impressed.

Thinking so highly of yourself that others have to impress you now? You really are a very weird individual. Your entire argument falls flat on its face because a certain fraction of the market cannot use PhysX. Everyone can run DirectCompute on NV or AMD. Any advancement in visuals that is open for developers to exploit is welcome. I know you can't think outside the box since you'll just be using NV-branded GPUs forever, which is why you can't get this simple point about how proprietary PhysX alienates the gaming market and prevents developers from truly using it in a more advanced manner.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
RussianSensation said:
You and some other posters seem to be either confused or not understanding that proprietary graphics/physics technology is not advancing the market. Every game that uses it feels like a new PhysX demo and nothing more.

And the hair simulation in Tomb Raider is more than "a new [DirectCompute] demo and nothing more"?

RussianSensation said:
Physics effects via PhysX lock out 50-60% of the GPU market. For this reason, developers are reluctant to go to the next level and make a game where PhysX effects impact actual gameplay. This is why most games that use PhysX do a piss-poor job of it or end up with poorly exaggerated/inaccurate versions of what we call "physics" in real life.

And does the hair simulation in Tomb Raider "impact actual gameplay"?

RussianSensation said:
In other words, proprietary PhysX will not be very prevalent in games unless NV has 80-90% market share. Trying to make realistic physics or DX11 effects using an open-standard compute language is HOW to make games. This is because developers could use the GPU's compute shaders to make games better looking and let NV and AMD focus on making faster hardware with more advanced DirectCompute capability.

So, using DirectCompute to lock out 66% of the market is okay, but using CUDA to lock out 33% of the market is wrong?

RussianSensation said:
PhysX will continue to be a failure unless NV lets AMD cards use it. It's good to see that AMD is working closer with developers to make better looking games for everyone.

"For everyone"? So you know that nVidia users can play Tomb Raider with the hair simulation without having a 60% performance lost like in Dirt:Showdown with Forward+?

BTW: Everyone can play the GPU-PhysX-enhanced games because of their x86 CPU. So I don't get your opinion...
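(For readers wondering what Forward+ actually is: it is tiled forward rendering, where a compute pass splits the screen into small tiles and culls the scene's light list per tile, so the shading pass only evaluates lights that can reach each pixel. A rough CPU-side C++ sketch of that per-tile culling step, with made-up types and names purely for illustration - real engines run this in a compute shader, one thread group per tile:)

```cpp
// Per-tile light culling at the heart of "Forward+" (tiled forward) rendering.
#include <vector>

struct Light { float x, y, z, radius; };   // view-space position + range

// For one screen tile, keep only the lights whose bounding sphere can
// affect pixels in that tile (tested against the tile's 4 frustum side
// planes, each stored as a plane equation nx*x + ny*y + nz*z + d = 0).
std::vector<int> cullLightsForTile(const std::vector<Light>& lights,
                                   const float tilePlanes[4][4]) {
    std::vector<int> visible;
    for (int i = 0; i < (int)lights.size(); ++i) {
        bool inside = true;
        for (int p = 0; p < 4; ++p) {      // left/right/top/bottom planes
            float dist = tilePlanes[p][0] * lights[i].x
                       + tilePlanes[p][1] * lights[i].y
                       + tilePlanes[p][2] * lights[i].z
                       + tilePlanes[p][3];
            if (dist < -lights[i].radius) { inside = false; break; }
        }
        if (inside) visible.push_back(i);  // the shading pass reads this list
    }
    return visible;
}
```

The win is that a scene with hundreds of dynamic lights only pays for the handful that overlap each tile; the cost sontin mentions comes from running that culling plus the heavier per-pixel lighting on top of a normal forward pass.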
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
@RS

I would rather have a company invest, invent, innovate, and try to improve gaming experiences now instead of waiting for open standards to be forged and mature.

While AMD owners have waited for standards to be forged and mature, and for tool sets to be created and mature, GeForce owners have had the choice to use, not use, like, dislike, and mock PhysX for years.

It's not really about proprietary vs. open, but more about how far a company will risk and invest to improve gaming experiences. Proprietary allows innovation.

To me, the market does decide.
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
I actually think the hair feature looks rather nifty. I don't know exactly how well it will play out in the game, and I'm curious to see what performance penalties it will cause (if any) on my GTX 680. The one nice aspect of focusing on things like the physics of hair is that we always see the main character. What does it matter that I get water-blob physics when I play Borderlands 2 if the only time I see it is when there's an open pipe, which is fairly rare, or I open an outhouse, which is fairly disgusting?

If I'm being frank, I really don't care much for the PhysX implementation in Borderlands 2. Some of it is nifty and unobtrusive, but the cloth/flags in the game are just downright annoying. I see a lot of fabrics on boxes or next to buildings, which tend to serve as cover while you're trying to take out enemies. The problem you run into is that these textiles block your view, and you cannot see the enemy without running out into the open. If you know where the enemy is, you can still shoot through the cloth, but I don't find that to be the case often. I've come across this problem a few times, and it's made me rather tempted to turn PhysX off, as it doesn't really provide much beyond pretty explosions and elemental effects.

However, I usually just suck it up and leave it on. :p
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
RussianSensation said:
What would you rather have: (1) next-generation graphical effects available to everyone, or (2) NV and AMD each spending millions of dollars advancing proprietary graphical/physics tech in games that would not work on the competing brand's products?

Just answer 1 or 2. Don't say let the market decide.

The only way anyone thinks PhysX is great is if they exclusively use NV cards and plan on doing so forever and/or if they are a shareholder of NV and/or if they are an employee of NV or its affiliated partners/AIBs.

DirectCompute effects in games are open to any developer on any GPU capable of running them. So now we have geometry shaders, pixel shaders, and compute shaders. All of these are standard.


Thinking so highly of yourself that others have to impress you now? You really are a very weird individual. Your entire argument falls flat on its face because a certain fraction of the market cannot use PhysX. Everyone can run DirectCompute on NV or AMD. Any advancement in visuals that is open for developers to exploit is welcome. I know you can't think outside the box since you'll just be using NV-branded GPUs forever, which is why you can't get this simple point about how proprietary PhysX alienates the gaming market and prevents developers from truly using it in a more advanced manner.

I agree with little of what you are preaching here. The good of all and all that. What about AMD's other cards in its lineup? The 7870? The 7850? Etc.
You don't like PhysX; that's your opinion, your prerogative. This DirectCompute mantra is pure BS - it sounds similar to the brainwashing you claim PhysX supporters are under. Except PhysX has triple-A games in its repertoire - the Batman games, Mafia II, and others - and you have the lighting in DiRT: Showdown, which I've never heard anyone praise except you, and that's after you originally put down this game!