TressFX: A new frontier of realism in PC gaming


Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
What would you rather have: (1) Next generation graphical effects available to everyone or (2) if NV and AMD each spent millions of dollars advancing proprietary graphical/physics tech in games that would not work on the competing brand's products?

Just answer 1 or 2. Don't say let the market decide.

The only way anyone thinks PhysX is great is if they exclusively use NV cards and plan to do so forever, and/or they are a shareholder of NV, and/or they are an employee of NV or one of its affiliated partners/AIBs.

DirectCompute effects in games are open to any developer on any GPU capable of running them. So now we have geometry shaders, pixel shaders, and compute shaders. All of these are standard.



Thinking so highly of yourself that others have to impress you now? You really are a very weird individual. Your entire argument falls flat on its face because a certain fraction of the market cannot use PhysX. Everyone can run DirectCompute on NV or AMD. Any advancement in visuals that is open for developers to exploit is welcome. I know you can't think outside the box, since you are just going to be using NV-branded GPUs forever, which is why you can't grasp the simple point that proprietary PhysX hurts the gaming market.

Let's look at your false premise.
PhysX runs on:

  • x86 CPUs (the lot)
  • Cell CPU (Sony)
  • PowerPC CPU (IBM)
  • ARM CPU (Android, iOS)
  • PPU
It can also benefit from the GPGPU power of NVIDIA GPUs by running via CUDA (post-G80), so:
  • NVIDIA GPU
PhysX, however, doesn't run under DirectCompute or OpenCL, so it cannot utilize the GPGPU power of AMD GPUs/APUs, as they do not run CUDA code.

  • AMD GPU
You're good at math, so why don't you calculate the proportion of supported hardware to unsupported hardware... and then call it "proprietary" with a straight face again.

But funny... now only GPUs matter in physics?
What happened to the CPU being "good enough"?

I love it when my predictions come true ^^
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
You're good at math, so why don't you calculate the proportion of supported hardware to unsupported hardware... and then call it "proprietary" with a straight face again.

I haven't read all of the posts in this little spat, but I think you're really just arguing over semantics at this point, and honestly... it's really annoying. I'm probably going to be sorry for interjecting, but I believe that no one is trying to state that PhysX is completely incapable of running on a wide variety of hardware. However, they are trying to state that the performance penalty incurred when producing some of the fancy PhysX effects on anything but an nVidia graphics card is high enough that it is not worth using.

I mentioned PhysX in Borderlands 2 earlier. So how about you just give that a try using your CPU instead of an nVidia graphics card, and let me know how that works out for ya! ;)
 

Dankk

Diamond Member
Jul 7, 2008
5,558
25
91
Let's look at your false premise.
PhysX runs on:

  • x86 CPUs (the lot)
  • Cell CPU (Sony)
  • PowerPC CPU (IBM)
  • ARM CPU (Android, iOS)
  • PPU
It can also benefit from the GPGPU power of NVIDIA GPUs by running via CUDA (post-G80), so:
  • NVIDIA GPU
PhysX, however, doesn't run under DirectCompute or OpenCL, so it cannot utilize the GPGPU power of AMD GPUs/APUs, as they do not run CUDA code.

  • AMD GPU
You're good at math, so why don't you calculate the proportion of supported hardware to unsupported hardware... and then call it "proprietary" with a straight face again.

Just because it runs on multiple devices doesn't mean it's not proprietary. PhysX is definitely still proprietary, which is why I suspect AMD doesn't like it. They would rather support open-source alternatives.

But funny... now only GPUs matter in physics?
What happened to the CPU being "good enough"?

The CPU would be good enough, if Nvidia actually bothered to implement it properly:

http://www.tomshardware.com/news/phyx-ageia-x87-sse-physics,10826.html

An excellent investigation by David Kanter at Real World Technologies found that Nvidia's PhysX software implementation for CPUs still uses x87 code, which was deprecated by Intel in 2005 and has since been fully replaced by SSE. Intel has supported SSE since 2000, and AMD implemented it in 2003.

The x87 code is slow, ugly, and remains supported on today's modern CPUs solely for legacy reasons. In short, there is no technical reason for Nvidia to continue running PhysX on CPUs using such terrible software when moving to SSE would speed things up considerably – unless that would make the GeForce GPGPU look less mighty compared to the CPU.
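
To make the scalar-vs-SIMD point concrete, here is a rough illustrative sketch (plain C++, not actual PhysX code): the same dot-product sum done one float at a time, which is roughly what an x87 build boils down to, versus four floats per instruction with SSE intrinsics.

```cpp
#include <xmmintrin.h>  // SSE intrinsics (supported on every x86 CPU for over a decade)
#include <cstdio>

// One multiply-add per iteration: the kind of scalar work an x87 build is stuck with.
float dot_scalar(const float* a, const float* b, int n) {
    float sum = 0.0f;
    for (int i = 0; i < n; ++i)
        sum += a[i] * b[i];
    return sum;
}

// Four multiply-adds per iteration using 128-bit SSE registers.
float dot_sse(const float* a, const float* b, int n) {
    __m128 acc = _mm_setzero_ps();
    for (int i = 0; i + 4 <= n; i += 4)
        acc = _mm_add_ps(acc, _mm_mul_ps(_mm_loadu_ps(a + i), _mm_loadu_ps(b + i)));
    float lanes[4];
    _mm_storeu_ps(lanes, acc);
    float sum = lanes[0] + lanes[1] + lanes[2] + lanes[3];
    for (int i = n & ~3; i < n; ++i)  // scalar tail for leftover elements
        sum += a[i] * b[i];
    return sum;
}

int main() {
    float a[8] = {1, 2, 3, 4, 5, 6, 7, 8}, b[8] = {1, 1, 1, 1, 1, 1, 1, 1};
    printf("%f %f\n", dot_scalar(a, b, 8), dot_sse(a, b, 8));  // both print 36.000000
    return 0;
}
```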

If PhysX wasn't so locked down, some community developer/modder would've probably fixed it a long time ago.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
PhysX, however, doesn't run under DirectCompute or OpenCL, so it cannot utilize the GPGPU power of AMD GPUs/APUs, as they do not run CUDA code.

Hopefully, when DirectCompute or OpenCL mature, or there is more competition, nVidia may rethink and port to compute or OpenCL. I look forward to that day, if it happens.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Just because it runs on multiple devices doesn't mean it's not proprietary. PhysX is definitely still proprietary, which is why I suspect AMD doesn't like it. They would rather support open-source alternatives.



The CPU would be good enough, if Nvidia actually bothered to implement it properly:

http://www.tomshardware.com/news/phyx-ageia-x87-sse-physics,10826.html

You might wanna Google that a bit more..... like the follow-up, when PhysX went SSE.

Or read here:
http://forums.anandtech.com/showthread.php?t=2088022

Are we suddenly back in 2010 again, trying to use ignorance as an argument against PhysX?

Color me baffled...but amused.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Hopefully, when DirectCompute or OpenCL mature, or there is more competition, nVidia may rethink and port to compute or OpenCL. I look forward to that day, if it happens.

NVIDIA couldn't care less about DirectCompute or OpenCL.
Hell, does their driver even report features in OpenCL correctly yet?
And wasn't OpenCL borked on the Titan review drivers too?

That is how little they care, and I can't blame them... with their CUDA ecosystem.
 

Greenlepricon

Senior member
Aug 1, 2012
468
0
0
Hopefully, when DirectCompute or OpenCL mature, or there is more competition, nVidia may rethink and port to compute or OpenCL. I look forward to that day, if it happens.

I suspect they'll use the features of OpenCL but will still push CUDA and PhysX. I mean, if it runs best on their cards, they're going to want to keep it as a selling point, no matter how inferior I suspect it may end up being.
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
NVIDIA couldn't care less about DirectCompute or OpenCL.
Hell, does their driver even report features in OpenCL correctly yet?
And wasn't OpenCL borked on the Titan review drivers too?

That is how little they care, and I can't blame them... with their CUDA ecosystem.

You really should lay off the green juice and put down the pom-poms, little Mr. Angry Forum Poster. Get over the fact that plenty of people don't like PhysX, and :gasp:... some of them own Nvidia products, like me. You're mad that people are looking forward to this TressFX hair thing and don't care for PhysX... get over it, brah :)
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
People are still quoting that "excellent investigation by David Kanter"


Guys, you should update your PhysX rant list because, you know, it's been a while since PhysX implemented both CPU multithreading and SSE.

Oh and props to AMD, no doubt. Even if it's hair only, it's good enough for me. Especially coming from someone as poor as AMD.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
PhysX for everyone by AMD - The Way It's Meant To Be...
Not the NV bullshit: you had an AMD card in your history, so now you have to buy our hardware every year for 100 years, and then we will let you use PhysX*.
*maybe
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
You really should lay off the green juice and put down the pom-poms, little Mr. Angry Forum Poster. Get over the fact that plenty of people don't like PhysX, and :gasp:... some of them own Nvidia products, like me. You're mad that people are looking forward to this TressFX hair thing and don't care for PhysX... get over it, brah :)

No, I find it funny that people suddenly think dynamic hair is tha shizzle... you take this awfully personally, don't you?
(takes another match in WoT)
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
PhysX for everyone by AMD - The Way It's Meant To Be...
Not the NV bullshit: you had an AMD card in your history, so now you have to buy our hardware every year for 100 years, and then we will let you use PhysX*.
*maybe

It's hardly a PhysX killer... it's hair only... Earth calling... hello?
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
No, I find it funny that people suddenly think dynamic hair is tha shizzle... you take this awfully personally, don't you?
(takes another match in WoT)

If laughing at your obsessive defense of anything Nvidia is personal, then yes, I suppose so :D
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
No, I find it funny that people suddenly think dynamic hair is tha shizzle... you take this awfully personally, don't you?
(takes another match in WoT)

I'm not sure why anyone would complain about it either; it's not something directly comparable to PhysX. On the other hand: it is free, it is available to all DX11 cards, and it is not exclusive to any one brand. Heck, if you want to use it on HD 4000 - you can. PhysX is kinda sorta neat, but very, very few games use it; the tempo of PhysX titles has been roughly one per year. Again, PhysX and TressFX aren't directly comparable because PhysX is going for something different. My line of thinking is that if PhysX were open, more than one title per 14 months would use it for GPU particle effects. Perhaps even consoles could use it. But as things stand, consoles will obviously never use PhysX GPU particle acceleration, and that means most PC games won't either (due to multi-platform game development). But hey, Nvidia can do whatever they want to do - it's totally fine with me, I'm not complaining.
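
To make "available to all DX11 cards" concrete, here's a minimal sketch of a DirectCompute dispatch (not TressFX itself; the shader is a trivial placeholder and most error handling and resource binding are omitted). The same calls work on any feature-level-11 adapter, whether it's a GeForce, a Radeon, or Intel HD 4000.

```cpp
#include <d3d11.h>
#include <d3dcompiler.h>
#include <cstdio>
#include <cstring>
#pragma comment(lib, "d3d11.lib")
#pragma comment(lib, "d3dcompiler.lib")

// Trivial compute shader: each thread writes its global index into a buffer.
// (UAV creation/binding is omitted to keep the sketch short.)
static const char* kCS = R"(
RWStructuredBuffer<uint> Out : register(u0);
[numthreads(64, 1, 1)]
void main(uint3 id : SV_DispatchThreadID) { Out[id.x] = id.x; }
)";

int main() {
    ID3D11Device* dev = nullptr;
    ID3D11DeviceContext* ctx = nullptr;
    D3D_FEATURE_LEVEL fl;
    // Any DX11-capable hardware adapter will do; no vendor-specific API is involved.
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION, &dev, &fl, &ctx)))
        return 1;

    ID3DBlob* blob = nullptr;
    if (FAILED(D3DCompile(kCS, strlen(kCS), nullptr, nullptr, nullptr,
                          "main", "cs_5_0", 0, 0, &blob, nullptr)))
        return 1;

    ID3D11ComputeShader* cs = nullptr;
    dev->CreateComputeShader(blob->GetBufferPointer(), blob->GetBufferSize(), nullptr, &cs);

    ctx->CSSetShader(cs, nullptr, 0);
    ctx->Dispatch(16, 1, 1);  // 16 groups x 64 threads; the identical call on every vendor
    printf("Dispatched compute on feature level 0x%x\n", (unsigned)fl);
    return 0;
}
```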

In short, both sides need to realize:

1) PhysX is not directly comparable.
2) TressFX is free for everyone, so why complain?

Carry on
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Where is RS with his minority percentages?
He's usually there with the math... not this time? ^^
 

Dravonic

Member
Feb 26, 2013
84
0
0
Let's look at your false premise.
PhysX runs on:

  • x86 CPUs (the lot)
  • Cell CPU (Sony)
  • PowerPC CPU (IBM)
  • ARM CPU (Android, iOS)
  • PPU
  • NVIDIA GPU
PhysX running on anything but CUDA is mostly indistinguishable from any other physics engine. I really don't get why you're bringing up the "PhysX is a general-purpose physics engine too" point, since we're talking about advanced hair physics, and you know damn well it's not happening on any of the other hardware.

The only problem with PhysX (and from here on I mean the CUDA kind), which very few around here seem to really get, is its severely limited audience.

The very nature of PhysX makes the effort of developing for it rather pointless, and thus developers are rarely willing to do much. Why? Developing for PhysX costs time and money and only matters to a small portion of your user base. Remember, games have to run on multiple platforms. A current-generation game simply cannot have any significant features dependent on PhysX, otherwise the experience is ruined for everyone on the PS3, the Xbox 360, and a rather large portion of PCs.

Of course, from time to time a developer will go to the extra effort of actually doing something significant with it, but given the visibly lackluster adoption of PhysX, I think you've realized by now that for developers it's imperative that a technology be versatile. It doesn't matter if PhysX is the best thing since sliced bread; you're not going to spend your time and money on it if the returns just aren't there.

The bottom line is, because its target audience is so small, it's rarely properly used and is therefore doomed to gimmicks.
 

Greenlepricon

Senior member
Aug 1, 2012
468
0
0
Where is RS with his minority percentages?
He's usually there with the math... not this time? ^^

Probably realized that this thread went to garbage really quickly. Sad, too, since TressFX is something worth talking about, unlike PhysX.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
x86 CPUs (the lot)
Cell CPU (Sony)
PowerPC CPU (IBM)
ARM CPU (Android, iOS)
PPU
NVIDIA GPU

Lonbjerg, you need to distinguish between PhysX as a general physics API and PhysX for GPU particle acceleration.

As a particle acceleration engine, only Nvidia cards can use it. As a general physics API (which VERY few games use; the last one I remember is Dragon Age: Origins from 2009), any video card on the planet can use it. And I think that's what you're referring to here: even AMD cards can use PhysX as a general collision-detection engine. You can simply try this out yourself if you have DA:O; it requires PhysX installed regardless of the host GPU brand.

PhysX in that sense can be used by any and every GPU on the planet, including Intel and AMD. PhysX in this area is competing with Havok and some other third-party physics/collision-detection engines, and is actually used very rarely. It is completely different from the acceleration effects found in games such as Batman: AC, which would require dedicated hardware.

But for GPU particle acceleration? That's locked down and only usable on nvidia GPUs.
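
For reference, the CPU-only path that any system can run looks roughly like this. This is a hedged sketch against the PhysX 3.x SDK (exact headers, init details, and library names vary by SDK version), simulating a box falling onto a plane entirely on the CPU, with no CUDA-capable GPU required:

```cpp
#include <PxPhysicsAPI.h>  // PhysX 3.x SDK; link the PhysX core + extensions libraries
using namespace physx;

static PxDefaultAllocator gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main() {
    // Foundation + SDK object: this path is pure CPU and ignores the GPU brand entirely.
    PxFoundation* foundation = PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics* physics = PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    // Scene driven by a CPU dispatcher (two worker threads here).
    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(2);
    sceneDesc.filterShader = PxDefaultSimulationFilterShader;
    PxScene* scene = physics->createScene(sceneDesc);

    // A ground plane and one dynamic box dropped from 10 units up.
    PxMaterial* mat = physics->createMaterial(0.5f, 0.5f, 0.1f);
    scene->addActor(*PxCreatePlane(*physics, PxPlane(0, 1, 0, 0), *mat));
    PxRigidDynamic* box = PxCreateDynamic(*physics, PxTransform(PxVec3(0, 10, 0)),
                                          PxBoxGeometry(1, 1, 1), *mat, 1.0f);
    scene->addActor(*box);

    // Step the simulation at 60 Hz for ~5 seconds, all on the CPU.
    for (int i = 0; i < 300; ++i) {
        scene->simulate(1.0f / 60.0f);
        scene->fetchResults(true);
    }
    return 0;
}
```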
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
It's hardly a PhysX killer... it's hair only... Earth calling... hello?

But there is nothing to kill...

Dravonic said:
A current-generation game simply cannot have any significant features dependent on PhysX, otherwise the experience is ruined for everyone on the PS3, the Xbox 360, and a rather large portion of PCs.
^
|
|
This!
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
I'm not sure why anyone would complain about it either; it's not something directly comparable to PhysX. On the other hand: it is free, it is available to all DX11 cards, and it is not exclusive to any one brand. Heck, if you want to use it on HD 4000 - you can. PhysX is kinda sorta neat, but very, very few games use it; the tempo of PhysX titles has been roughly one per year. Again, PhysX and TressFX aren't directly comparable because PhysX is going for something different. My line of thinking is that if PhysX were open, more than one title per 14 months would use it for GPU particle effects. Perhaps even consoles could use it. But as things stand, consoles will obviously never use PhysX GPU particle acceleration, and that means most PC games won't either (due to multi-platform game development). But hey, Nvidia can do whatever they want to do - it's totally fine with me, I'm not complaining.

In short, both sides need to realize:

1) PhysX is not directly comparable.
2) TressFX is free for everyone, so why complain?

Carry on

PhysX is free for everyone, too. If you want to play with it, activate it. It runs on every x86 CPU out there.