TressFX: A new frontier of realism in PC gaming

Page 15 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
What a false premise.
I am having fun with the fact that because suddenly something will run on AMD GPUs... some people's stance towards "fluff-effects" changes... try and keep up, okay ^^
You don't seem like you're having fun; in fact, you come across as bitter and miserable. :sneaky:

And nothing wrong with eye candy, it adds a lot to a game if done correctly.
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
What a false premise.
I am having fun with the fact that because suddenly something will run on AMD GPUs... some people's stance towards "fluff-effects" changes... try and keep up, okay ^^

The "something" that didn't run on AMD wasn't AMD's fault, was it? The owner decided to make it proprietary. This is open to everyone, even Nvidia.
My stance never changed. I used Nvidia's feature when I was running my GTX 280 and I liked it, but as others have said, it wasn't enough; it needs to be adopted by the industry on a larger scale. Your sad bias and immaturity really show, and they only drive people away from your beloved "green religion".
 

GlacierFreeze

Golden Member
May 23, 2005
1,125
1
0
He regularly derails and thread-craps. Just report him. No need to argue with him.

He's part of the reason there's a 60% vote in favor of more moderation.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
What a false premise.
I am having fun with the fact that because suddenly something will run on AMD GPUs... some people's stance towards "fluff-effects" changes... try and keep up, okay ^^

Again, it's something that was released that can be used by the owners of ANY DX11 video card that meets the minimum specs. It isn't restricted to the hardware of any one company. Therefore it benefits the entire gaming community.

Try and keep it real, okay?
 
Last edited:

Granseth

Senior member
May 6, 2009
258
0
71
I think better hair in games is a good thing, at least if it doesn't tax the GPU too much.

Let's hope they continue so that clothes become a second layer on the characters, not just some texture and triangles stuck to the character. I really hate it when armor stretches as the characters move. My guess is that this can be accomplished with compute shaders as well.

And let's not confuse this with physics; they are not the same, but they can coexist. And please let the Nvidia vs AMD war rest. Most people are not on any team; we are just consumers and hate these useless wars that ruin threads.
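[Editor's note: the "armor shouldn't stretch" point above is exactly what constraint-based cloth solvers address. The toy sketch below is a hypothetical illustration of the idea, not code from any shipping engine: each cloth edge is pulled back toward its rest length every frame, the same kind of independent per-element work a compute shader would run in parallel. All names here (`project_edge`, `rest_len`) are made up for the example.]

```python
# Hypothetical sketch of one cloth distance constraint:
# two free vertices connected by an edge are moved so the
# edge returns to its rest length, splitting the correction
# evenly between them. Running this over every edge each
# frame is what keeps simulated armor from stretching.

def project_edge(a, b, rest_len):
    """Return corrected copies of vertices a and b ([x, y] lists)
    so that the distance between them equals rest_len."""
    dx, dy = b[0] - a[0], b[1] - a[1]
    d = (dx * dx + dy * dy) ** 0.5 or 1e-9  # avoid divide-by-zero
    corr = (d - rest_len) / d / 2.0         # half the error per vertex
    return ([a[0] + dx * corr, a[1] + dy * corr],
            [b[0] - dx * corr, b[1] - dy * corr])

# An edge stretched to twice its rest length snaps back:
a, b = [0.0, 0.0], [2.0, 0.0]
a, b = project_edge(a, b, rest_len=1.0)
# midpoint is preserved; new length equals the rest length
```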
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
I think dynamic cloth is all about physics.

I still think one of the most clever implementations of physics (the best use of the available computational power) to add to the immersion of a game is Batman:
http://www.youtube.com/watch?v=thCWFXVCH3A

But we are far from fully dynamically clothed, fully physiological body/ragdoll-simulated characters in a dynamic, modifiable/destructible world simulated in a game engine.

I wish we were there already though! :biggrin:
 
Last edited:


Granseth

Senior member
May 6, 2009
258
0
71
I think dynamic cloth is all about physics.
It could be about physics, but I don't need it to be physics; I just need it to look realistic, and if the computing power needed to deceive me is less than a more accurate version requires, that wouldn't matter to me. It would be more important to have true physics where it affects gameplay, not where it's just eye candy.
 

Granseth

Senior member
May 6, 2009
258
0
71
I still think one of the most clever implementations of physics (the best use of the available computational power) to add to the immersion of a game is Batman:
http://www.youtube.com/watch?v=thCWFXVCH3A

But we are far from fully dynamically clothed, fully physiological body/ragdoll-simulated characters in a dynamic, modifiable/destructible world simulated in a game engine.

I wish we were there already though! :biggrin:
The cape behaves the same with PhysX off. And it isn't colliding with anyone but Batman. But it looks mostly good, except that it is affected by being in a computer game, behaving unnaturally when the player turns the character unnaturally fast.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
The cape behaves the same with PhysX off. And it isn't colliding with anyone but Batman. But it looks mostly good, except that it is affected by being in a computer game, behaving unnaturally when the player turns the character unnaturally fast.

PhysX is never off...:sneaky:

Sorry, this bit seems to be a paradox?

...behaving unnaturally when the player turns the character unnaturally fast.
 
Last edited:

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Overall, I still enjoy threads like this because the discussion goes beyond just frame rates and tries to create awareness and discussion of improved gaming experiences. To see an ability or feature and go, "Wow! That really looks neat!" My favorite topics!
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
I still think one of the most clever implementations of physics (the best use of the available computational power) to add to the immersion of a game is Batman:
http://www.youtube.com/watch?v=thCWFXVCH3A

But we are far from fully dynamically clothed, fully physiological body/ragdoll-simulated characters in a dynamic, modifiable/destructible world simulated in a game engine.

I wish we were there already though! :biggrin:

Red Faction has a far more impressive implementation of physics.
 

DiogoDX

Senior member
Oct 11, 2012
757
336
136
Again, I can find no evidence that the hair in Alice: Madness Returns is made using PhysX. This three-page blog post mentions nothing about hair. The only reason I can think of for them not claiming credit for it is that they had nothing to do with it.

http://www.geforce.com/whats-new/articles/physx-in-alice-madness-returns

PhysX Low - basic CPU physics, similar for PC and consoles. Interesting note: physically simulated clothing and hair for Alice are not part of hardware PhysX content, and are not affected by PhysX settings (moreover, it is not even using PhysX engine for simulation).
http://physxinfo.com/news/5883/gpu-physx-in-alice-madness-returns/#
 

Granseth

Senior member
May 6, 2009
258
0
71
PhysX is never off...:sneaky:

Sorry, this bit seems to be a paradox?

OK, but as DiogoDX posted for Alice, it's the same for Batman. It uses Nvidia's APEX clothing solver (it can be used for hair too). It's embedded into PhysX, but it's a standalone solution for cloth.
http://physxinfo.com/wiki/APEX_Clothing

And I think it's much the same as TressFX, other than that it runs on the CPU (or GPU, but I think only on Nvidia GPUs) instead of compute shaders. ( http://youtu.be/UY_kTMFpQ4E )
 
Last edited:

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
OK, but as DiogoDX posted for Alice, it's the same for Batman. It uses Nvidia's APEX clothing solver (it can be used for hair too). It's embedded into PhysX, but it's a standalone solution for cloth.
http://physxinfo.com/wiki/APEX_Clothing

And I think it's much the same as TressFX, other than that it runs on the CPU (or GPU, but I think only on Nvidia GPUs) instead of compute shaders. ( http://youtu.be/UY_kTMFpQ4E )

So dynamic hair in games is nothing new...gotcha ^^
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
So dynamic hair in games is nothing new...gotcha ^^
It really doesn't matter who did what first if nothing spectacular is done with it. That's a pseudo-argument of deflection at best.

Again, I think it will be interesting to see what is done with this technology and how much adoption it gains.
 

Granseth

Senior member
May 6, 2009
258
0
71
I'm really looking forward to TressFX and hoping that it brings some evolution to hair simulation (and hopefully other things). It looks like Nvidia's APEX is moving forward as well, judging by the Unreal tech demos, and since the next-gen consoles are around the corner, it and AMD's solution might see the light of day in a lot of games.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
I think a lot of people are caught up in the whole "it's just hair" thing. We all know it's been done before. The thing to get excited about is the fact that it's using DirectCompute and it's available to all GPU owners.
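[Editor's note: the DirectCompute angle is easy to picture with a toy version of the idea. The sketch below is a hypothetical simplification, not TressFX's actual code: each hair strand is a chain of points integrated with Verlet steps and then snapped back to fixed segment lengths. Because every strand can be solved independently, the workload maps naturally onto compute shader threads. All names (`step_strand`, `SEG_LEN`) are invented for the example.]

```python
# Hypothetical per-strand hair step: Verlet integration under
# gravity, followed by length constraints so the strand never
# stretches. One GPU thread group per strand is the usual mapping.

GRAVITY = -9.8
DT = 1.0 / 60.0
SEG_LEN = 0.1  # rest length between adjacent strand points

def step_strand(pos, prev):
    """Advance one strand by one frame. pos/prev are lists of
    [x, y] points; pos[0] is the root, pinned to the scalp.
    Returns (new_positions, new_previous)."""
    new = [p[:] for p in pos]
    # Verlet integration: x' = 2x - x_prev + a * dt^2 (root skipped)
    for i in range(1, len(pos)):
        new[i][0] = 2 * pos[i][0] - prev[i][0]
        new[i][1] = 2 * pos[i][1] - prev[i][1] + GRAVITY * DT * DT
    # Enforce segment lengths from the root outward
    for i in range(1, len(new)):
        dx = new[i][0] - new[i - 1][0]
        dy = new[i][1] - new[i - 1][1]
        d = (dx * dx + dy * dy) ** 0.5 or 1e-9
        s = SEG_LEN / d
        new[i][0] = new[i - 1][0] + dx * s
        new[i][1] = new[i - 1][1] + dy * s
    return new, pos

# One strand of five points hanging from the origin:
pos = [[0.0, -i * SEG_LEN] for i in range(5)]
prev = [p[:] for p in pos]
for _ in range(10):
    pos, prev = step_strand(pos, prev)
```

After any number of steps the root stays pinned and every segment keeps its rest length, which is the property that makes simulated hair look like hair rather than rubber.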
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
What has me excited is AMD's investment in developer relations!

Even more exciting is that AMD getting its act together with developers will put far more pressure on Nvidia to do the same. So yes, it's just hair, but if it's a great solution, then they can move on to the next thing to add to games. Who knows what it will be next?

Using compute/GPU power to run AI and get some smart enemies in games? Physics running on the GPU that is more than just eye candy and has an actual effect on gameplay?

I'm looking forward to what comes next.
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
It really doesn't matter who did what first if nothing spectacular is done with it. That's a pseudo-argument of deflection at best.

Again, I think it will be interesting to see what is done with this technology and how much adoption it gains.

Indeed, there was plenty of excitement for PhysX when it was first brought to light, but the excitement has died through lack of progress for how long it's been out, so it's someone else's turn.

It should have moved forward after CellFactor with real physics instead of backwards.
 
Last edited:

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
Even more exciting is that AMD getting its act together with developers will put far more pressure on Nvidia to do the same. So yes, it's just hair, but if it's a great solution, then they can move on to the next thing to add to games. Who knows what it will be next?

Using compute/GPU power to run AI and get some smart enemies in games? Physics running on the GPU that is more than just eye candy and has an actual effect on gameplay?

I'm looking forward to what comes next.

Here's hoping.