TressFX: A New Frontier of Realism in PC Gaming

Page 13

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
It's also set to use PhysX; stop the misinformation:
http://physxinfo.com/news/10531/nvi...tner-for-physics-middleware-on-playstation-4/

http://www.scei.co.jp/ps4_tm/index.html

Havok couldn't kill PhysX on x86 CPUs...now it will kill PhysX on GPUs...

LOL
LOL
&
LOL

I never said that PhysX could never be on the PS4. I have clearly stated on about 10 different occasions now that Nvidia needs to open the doors on PhysX and make it run on all hardware or face extinction. That article from a highly suspect-looking website (as in some random garbage blog with no fact checking) seems to imply that Nvidia may indeed do just that and save PhysX from becoming completely unused.

I have a feeling I could make a post stating that Nvidia is the greatest and that AMD sucks and you will still come in and reply, "No, You're wrong NVIDIA IS THE GREATEST AND AMD SUCKS, STOP SPREADING MISINFORMATION."
 

BoFox

Senior member
May 10, 2008
689
0
0
Honestly, the amount of ignorance in this thread is mind boggling. Half the Nvidia cheerleaders don't even realize that DirectCompute is part of the DX11 API and not something owned by AMD. They are blaming AMD for gimping Nvidia's performance with DirectCompute. What a bunch of smart individuals.

Don't worry about it - everybody knows very well. Anybody who "jests" around with you is just fooling around. :biggrin:
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
PhysX Low - basic CPU physics, similar for PC and consoles. Interesting note: physically simulated clothing and hair for Alice are not part of hardware PhysX content, and are not affected by PhysX settings (moreover, it is not even using PhysX engine for simulation).
See more at: http://physxinfo.com/news/5883/gpu-physx-in-alice-madness-returns/#sthash.goWJ6mAu.dpuf

The hair in Alice moves better than in a lot of other games, and that's what really counts; how it's done I couldn't care less. I have yet to see TressFX in motion.
 
Last edited:

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Any game with PhysX looks hollow and incomplete without it on imo.

I like PhysX more than Havok, however I'd rather see Havok than nothing.

Hair is low on my list of effects, more interested in destructible environments and world interaction.

The hair looks pixelated.
 

SPBHM

Diamond Member
Sep 12, 2012
5,066
418
126
after reading about this, watching the funny Conan O'Brien "review" of the game on the Xbox, and noticing how greatly that hair could be improved, I'm quite curious to see some videos of "TressFX"...

as for the whole GPU PhysX thing, it would be great if Nvidia could release a DirectCompute or OpenCL version, even if it was significantly slower (as long as it's decent enough, and a lot faster than running on the CPU)
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
Any game with PhysX looks hollow and incomplete without it on imo.

I like PhysX more than Havok, however I'd rather see Havok than nothing.

Hair is low on my list of effects, more interested in destructible environments and world interaction.

The hair looks pixelated.

Physics is about physical behavioural interaction, weight and momentum; you don't need physics to put more gfx and effects on screen, that's called graphics.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Physics is about physical behavioural interaction, weight and momentum; you don't need physics to put more gfx and effects on screen, that's called graphics.

I'm aware, however static graphics just don't do it for me. A stream in the world is just an image, unless I can use some gravity warp field and watch it spin around and splash out.
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
I'm aware, however static graphics just don't do it for me. A stream in the world is just an image, unless I can use some gravity warp field and watch it spin around and splash out.

Yes, but besides CellFactor, PhysX has not impressed me at all in games. The simulations in game don't look realistic to me; water always moves like viscous oil in games [Cryostasis], which is probably why it's used more for gooey fluids and more as a gfx level setting switch. I have seen some good, realistic-looking water PhysX demos, but not in game, likely because they can't be bothered to tweak the parameters to make it so.
The fog in BAA moves like Batman has fans on his boots when he walks through it...etc.

I can understand people would rather have that fog than none at all, but then you don't need PhysX for the fog to exist in that case; the PhysX should be there to make the fog that does exist move more realistically.
 
Last edited:

BoFox

Senior member
May 10, 2008
689
0
0
See more at: http://physxinfo.com/news/5883/gpu-physx-in-alice-madness-returns/#sthash.goWJ6mAu.dpuf

The hair in Alice moves better than in a lot of other games, and that's what really counts; how it's done I couldn't care less. I have yet to see TressFX in motion.

Well, I watched the video... hair was done nearly as nicely in 3DMark03, which was, what, about 8 years before Alice? And Alice's PhysX, being so poorly optimized, slowed even 2x GTX 560 Ti down to a ~20fps stutter. I read on in that link, and the optimizations show much better performance with nearly as good physics.

Moral of the story: We didn't need proprietary NV GPU PhysX for swaying hair about 10 years ago.

Superseding moral of the story: If AMD and NV collaborated on an open-source physics API, it would have boosted both companies' success a good deal by making games more attractive and complex, encouraging gamers to spend a little less on the less important CPU and a bit more on the more important GPU doing more physics. It would have also boosted PC gaming further, and AMD and NV could have had greater sales thanks to this. Intel did the smart thing by snatching Havok and trying to keep the CPU as important as possible. There won't be as much progress until physics becomes standardized within DX, the standard API for PC games.

Same goes for Tessellation. It could have been done with this flying head monster:

[Image: doom-3-wallpaper-.jpg - Doom 3's flying head monster]


Gosh, why did that flying head have to look like an octagonal stop sign that I see everyday on the road?!?

A few years after UT used TruForm, Doom 3 could have made way better use of tessellation to get rid of the stupid polygons. Even mild tessellation costing only 20% of GPU performance back then would have taken care of it real nicely for its time. So nicely, in fact, that id Software would've made more revenue from Doom 3 copies (selling more, plus people buying them earlier, at higher prices). The PC gaming industry would have had more of an uproar with its advantage over console versions. Carmack stabbed PC users in the back, giving consoles the priority when it came to how many polygons would be used for monsters, etc. $.

We had to suffer polygonal boobs for a decade because of selfish corporate reasons.

Sometimes, collaborating together brings greater $ for both sides, than otherwise.
 
Last edited:

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
Here's what I'm talking about, tool. This is an example of your depth. Are you waiting for someone to also call you out for being a liar, so you can grandstand?

I assume you think I was talking about you. Why would you be under that assumption?
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
Well, I watched the video... hair was done nearly as nicely in 3DMark03, which was, what, about 8 years before Alice? And Alice's PhysX, being so poorly optimized, slowed even 2x GTX 560 Ti down to a ~20fps stutter. I read on in that link, and the optimizations show much better performance with nearly as good physics.

Moral of the story: We didn't need proprietary NV GPU PhysX for swaying hair about 10 years ago.

Superseding moral of the story: If AMD and NV collaborated on an open-source physics API, it would have boosted both companies' success a good deal by making games more attractive and complex, encouraging gamers to spend a little less on the less important CPU and a bit more on the more important GPU doing more physics. It would have also boosted PC gaming further, and AMD and NV could have had greater sales thanks to this. Intel did the smart thing by snatching Havok and trying to keep the CPU as important as possible. There won't be as much progress until physics becomes standardized within DX, the standard API for PC games.

Same goes for Tessellation. It could have been done with this flying head monster:

[Image: doom-3-wallpaper-.jpg - Doom 3's flying head monster]


Gosh, why did that flying head have to look like an octagonal stop sign that I see everyday on the road?!?

A few years after UT used TruForm, Doom 3 could have made way better use of tessellation to get rid of the stupid polygons. Even mild tessellation costing only 20% of GPU performance back then would have taken care of it real nicely for its time. So nicely, in fact, that id Software would've made more revenue from Doom 3 copies (selling more, plus people buying them earlier, at higher prices). The PC gaming industry would have had more of an uproar with its advantage over console versions. Carmack stabbed PC users in the back, giving consoles the priority when it came to how many polygons would be used for monsters, etc. $.

We had to suffer polygonal boobs for a decade because of selfish corporate reasons.

Sometimes, collaborating together brings greater $ for both sides, than otherwise.

Indeed.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
We had to suffer polygonal boobs for a decade because of selfish corporate reasons.

Do you mind if I quote you on this in my sig?

P.S. 3DMark03 didn't have much in the way of moving hair. Those were short strands of clumpy hair that barely moved.
 
Last edited:

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Physics is about physical behavioural interaction, weight and momentum; you don't need physics to put more gfx and effects on screen, that's called graphics.

Everything you see is physics...or low-level physical simulation...it's called a fact.
 

Ibra

Member
Oct 17, 2012
184
0
0
Superseding moral of the story: If AMD and NV collaborated on an open-source physics API, it would have boosted both AMD and NV's success a good deal, by making games more attractive and complex, encouraging gamers to spend a little less on the less important CPU and a bit more on the more important GPU doing more physics. It would have also boosted PC gaming further. AMD and NV could have had greater sales thanks to this. Intel did the smart thing by snatching Havok, and trying to keep the CPU as important as possible. There was not as much constructiveness until it becomes standardized within DX, the standard API for PC games.

Nvidia: Clothing, Destruction, Particles.
AMD : Hair, Shadows, Light effects.
But AMD's ego won't allow it.
 

NTMBK

Lifer
Nov 14, 2011
10,448
5,831
136
How is this different from the PhysX hair we had years ago?

It sounds like it will run on all DX11 cards, so hopefully it should actually see widespread adoption in games. Making these things common across all brands should improve the quality of gaming visuals for us all- even those who use nVidia cards.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
It sounds like it will run on all DX11 cards, so hopefully it should actually see widespread adoption in games. Making these things common across all brands should improve the quality of gaming visuals for us all- even those who use nVidia cards.

You are in for a surprise:
Finally, hair styles are simulated by gradually pulling the strands back towards their original shape after they have moved in response to an external force. Graphics cards featuring the Graphics Core Next architecture, like select AMD Radeon™ HD 7000 Series, are particularly well-equipped to handle these types of tasks, with their combination of fast on-chip shared memory and massive processing throughput on the order of trillions of operations per second.
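The shape recovery described in that quote is, at its core, a per-vertex spring-back constraint: integrate the external force, then pull each vertex a fraction of the way back toward its rest-pose position every frame. Here is a minimal sketch of that idea; all names and constants are illustrative guesses, not TressFX code, and the real thing runs per-strand in a DirectCompute shader rather than in Python:

```python
# Sketch of a "pull strands back toward their original shape" constraint.
# STIFFNESS and DAMPING are hypothetical tuning values, not AMD's.

STIFFNESS = 0.15   # fraction of the offset recovered each frame
DAMPING = 0.9      # velocity damping factor

def step(positions, velocities, rest_positions, force, dt=0.016):
    """Advance one frame: apply the external force, integrate,
    then pull each vertex back toward its rest-pose position."""
    new_pos, new_vel = [], []
    for p, v, r in zip(positions, velocities, rest_positions):
        v = DAMPING * v + force * dt      # accumulate external force (e.g. wind)
        p = p + v * dt                    # integrate position
        p = p + STIFFNESS * (r - p)       # pull back toward the rest shape
        new_pos.append(p)
        new_vel.append(v)
    return new_pos, new_vel

# One strand with rest positions along an axis, pushed by constant "wind":
# it deflects, but the shape constraint keeps the deflection bounded.
rest = [float(i) for i in range(4)]
pos, vel = list(rest), [0.0] * 4
for _ in range(100):
    pos, vel = step(pos, vel, rest, force=1.0)
```

With STIFFNESS set to 0 the strand drifts away indefinitely under the force, which is the difference between simulated hair that holds a style and hair that merely gets blown around.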

Wanna bet it only runs at acceptable levels on high-end HD 7000 cards?

Runs for all...LOL
 

NTMBK

Lifer
Nov 14, 2011
10,448
5,831
136
You are in for a surprise:


Wanna bet it only runs at acceptable levels on high-end HD 7000 cards?

Runs for all...LOL

You're making performance estimates based on marketing spiel? Because that only ever ends well. ;)

Wait for benchmarks. If it winds up crippling performance on NVidia cards then yeah, that's less useful. But we can't predict that yet.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
You're making performance estimates based on marketing spiel? Because that only ever ends well. ;)

Wait for benchmarks. If it winds up crippling performance on NVidia cards then yeah, that's less useful. But we can't predict that yet.

No, I actually have more than a weak notion of the computational requirements involved...but suddenly physical computation is free now...LOL...this is getting too funny.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Man TressFX has Lonbjerg worked into a frenzy.


No, the time when I was all excited about hair was 2010; newer things excite me now, but welcome to 3 years ago.

Anything more irrelevant to add about old tech being heralded as new...just because it runs on AMD GPUs?

It seems the red posters have an issue with me (due to my arguments)...since they avoid my arguments and only deliver ad hominems...must really hurt.
 

NTMBK

Lifer
Nov 14, 2011
10,448
5,831
136
No, I actually have more than a weak notion of the computational requirements involved...but suddenly physical computation is free now...LOL...this is getting too funny.

As you pointed out, it was running on nVidia cards 3 years ago. I doubt the costs will be that ridiculous.
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
No, the time when I was all excited about hair was 2010; newer things excite me now, but welcome to 3 years ago.

Anything more irrelevant to add about old tech being heralded as new...just because it runs on AMD GPUs?

It seems the red posters have an issue with me (due to my arguments)...since they avoid my arguments and only deliver ad hominems...must really hurt.

Don't fret, I'm sure she'll run on your GeForce just fine
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
As you pointed out, it was running on nVidia cards 3 years ago. I doubt the costs will be that ridiculous.

It was running as a tech demo 3 years ago, yes...but let's wait for the benches to tell the story...and for the "foot in mouth syndrome" to strike the red team again when it comes to GPGPU physics ;)