TressFX: A new frontier of realism in PC gaming


Ibra

Member
Oct 17, 2012
184
0
0
Even more exciting is that AMD getting its act together with developers will put far more pressure on Nvidia to do the same. So yes, it's just hair, but if it's a great solution then they can move on to the next thing to add to games. Who knows what it will be next?

Using compute/GPU power to run AI and get some smart enemies in games? Physics running on the GPU that are more than just eye candy and have an actual effect on gameplay?

I'm looking forward to what comes next.

http://m.techradar.com/news/gaming/...ng/gaming-ai-to-move-to-graphics-cards-505554

But it won't come from AMD, since they are trying to kill everything that is GPU-related. Maybe Nvidia will add AI into the PhysX SDK and AMD fanboys will cry that it's another gimmick.

Indeed, there was plenty of excitement for PhysX when it was first brought to light, but the excitement has died through lack of progress given how long it's been out, so it's someone else's turn.

And AMD introduced one program for hair. Lack of progress?

MassFX, which is based on the PhysX SDK, has replaced the old Havok Reactor physics engine:
http://physxinfo.com/wiki/MassFX

I wonder if you can use TressFX as a plugin for 3ds Max or Maya. If not = pure fail.
 

Dravonic

Member
Feb 26, 2013
84
0
0
But it won't come from AMD, since they are trying to kill everything that is GPU-related.

That's from 2007... but let's have a look at it anyway:

GPU Physics dead for now [...] GPU Physics may be delayed till DirectX 11
Sounds about right. AMD is finally bringing in GPU Physics with DirectCompute in DirectX 11.

What were you trying to say again?
 

Dravonic

Member
Feb 26, 2013
84
0
0
You lost me. So Far Cry 2 has physics and Far Cry 3 doesn't. That's relevant why? If you mean developers are increasingly half-assing things they shouldn't, I completely agree.

And why should I be crying because The Witcher 3 is using PhysX?
 

BoFox

Senior member
May 10, 2008
689
0
0
Do you mind if I quote you on this in my sig?

P.S. 3DMark03 didn't have much in the way of moving hair. Those were short strands of clumpy hair that barely moved.
LOL, no problem!

Grstanford wanted me to tell you that the video he uploaded on YouTube was terrible in quality, and that he uploaded it at: https://************/dl/196385281/66aa181/TrollsLairHigher.mp4.html

I'd rather just run 3DMark03 than watch a low-res video anyway. Sure, 10 years of GPU advances make things a lot more powerful, but the "short strands of clumpy hair" weren't too bad! It was pretty good, actually, man!!!
 

BoFox

Senior member
May 10, 2008
689
0
0
Aren't beards facial pubes? That's what the girls tell me anyway :\

ARGH, that's what my wife tells me too! At least my red goatee matches her hair color! :awe:

PhysX Low - basic CPU physics, similar for PC and consoles. Interesting note: the physically simulated clothing and hair for Alice are not part of the hardware PhysX content and are not affected by PhysX settings (moreover, they are not even simulated with the PhysX engine).
Ah, the impressions fooled most of us into thinking that it was GPU PhysX that did the hair... when it was NOT the case!! :ninja:
If it were PhysX, then most cards would have stuttered much worse whenever Alice moved around in the game (which is pretty much all the time)! LOL!!! I kid!

GPU PhysX almost has "microstuttering" of its own - more like hitching/jittering on single-GPU cards.
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
Hmm, I wonder if this feature will make other games look odd to me. I was watching a BioShock Infinite video, and it only took a minute or two of looking at Elizabeth to fixate on that pinecone-blob-esque ponytail.

I am forever ruined. :(
 

BoFox

Senior member
May 10, 2008
689
0
0
Since when? :hmm:
Since PhysX kept bringing my high-end NV cards to their knees in the past, in Mirror's Edge, Batman: AA, UT3, Warmonger, etc.! Just overblown PhysX calculations begging for a dedicated PhysX card. ^_^

BTW, I wonder if ANYBODY out there ever tried to measure microstutter with a dedicated PhysX card? That would be interesting...
 

psolord

Platinum Member
Sep 16, 2009
2,142
1,265
136
nVidia is utterly stupid, and I am saying this as an nVidia fan, not a hater.

They should have made a 1x or 4x PCIe card with a super-duper dedicated physics chip, instead of that GPGPU crap. At 28nm, priced at $150, they could work marvels.

It's been years since they bought Ageia and nothing good came out of it. They never even provided support for the hardware-requiring levels of CellFactor. I bet that is because the GeForces would crap out.

One more chip before the final result would introduce another level of latency, all right, but PhysX produces ridiculous results and a performance hit as it is anyway.

The dedicated card would be available for all PC users, running AMD or Nvidia graphics cards, so it would be a win-win for Nvidia. I don't know what these guys are thinking sometimes, I swear to God.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Since PhysX kept bringing my high-end NV cards to their knees in the past, in Mirror's Edge, Batman: AA, UT3, Warmonger, etc.! Just overblown PhysX calculations begging for a dedicated PhysX card. ^_^

BTW, I wonder if ANYBODY out there ever tried to measure microstutter with a dedicated PhysX card? That would be interesting...

You are about 7 years late; Google the GRAW PP driver update...

This is interesting: Industrial Light & Magic...praising CUDA over OpenGL...this is what AMD has to fight...the NVIDIA ecosystem...and hair in a single game...won't cut it ^^
http://www.youtube.com/watch?v=PQef_6gio14&feature=share&list=PL55C1A52A917B2DDF
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
nVidia is utterly stupid, and I am saying this as an nVidia fan, not a hater.

They should have made a 1x or 4x PCIe card with a super-duper dedicated physics chip, instead of that GPGPU crap. At 28nm, priced at $150, they could work marvels.

It's been years since they bought Ageia and nothing good came out of it. They never even provided support for the hardware-requiring levels of CellFactor. I bet that is because the GeForces would crap out.

One more chip before the final result would introduce another level of latency, all right, but PhysX produces ridiculous results and a performance hit as it is anyway.

The dedicated card would be available for all PC users, running AMD or Nvidia graphics cards, so it would be a win-win for Nvidia. I don't know what these guys are thinking sometimes, I swear to God.

External PPUs like the Ageia add too much latency.
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81

psolord

Platinum Member
Sep 16, 2009
2,142
1,265
136
External PPUs like the Ageia add too much latency.

The P1 was a PCI card. The PCI bus ran at 33 MHz; PCIe runs at 100 MHz. I bet latency has gone down.

Having seen what the magnet gun does in software in Red Faction: Armageddon, I bet a piece of dedicated hardware could do better within a 16.66 ms window, so we could have 60 fps as well.
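For reference, the 16.66 ms figure is just the per-frame time budget at 60 fps:

$$t_{\text{frame}} = \frac{1000\ \text{ms}}{60\ \text{frames}} \approx 16.66\ \text{ms}$$

Any dedicated physics hardware would have to hand its results back within that window every frame to hold 60 fps.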

Now that we've reached 2013, I personally find it unacceptable that we still cannot demolish everything in our games, with reasonable granularity that is.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Sorry, but this topic is specifically about what AMD/NV are giving to gamers; this is not a movie-makers' forum or topic.

Listen, just because you are hurt in your hind over PhysX...doesn't mean you can ignore the bigger picture.

QUADRO, TESLA, CUDA, APEX, PhysX...it's all connected...if you cannot see this, I am sorry for you.

The people at ILM like CUDA...not OpenCL...not DirectCompute...the pros like CUDA = CUDA (and stuff running on CUDA such as APEX, PhysX, etc.) won't go anywhere...as the choice is made by developers, not forum posters ;)
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
Man, three days later and Lonbjerg is still raging over this AMD feature...

Hardly anyone cares about GPU PhysX; stop derailing the thread. Just wait for the single game per year that uses it to come out in 2013 and let loose in a forum thread about said game with all this pent-up angst.
 

parvadomus

Senior member
Dec 11, 2012
685
14
81
Listen, just because you are hurt in your hind over PhysX...doesn't mean you can ignore the bigger picture.

QUADRO, TESLA, CUDA, APEX, PhysX...it's all connected...if you cannot see this, I am sorry for you.

The people at ILM like CUDA...not OpenCL...not DirectCompute...the pros like CUDA = CUDA (and stuff running on CUDA such as APEX, PhysX, etc.) won't go anywhere...as the choice is made by developers, not forum posters ;)

That's what happens nowadays, but it's slowly changing.
I give CUDA and all this proprietary crap 3 more years (especially PhysX). This and other stuff will die or be ported to OpenCL, C++ AMP, etc.
The same always happens to this kind of stuff when an open standard emerges. Remember Glide?
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
That's what happens nowadays, but it's slowly changing.
I give CUDA and all this proprietary crap 3 more years (especially PhysX). This and other stuff will die or be ported to OpenCL, C++ AMP, etc.
The same always happens to this kind of stuff when an open standard emerges. Remember Glide?

You mean like OpenGL vs DirectX?
 

HurleyBird

Platinum Member
Apr 22, 2003
2,812
1,550
136
You mean like OpenGL vs DirectX?

No, he means vs. Glide. You're being intentionally daft. There's an obvious difference between a proprietary standard for an OS with near-100% share that is ubiquitous in the marketplace, and a proprietary standard from a hardware vendor that is behind two other companies in market share.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Imho,

To me, it's really not about proprietary versus open; it's about the resource benefit.

Proprietary can offer innovative choice and abilities when there are no standards, or if standards are limited or not as mature.

Ideally, of course, standards are welcome, but if a company desires to take risks and invest resources in creating a new technology sector -- the market will decide.

If a company desires to go beyond standards and invest resources -- the market will decide.

I believe nVidia can't afford to wait for others to innovate -- because if they don't innovate, they may perish -- an amazing amount of pressure.

Without the proprietary CUDA, nVidia would be in a world of hurt.