blackened23
Diamond Member
- Jul 26, 2011
- 8,548
- 2
- 0
They need to do something, NVIDIA has been pushing on all fronts for years...so it's nice to see them finally wake up ^^
Even more exciting is that AMD getting its act together with developers will put far more pressure on NVIDIA to do the same. So yes, it's just hair, but if it's a great solution then they can move on to the next thing to add to games. Who knows what it will be next?
Using compute/gpu power to run AI and get some smart enemies in games? Physics that are more than just eye candy running on the GPU that have an actual effect on gameplay?
I'm looking forward to what comes next.
Indeed, there was plenty of excitement for PhysX when it was first brought to light, but the excitement has died off given the lack of progress for how long it's been out, so it's someone else's turn.
MassFX, which is based on the PhysX SDK, has replaced the old Havok Reactor physics engine
http://physxinfo.com/wiki/MassFX
And AMD introduced one program for hair. Lack of progress?
But it won't come from AMD, since they are trying to kill everything that is GPU related.
Sounds about right. AMD is finally bringing in GPU Physics with DirectCompute in DirectX 11.
GPU Physics dead for now [...] GPU Physics may be delayed till DirectX 11
That's from 2007... but let's have a look at it either way:
Sounds about right. AMD is finally bringing in GPU Physics with DirectCompute in DirectX 11.
What were you trying to say again?
LOL, no problem!
Do you mind if I quote you on this in my sig?
P.S. 3dmark03 didn't have much in the way of moving hair. Those were short strands of clumpy hair that barely moved.
Aren't beards facial pubes? That's what the girls tell me anyways :\
Ah, the impressions fooled most of us, thinking that it was GPU PhysX that did the hair.. when it was NOT the case!! :ninja:
PhysX Low - basic CPU physics, similar for PC and consoles. Interesting note: physically simulated clothing and hair for Alice are not part of hardware PhysX content, and are not affected by PhysX settings (moreover, it is not even using the PhysX engine for simulation).
GPU PhysX almost has "microstuttering" of its own - more like hitching/jittering on single-GPU cards.
Since PhysX kept on bringing my high-end NV cards to their knees in the past, in Mirror's Edge, Batman:AA, UT3, Warmonger, etc.! Just over-blown PhysX calculations begging for a dedicated PhysX card. ^_^
Since when? :hmm:
Since PhysX kept on bringing my high-end NV cards to their knees in the past, in Mirror's Edge, Batman:AA, UT3, Warmonger, etc.! Just over-blown PhysX calculations begging for a dedicated PhysX card. ^_^
BTW, I wonder if ANYBODY out there ever tried to measure microstutter with a dedicated PhysX card? That would be interesting..
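It wouldn't be hard to check yourself once you have per-frame times, e.g. from a FRAPS frametimes dump. A minimal sketch of the idea (the CSV layout and filename are assumptions, not any tool's official format):

```python
# Sketch: quantify "microstutter" as frame-to-frame time variation.
# Assumes a FRAPS-style frametimes CSV with cumulative milliseconds in
# the second column (Frame, Time (ms)); the filename is hypothetical.
import csv

def frame_deltas(path):
    """Return per-frame durations (ms) from cumulative frame times."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        times = [float(row[1]) for row in reader]
    # duration of frame i = cumulative time of i+1 minus that of i
    return [b - a for a, b in zip(times, times[1:])]

def stutter_stats(deltas):
    """Mean frame time, and mean absolute change between consecutive
    frames (higher = more visible hitching for the same average fps)."""
    mean = sum(deltas) / len(deltas)
    jitter = sum(abs(b - a) for a, b in zip(deltas, deltas[1:])) / (len(deltas) - 1)
    return mean, jitter

# Usage idea: record one run with and one without the dedicated card,
# then compare the jitter numbers:
# mean, jitter = stutter_stats(frame_deltas("frametimes.csv"))
# print(f"avg frame time {mean:.2f} ms, frame-to-frame jitter {jitter:.2f} ms")
```

Two runs with identical average fps can still feel very different if one has much higher jitter, which is exactly the single-GPU hitching being described.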
NVIDIA is utterly stupid, and I am saying this as an NVIDIA fan, not a hater.
They should have made a 1x or 4x PCIe card with a super duper dedicated physics chip, instead of that GPGPU crap. At 28nm, priced at $150, they could work marvels.
It's been years since they bought Ageia and nothing good came out of it. They never even provided support for the hardware-requiring levels of CellFactor. I bet that is because the GeForces would crap out.
One more chip before the final result would introduce another level of latency, alright, but PhysX produces ridiculous results and a ridiculous performance hit as it is anyway.
The dedicated card would be available for all PC users, running AMD or NVIDIA graphics cards, so it would be a win-win for NVIDIA. I don't know what these guys are thinking sometimes, I swear to God.
External PPUs like the Ageia add too much latency.
You are about 7 years late, Google for GRAW PP driver update...
This is interesting: Industrial Light & Magic...praising CUDA over OpenGL...this is what AMD has to fight...the NVIDIA ecosystem...and hair in a single game...won't cut it ^^
http://www.youtube.com/watch?v=PQef_6gio14&feature=share&list=PL55C1A52A917B2DDF
External PPUs like the Ageia add too much latency.
Sorry, but this topic is specifically about what AMD/NV is giving to gamers; this is not a movie makers' forum or topic.
Listen, just because you are hurt in your hind over PhysX...doesn't mean you can ignore the bigger picture.
QUADRO, TESLA, CUDA, APEX, PhysX...it's all connected...if you cannot see this, I am sorry for you.
The people at ILM like CUDA...not OpenCL...not DirectCompute...the pros like CUDA = CUDA (and stuff running on CUDA such as APEX, PhysX etc.) won't go anywhere...as the choice is made by developers, not forum posters!
That's what happens nowadays, but it's slowly changing.
I give CUDA and all this proprietary crap 3 more years (especially PhysX). This and other stuff will die or be ported to OpenCL, C++ AMP, etc..
The same always happens to this kind of stuff when an open standard emerges. Remember Glide?
You mean like OpenGL vs DirectX?