BallaTheFeared
Diamond Member
No proof, it's highly unlikely since Havok from Intel was already demoed running on the PS4 GPU.
Currently, most features in the PhysX SDK run only on the CPU, regardless of platforms. Certain features, such as particle systems and clothing, can be accelerated on a CUDA-capable GPU. We will continue to study the feasibility of alternate implementations, and welcome any feedback from the developer community regarding the value of GPU-accelerated PhysX on all architectures.
Havok runs on everything. What are you talking about anyway? Havok runs on the 360, the PS3, smartphones, everything. Just like collision-detection PhysX does.
Understand that the PS3 and Xbox 360 also supported PhysX. Furthermore, AMD also supports this type of PhysX. Surprised, you say?
Read the press release carefully, with the understanding that there are two different types of PhysX. One requires dedicated hardware, the other does not. One is merely a physics engine with collision detection, and the other does CUDA PhysX GPU effects.
The PS4, just like the Xbox 360 and the PS3, does not have dedicated hardware and will only support collision-detection PhysX.
So, about TressFX...
I do know this, but if you read the whole statement, you'll see he implies PhysX could come to AMD hardware; it's just that AMD is too cheap to pay for it.
Unless I have interpreted it incorrectly?
AMD has a history of expecting others to develop for them; OpenCL vs CUDA is a prime example of this. Nvidia created a programming language and hardware to support it, while AMD created hardware and expected people to create software to support it.
Perhaps cheap was the wrong word. I believe AMD could have PhysX support on their GPUs if they were willing to pay Nvidia for their software, their development, and support of it. However, historically AMD does not do that. Perhaps things have changed; we're seeing AMD go headlong into dev relations these days!
The general theme on forums is that Nvidia is an evil corporation who hates gamers because they won't give AMD a free ride on their money sink.
Nope. Reality is, AMD is smart enough not to let themselves be held hostage by PhysX. If they license it, there is nothing to stop Nvidia from making sure it performs better on their own hardware. So AMD would be helping PhysX adoption, but in game reviews and benches they would be shown to be slower. This would be about the most idiotic thing AMD could possibly do. AMD is the one that is not willing to give Nvidia a free ride on their coin, and rightly so; it's just smart business. I would expect Nvidia to do the very same thing if the roles were reversed. It's a small miracle that AMD did not lock TressFX out on Nvidia cards, actually.
As for TressFX (you know, the subject of this thread), I am really impressed with what they've done in Tomb Raider. It is definitely not perfect, but damn, at times it looks incredible. The game itself is brilliant, best I've played in a long time. The atmosphere, the graphics, and especially Lara; you actually start feeling sorry for what she has to go through. The flowing hair just adds to that emotional attachment. I hope they continue to develop this type of tech and we see it in more games.
AMD has taken the initiative to help put accelerated physics
GTFO Newton 😉
Good one.
Objectivity is something fanatics lack.
(...) the GTX 680's abysmal performance really took me by surprise in TressFX where the HD 7970 outperformed GTX 680 by 34%! (...)
20 vs 52 FPS is obviously a 34% difference... :awe:
Yeah... this guy is obviously doing something very, very wrong. Just to check, as I had been using FXAA and overclocked cards previously, I left my Lightnings at default clocks and switched to 4x SSAA with everything else set to highest and TressFX on to duplicate his settings, and this is my result. A far cry from the 20 fps joke posted on that site, unless of course you think SLI gets over 400% scaling!

First of all - source:
http://www.tbreak.com/features/tomb-raiders-tressfx-performance-amd-vs-nvidia
Second of all:
20 vs 52 FPS is obviously a 34% difference... :awe:
I am quite sure the difference is not as dramatic (granted, I don't use SSAA, just FXAA) - the person must have done something wrong (and can't really count).
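For the record, here's the arithmetic behind the sarcasm above. This is just a quick sanity check plugging in the two FPS figures quoted from the article (52 FPS for the HD 7970, 20 FPS for the GTX 680; the "34%" is the article's claim):

```python
# FPS figures as quoted from the tbreak article
hd7970_fps = 52.0
gtx680_fps = 20.0

# Relative advantage of the HD 7970 over the GTX 680
gap_pct = (hd7970_fps - gtx680_fps) / gtx680_fps * 100
print(f"HD 7970 is {gap_pct:.0f}% faster than the GTX 680")  # 160%, not 34%
```

So by those numbers the 7970 would be 160% faster, not 34%; the article's own figures and its headline percentage can't both be right.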
Just wait for new drivers, it's what I'm doing. They will close the gap.
Actually that wasn't my point 😉