
TressFX: A new frontier of realism in PC gaming

So you need PhysX just for collision detection? And NV makes an announcement just for this?

Talk about damage control.
 
No proof, and it's highly unlikely, since Intel's Havok was already demoed running on the PS4 GPU.

Havok runs on everything. What are you talking about, anyway? Havok runs on the 360, PS3, smartphones, everything. Just like collision-detection PhysX does.

If you want to imagine a scenario where 1) Sony pays royalties to Nvidia and 2) AMD supports CUDA... with both of these being required for GPU PhysX, then, well, I don't know what to say. Neither of those would ever happen. Even the most optimistic person would admit it's a 100% impossibility.
 
w1zard sent an inquiry to Nvidia; their response:
Currently, most features in the PhysX SDK run only on the CPU, regardless of platforms. Certain features, such as particle systems and clothing, can be accelerated on a CUDA-capable GPU. We will continue to study the feasibility of alternate implementations, and welcome any feedback from the developer community regarding the value of GPU-accelerated PhysX on all architectures.
 
Understand that the PS3 and Xbox 360 also supported PhysX. Furthermore, AMD also supports this type of PhysX. Surprised, you say?

Read the press release carefully, with the understanding that there are two different types of PhysX. One requires dedicated hardware; the other does not. One is merely a physics engine with collision detection, and the other does CUDA-accelerated GPU PhysX effects.

The PS4, just like the Xbox 360 and the PS3, does not have dedicated hardware and will only support collision-detection PhysX.

I do know this, but if you read the whole statement, you'll see it implies PhysX could come to AMD hardware; it's just that AMD is too cheap to pay for it.

Unless I have interpreted it incorrectly?
 
So, about TressFX...

It looks nice; I would say it's a step in the right direction... However, there are points where it looks like it's just there to be "showy," and other times it doesn't seem quite right. Like when she is upside down... why are her bangs still in place?

I for one would like to see how it progresses.
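
On the bangs question, by the way: as I understand it, TressFX-style systems simulate each strand as a chain of vertices, with the root vertex pinned to the scalp and constraints that try to preserve the hairstyle's shape, which is why hair near the scalp barely moves even when she's upside down. A toy CPU sketch of that idea (all names and numbers here are illustrative, not AMD's actual code, which runs in DirectCompute shaders):

```python
import numpy as np

# One hair strand as a chain of vertices: Verlet integration plus
# distance constraints, with the root pinned to the "scalp".

N = 16                 # vertices per strand
REST_LEN = 0.05        # rest distance between neighboring vertices
GRAVITY = np.array([0.0, -9.8])
DT = 1.0 / 60.0

# Strand hanging straight down from the scalp at the origin.
pos = np.stack([np.zeros(N), -REST_LEN * np.arange(N)], axis=1)
prev = pos.copy()

def step(pos, prev, root):
    """One simulation step: integrate, then enforce constraints."""
    # Verlet integration: velocity is implicit in (pos - prev).
    new = pos + (pos - prev) * 0.99 + GRAVITY * DT * DT
    prev = pos.copy()
    pos = new
    # The root is pinned -- this is why hair near the scalp (bangs)
    # barely moves no matter how the head is oriented.
    pos[0] = root
    # Distance constraints keep each segment at its rest length,
    # propagating the pin's influence down the strand.
    for _ in range(4):                  # a few relaxation iterations
        for i in range(N - 1):
            d = pos[i + 1] - pos[i]
            dist = np.linalg.norm(d)
            if dist > 1e-9:
                corr = (dist - REST_LEN) / dist * d
                if i == 0:
                    pos[i + 1] -= corr  # root immovable: move child only
                else:
                    pos[i] += 0.5 * corr
                    pos[i + 1] -= 0.5 * corr
    return pos, prev

root = np.array([0.0, 0.0])
for frame in range(120):                # 2 seconds at 60 steps/s
    pos, prev = step(pos, prev, root)
print("tip position after 2s:", pos[-1])
```

Only the free end of the strand swings; the constrained vertices near the pinned root stay put, which matches what the game shows.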
 

AMD has a history of expecting others to develop for them; OpenCL vs. CUDA is a prime example of this. Nvidia created a programming language and the hardware to support it, while AMD created hardware and expected people to create software to support it.

Perhaps cheap was the wrong word. I believe AMD could have PhysX support on their GPUs if they were willing to pay Nvidia for the software, its development, and support of it. However, historically AMD does not do that. Then again, perhaps things have changed; we're seeing AMD go headlong into dev relations these days!

The general theme on forums is that Nvidia is an evil corporation who hates gamers because they won't give AMD a free ride on their money sink.
 

Okay! Well, I can see why you feel that could be the case.
 

May I ask how in the world AMD developing a counter-PhysX thingie would have been any better for PC gamers? Just more fragmentation.
 
Nope. The reality is, AMD is smart enough not to let themselves be held hostage by PhysX. If they licensed it, there would be nothing to stop Nvidia from making sure it performs better on their own hardware. So AMD would be helping PhysX adoption, but in game reviews and benchmarks they would be shown to be slower. That would be about the most idiotic thing AMD could possibly do. AMD is the one not willing to give Nvidia a free ride on their coin, and rightly so; it's just smart business, and I would expect Nvidia to do the very same thing if the roles were reversed. It's a small miracle, actually, that AMD did not lock TressFX out on Nvidia cards.

As for TressFX (you know, the subject of this thread), I am really impressed with what they've done in Tomb Raider. It is definitely not perfect, but damn, at times it looks incredible. The game itself is brilliant, the best I've played in a long time. The atmosphere, the graphics, and especially Lara: you actually start feeling sorry for what she has to go through. The flowing hair just adds to that emotional attachment. I hope they continue to develop this type of tech and we see it in more games.
 
They were expecting Havok to do it for them, until Intel bought it.


Right: doom and gloom, Nvidia is evil, and AMD is the savior of the common gamer... You've made it quite clear you have no desire to be objective and that you wear your support of AMD on your sleeve.
 
And now AMD has taken the initiative to help put accelerated physics effects into a game, and it runs on ALL hardware. Yet some are pissing and moaning and screaming, "but PhysX" 😎
 
AMD could create their own physics middleware like Havok and PhysX and use the DirectCompute or OpenCL API.

They have a relationship with the Bullet Physics middleware, whose creator, Erwin Coumans, works for AMD.

To me, it's all about resources and where to place them.
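
To make that concrete: here's roughly what vendor-agnostic GPU compute looks like in practice. A minimal sketch using Python with pyopencl (the kernel and all numbers are illustrative, not from any actual middleware); the same kernel source runs on AMD, Nvidia, or Intel OpenCL devices alike:

```python
import numpy as np
import pyopencl as cl

# A trivial "physics" kernel in OpenCL C: apply gravity, integrate.
# No CUDA required -- any vendor's OpenCL driver can build and run it.
KERNEL = """
__kernel void integrate(__global float *pos_y,
                        __global float *vel_y,
                        const float dt)
{
    int i = get_global_id(0);
    vel_y[i] += -9.8f * dt;     /* gravity */
    pos_y[i] += vel_y[i] * dt;  /* semi-implicit Euler step */
}
"""

ctx = cl.create_some_context()   # picks whatever OpenCL device is available
queue = cl.CommandQueue(ctx)
prg = cl.Program(ctx, KERNEL).build()

n = 100000                       # 100k "particles"
pos = np.zeros(n, dtype=np.float32)
vel = np.zeros(n, dtype=np.float32)
mf = cl.mem_flags
pos_buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=pos)
vel_buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=vel)

for _ in range(60):              # one second at 60 steps/s
    prg.integrate(queue, (n,), None, pos_buf, vel_buf, np.float32(1.0 / 60.0))

cl.enqueue_copy(queue, pos, pos_buf)
print("fell:", pos[0], "units")  # roughly -0.5 * 9.8 * 1^2, plus Euler error
```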
 

I agree! Really enjoying this game.
 
"AMD has taken the initiative to help put accelerated physics"

GTFO Newton 😉


[Image: tressvsgel.jpg]

Future's so bright you gotta wear shades, right? 😎

 
Good one.

Objectivity is something fanatics lack.

First of all - source:

http://www.tbreak.com/features/tomb-raiders-tressfx-performance-amd-vs-nvidia

Second of all:

(...) the GTX 680's abysmal performance really took me by surprise in TressFX where the HD 7970 outperformed GTX 680 by 34%! (...)

20 vs 52 FPS is obviously a 34% difference... :awe:

I am quite sure the difference is not as dramatic (granted, I don't use SSAA, just FXAA); the person must have done something wrong (and can't really count).
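
For anyone who wants to check the arithmetic, using the article's own quoted figures:

```python
# The article's own numbers: HD 7970 at 52 FPS vs. GTX 680 at 20 FPS.
hd7970, gtx680 = 52, 20

lead = (hd7970 - gtx680) / gtx680 * 100
print(f"7970 leads by {lead:.0f}%")   # prints 160%, not the claimed 34%

# A 34% gap would need very different numbers:
print(f"680 at {52 / 1.34:.1f} FPS, or")   # ~38.8 FPS for the 680
print(f"7970 at {20 * 1.34:.1f} FPS")      # ~26.8 FPS for the 7970
```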

Just wait for new drivers; that's what I'm doing. They will close the gap.
 
TressFX is great; it just needs more optimization. I bet we'll see it in future games made by Square Enix/Crystal Dynamics, and I like how AMD is going like a train helping them improve performance and making it possible to add more unique effects...

TressFX is a huge leap in realism where hair is concerned 😀
 
Yeah... this guy is obviously doing something very, very wrong. Just to check, since I had been using FXAA and overclocked cards previously, I left my Lightnings at default clocks and switched to 4x SSAA with everything else set to highest and TressFX on, to duplicate his settings, and this is my result. A far cry from the 20 fps joke posted on that site, unless of course you think SLI gets over 400% scaling!

Of course, I expect this post of real-world user data will be ignored by certain people who only see what they want to see, so they can continue to champion the cause of whichever company they proclaim loyalty to. I said earlier that the benchmarks lead me to believe certain system configurations are experiencing problems; a driver update and/or game patch should clear this up, as my particular system isn't lacking speed. I have been experiencing some occasional crashing, though.

[Screenshot: 680LightningSLI.jpg]
 
Actually that wasn't my point 😉

Go check how things worked out for Dragon Age 2. Abysmal performance on nVidia cards initially... then new drivers and... hey hey... no more problems!! Amazing, I know.

Sit tight. Wait for new drivers. Got my point?
 