TressFX: A new frontier of realism in PC gaming

Page 6 - AnandTech Forums

sontin

Diamond Member
Sep 12, 2011
3,273
149
106

"This has roots in the core of Gaming Evolved, where we want to enable technology for all gamers."

Coming from a company that complained about tessellation in HAWX 2. :awe:

"...and not create proprietary features that lock out gamers that use our competitor's products."

So, locking out from a performance standpoint is alright, but not from an API side? BTW: everyone can use GPU-PhysX with an x86 processor... :hmm:
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Do people hate physx?

Definitely. The anti-Nvidia crowd here generally hates PhysX. They complain about it in every capacity, even though a simple click of the mouse can completely disable GPU-PhysX effects.

It's a neat idea, but the problem is that nothing uses it; it has no bearing on competition. Honestly, it's a joke because there aren't enough games using it - for the past three years there has been one title per year using it.

AMD finally has their very first game that will have hardware-accelerated physics effects. So while "nothing" uses PhysX by your definition, literally less than nothing uses AMD anything.

Anyway, it has been confirmed that TressFX relies only on good compute/GCN performance. It will work on all GPUs, although I'd imagine it would perform not so well on Kepler, since Kepler is obviously not strong in compute.

Is PhysX not "compute" related? Doesn't Kepler perform fine within that scope? So if Kepler performs compute-based PhysX well, shouldn't it theoretically perform TressFX well too? If not, then it stands to reason that AMD purposefully coded the effects to hinder performance on Nvidia GPUs. Weren't people crying about how Nvidia supposedly did the exact same thing with TWIMTBP games and tessellation? Would it not be entirely hypocritical if those same people don't cry foul again should TressFX turn out to perform like crap on Nvidia hardware?

Again, I think it's good that AMD is leveraging their hardware and brand. I never once complained about PhysX or TWIMTBP, and I applaud AMD for becoming the more aggressive hardware promoter recently. It makes me want to buy an AMD card in the future if they keep up their developer relations / bundles / game features.
 
Last edited:

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
I don't hate PhysX or nVidia, but I do hate the way they handle it. By now they should have PhysX in every game coming out. Even if they just allowed nVidia GPUs to do the PhysX while another card (AMD or nVidia) is the primary card in the rig, it would make PhysX a lot more viable. Right now we can't have PhysX in a meaningful way (to the point that it affects gameplay) because it's proprietary.
 

Midwayman

Diamond Member
Jan 28, 2000
5,723
325
126
The day everyone gets their crap together and comes out with a standard that is implemented across companies and actually makes it into games, I'm buying a card just to use for compute performance. I think all the demos etc. look great, but it means nothing if it is rarely supported.
 

Olikan

Platinum Member
Sep 23, 2011
2,023
275
126
So, locking out from a performance standpoint is alright, but not from an API side? BTW: everyone can use GPU-PhysX with an x86 processor... :hmm:

Mmm, yes? Because at the end of the day, it can still be playable... lower AA settings, use a lower resolution, do an overclock.

With a locked API, you can't do shit. :mad:

Based on screenshots, I wouldn't mind using no AA at all for non-plastic hair.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I don't think performance will be an issue with either Kepler or GCN AMD cards. In short, it doesn't appear that Kepler will be locked out from a performance perspective, based on posts from Eidos. Should be fine with both brands; we'll know more on the 5th, though.

As an aside, Tomb Raider looks amazing; I definitely plan on getting it. It's getting great reviews as well, not that it surprises me - Square Enix has had a great track record over the past two years with excellent PC games.

Definitely a game worth getting!
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
I don't hate PhysX or nVidia, but I do hate the way they handle it. By now they should have PhysX in every game coming out. Even if they just allowed nVidia GPUs to do the PhysX while another card (AMD or nVidia) is the primary card in the rig, it would make PhysX a lot more viable. Right now we can't have PhysX in a meaningful way (to the point that it affects gameplay) because it's proprietary.

Agreed entirely. I think Nvidia has botched their handling of PhysX. They should allow it in mixed-vendor setups, even without technical support. They could have labeled and sold cards with minimal amounts of VRAM and firmware designed to be used only as a PhysX add-on solution. Instead they let it sit relatively idle, with only a few titles here and there.

I thought the effects in Cryostasis, Mirror's Edge, Mafia II, Darkest of Days (one of my guilty-pleasure games despite how rough around the edges it is), and both Batman games were nice graphical additions, and it's a shame GPU-PhysX has seen such limited use.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,476
136
AMD has built this feature on Microsoft DirectCompute, so there is no locking out of the competitor; performance will depend on Nvidia's DirectCompute performance. This game will also give a good idea of Titan vs. GTX 680 compute performance. If Titan is significantly faster than the HD 7970 GHz and the GTX 680 is significantly slower than the HD 7970 GHz, then Nvidia users can't complain.
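For context on what "compute" work a hair system like this actually does: each strand is a chain of vertices that gets integrated and length-constrained every frame, one strand per GPU thread group. Below is a rough Python sketch of that per-strand update. To be clear, this is illustrative only - TressFX runs this kind of loop as a DirectCompute shader, and the specific Verlet-plus-length-constraints scheme, constants, and vertex count here are my own assumptions, not AMD's actual code:

```python
# Illustrative sketch: one hair strand as a chain of 2D points,
# advanced with Verlet integration plus segment-length constraints.
# TressFX-style systems run this kind of update per strand, in
# parallel, inside a GPU compute shader; all constants here
# (gravity, rest length, iteration count) are arbitrary choices.

GRAVITY = (0.0, -9.8)   # acceleration applied to every free vertex
REST_LEN = 1.0          # target length of each strand segment
DT = 1.0 / 60.0         # one 60 Hz frame

def step_strand(pos, prev, iterations=4):
    """One simulation step for a strand; pos[0] is pinned to the scalp."""
    # Verlet integration: next = 2*pos - prev + a*dt^2
    new = [pos[0]]  # root vertex stays fixed
    for i in range(1, len(pos)):
        x = 2 * pos[i][0] - prev[i][0] + GRAVITY[0] * DT * DT
        y = 2 * pos[i][1] - prev[i][1] + GRAVITY[1] * DT * DT
        new.append((x, y))
    # Iteratively enforce segment rest lengths so the strand can't stretch
    for _ in range(iterations):
        for i in range(1, len(new)):
            ax, ay = new[i - 1]
            bx, by = new[i]
            dx, dy = bx - ax, by - ay
            d = (dx * dx + dy * dy) ** 0.5 or 1e-9
            corr = (d - REST_LEN) / d
            if i == 1:
                # segment attached to the pinned root: move only the free end
                new[i] = (bx - dx * corr, by - dy * corr)
            else:
                # interior segment: split the correction between both ends
                new[i - 1] = (ax + dx * corr * 0.5, ay + dy * corr * 0.5)
                new[i] = (bx - dx * corr * 0.5, by - dy * corr * 0.5)
    return new, pos  # (new positions, positions to use as "previous" next step)

# A straight 4-vertex strand hanging from the origin, simulated for 2 seconds
strand = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (3.0, 0.0)]
pos, prev = strand, strand
for _ in range(120):
    pos, prev = step_strand(pos, prev)
```

The point of the sketch is why this maps so naturally onto DirectCompute: thousands of strands run this same small loop independently, which is exactly the kind of embarrassingly parallel workload compute shaders are built for - hence raghu78's point that performance comes down to each vendor's compute throughput.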
 

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
The day everyone gets their crap together and comes out with a standard that is implemented across companies and actually makes it into games, I'm buying a card just to use for compute performance. I think all the demos etc. look great, but it means nothing if it is rarely supported.

That day may be sooner than you think. AMD is powering both of the next-gen consoles with a 78xx variant GPU and both will support GPU compute capabilities.
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
I want this to be AMD exclusive, if for nothing else than so all the PhysX haters can say how awesome a feature this is, thus completing the hypocritical circle jerk.

I hope it's AMD exclusive so AMD can have an equal and competitive counter-punch to PhysX. If we want better competition between Nvidia and AMD in performance and price, then AMD needs to play the same games Nvidia does. So far in the past year AMD has outdone Nvidia on the game bundle and developer support fronts. They need to continue to aggressively push back.

Two wrongs don't make a right; there are better ways to compete.
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
Hm, TressFX sounds exactly like nVidia's hair demo, only without tessellation.

I guess it's only useful for one or two characters because you need to create all of the vertices on the CPU instead of the GPU.



If it runs badly on nVidia hardware then it will have the same fate as tessellation: you are losing 33-66% of the market because of the hardware.

Tessellation runs on both GPUs, so nothing is being lost.
 

Dankk

Diamond Member
Jul 7, 2008
5,558
25
91

Bit-tech confirmed it:

As we suspected, AMD's press release has been very carefully worded. 'TressFX is not exclusive to AMD,' a spokesperson for the company has told us. 'It works on any DirectX11 card, similar to some other AMD-built technologies - for example Order-Independent Transparency (OIT) or High Definition Ambient Occlusion (HDAO).' Thus is the truth revealed: any DirectX11-capable graphics hardware, including those from rival Nvidia, will be able to make use of AMD's hair-rendering know-how.

http://www.bit-tech.net/news/hardware/2013/02/26/amd-tressfx/1
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
I don't hate PhysX or nVidia, but I do hate the way they handle it. By now they should have PhysX in every game coming out. Even if they just allowed nVidia GPUs to do the PhysX while another card (AMD or nVidia) is the primary card in the rig, it would make PhysX a lot more viable. Right now we can't have PhysX in a meaningful way (to the point that it affects gameplay) because it's proprietary.

Proprietary itself is not the problem here; as you have said, they could have allowed it to run on an NV card with an AMD card in the same rig. NV would still hold all the cards.
Ageia physics was proprietary; allowing it to work alongside NV or AMD cards doesn't change that fact.
 
Last edited:

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
A feature does not have to start off being proprietary for awareness or momentum.

If the proprietary feature or ability garners enough awareness and momentum, hopefully standards are forged, like with stereo 3D in DirectX 11.1!
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
If the proprietary feature or ability garners enough awareness and momentum, hopefully standards are forged, like with stereo 3D in DirectX 11.1!

That ain't the point. The point is that proprietary does not have to come first; if proprietary comes first, it is purely out of self-interest and not for the sake of the platform.
Self-interest in business is a given, but there are many ways to approach it, and its goals can be broader than just self-interest.

NV's is purely self-interest.
 
Last edited:

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
That ain't the point. The point is that proprietary does not have to come first; if proprietary comes first, it is purely out of self-interest and not for the sake of the platform.

Sure it does. If there is no public API, then companies need a way to introduce things.

Without CUDA there would be no DirectCompute or OpenCL. nVidia invented the first real GPGPU architecture and API set.

3dfx did the same with Glide.

BTW: videos of the console version are very underwhelming. Low-poly environment and characters, and no real physics effects system.
Who cares about the hair when the rest of the game looks so outdated...
 

Dankk

Diamond Member
Jul 7, 2008
5,558
25
91
BTW: videos of the console version are very underwhelming. Low-poly environment and characters, and no real physics effects system.
Who cares about the hair when the rest of the game looks so outdated...

Well... yeah. Because you're looking at the console version. :confused: