Let's play what if: AMD owned PhysX instead of NV


railven

Diamond Member
Mar 25, 2010
6,604
561
126
From what I've seen of how the tech AMD/ATI worked on tends to go, this is my honest opinion:

AMD would have developed an SDK for PhysX. They would shop it around to devs, who would then try it and get mediocre results. Then Microsoft would see it, pick it up, and it would become a feature in an upcoming DirectX build.

Pretty much anything ATI/AMD developed to stand out got adopted into DirectX. At least from the stuff I've seen.

ATI pushed tessellation back in the old days; it was ignored until it became a feature in DX11.
ATI pushed unified shaders, and MSFT ran with it for the Xbox 360, then adopted it into DX10.
ATI/AMD and their Mantle thingie-ma-bob, and bam, DX12.

Yeah, I don't think AMD/ATI would have made money off the acquisition, but I'd bet money we as gamers would benefit greatly from it. And it would go down in history as MSFT bringing us what we want, haha.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
From what I've seen of how the tech AMD/ATI worked on tends to go, this is my honest opinion:

AMD would have developed an SDK for PhysX. They would shop it around to devs, who would then try it and get mediocre results. Then Microsoft would see it, pick it up, and it would become a feature in an upcoming DirectX build.

Pretty much anything ATI/AMD developed to stand out got adopted into DirectX. At least from the stuff I've seen.

ATI pushed tessellation back in the old days; it was ignored until it became a feature in DX11.
ATI pushed unified shaders, and MSFT ran with it for the Xbox 360, then adopted it into DX10.
ATI/AMD and their Mantle thingie-ma-bob, and bam, DX12.

Yeah, I don't think AMD/ATI would have made money off the acquisition, but I'd bet money we as gamers would benefit greatly from it. And it would go down in history as MSFT bringing us what we want, haha.

It would pretty much end in nothing. Remember this?

physics.jpg

MechaRampagePhysics.jpg


And it goes all the way back to this:
atinphsbndlgmg_l1.jpg


Yet ZERO results, like with so many other projects.
 
Last edited:

el etro

Golden Member
Jul 21, 2013
1,584
14
81
PhysX was never GameWorks. It's proprietary and locked, but it doesn't hurt competitor performance, and it drives the industry forward.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
ATI pushed Tessellation back in the old days, ignored until a feature in DX11.



High order surfaces weren't ignored; ATI and nVidia offered N-patches and RT patches, respectively, with early DirectX iterations.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
High order surfaces weren't ignored; ATI and nVidia offered N-patches and RT patches, respectively, with early DirectX iterations.

Yet how many devs implemented it? When I say "ignored," I'm implying poorly adopted. And Nvidia really only threw that in because it was used in Quake (probably the biggest FPS on PC at that time).

The tech went largely unused and basically died until it became a bullet point in DX11 and hardware specs. Then we got unseen tessellated oceans (haha).
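For context, "a bullet point in DX11" boils down to two new programmable pipeline stages (hull and domain shaders) that every DX11 card has to run. A rough C++ sketch of how an app hooks them up, assuming an already-created device/context and compiled hull/domain shader blobs; the function name is just illustrative:

```cpp
// Rough sketch only: binding the DX11 tessellation stages.
// Assumes a live ID3D11Device/ID3D11DeviceContext and pre-compiled
// hull/domain shader bytecode; error handling omitted for brevity.
#include <d3d11.h>

void BindTessellationStages(ID3D11Device* device,
                            ID3D11DeviceContext* context,
                            const void* hsBytecode, SIZE_T hsSize,
                            const void* dsBytecode, SIZE_T dsSize)
{
    ID3D11HullShader*   hullShader   = nullptr;
    ID3D11DomainShader* domainShader = nullptr;

    // The hull shader decides per-patch tessellation factors; crank the
    // factors high enough and you get the infamous over-tessellated oceans.
    device->CreateHullShader(hsBytecode, hsSize, nullptr, &hullShader);
    device->CreateDomainShader(dsBytecode, dsSize, nullptr, &domainShader);

    context->HSSetShader(hullShader, nullptr, 0);
    context->DSSetShader(domainShader, nullptr, 0);

    // Geometry is now submitted as control-point patches, not plain
    // triangles; the fixed-function tessellator expands them between
    // the hull and domain stages.
    context->IASetPrimitiveTopology(
        D3D11_PRIMITIVE_TOPOLOGY_3_CONTROL_POINT_PATCHLIST);
}
```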
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
It would pretty much end in nothing. Remember this?

physics.jpg

MechaRampagePhysics.jpg


And it goes all the way back to this:
atinphsbndlgmg_l1.jpg


Yet ZERO results, like with so many other projects.

Difference here is, AMD/ATI didn't own that. The techs I listed AMD/ATI sort of "created."
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
It would pretty much end in nothing. Remember this?

physics.jpg

MechaRampagePhysics.jpg


And it goes all the way back to this:
atinphsbndlgmg_l1.jpg


Yet ZERO results, like with so many other projects.

I remember; I had a CrossFire 1900XTX platform and was looking forward to GPU physics flexibility. Instead of offering this, it was dead -- nVidia risked resources -- AMD basically talked and attacked.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
I remember; I had a CrossFire 1900XTX platform and was looking forward to GPU physics flexibility. Instead of offering this, it was dead -- nVidia risked resources -- AMD basically talked and attacked.

That's typical of AMD, they take the easiest and least risky path. Just look at FailSync, their 3D support, Mantle, their developer relations etc. Why people keep buying into AMD PR and purchasing their hardware is beyond me.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
That's typical of AMD, they take the easiest and least risky path. Just look at FailSync, their 3D support, Mantle, their developer relations etc. Why people keep buying into AMD PR and purchasing their hardware is beyond me.

Odd, I'm feeling the same way seeing the Witcher 3 results. For some odd reason my great card gets destroyed by a GTX 960.

Makes me wonder if Maxwell to Pascal will do something similar.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
Odd, I'm feeling the same way seeing the Witcher 3 results. For some odd reason my great card gets destroyed by a GTX 960. Makes me wonder if Maxwell to Pascal will do something similar.

This is the more important question for Nvidia users. Is Nvidia going to force product obsolescence of their previous gen by dropping performance optimizations for the latest games and thus trick consumers into upgrading more often? :thumbsup:
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
It would pretty much end in nothing. Remember this?

physics.jpg

MechaRampagePhysics.jpg

And it goes all the way back to this:
atinphsbndlgmg_l1.jpg

Yet ZERO results, like with so many other projects.

It happens. It was a nice idea, but they don't own Havok, or Bullet physics, or that other thing. It would have depended on those groups playing nice and the industry moving toward that standard. It was a good idea and is how it should end up.

I think they need their own physics API to really push it through.

PhysX is unimpressive. My guess is it would take a lot of changes to get it up to standard. AMD should have gotten Havok.
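For anyone wondering what a physics SDK actually hands a developer, here's a rough C++ sketch of a minimal Bullet rigid-body scene (just the stock world setup and one falling box); illustrative only, not anything AMD ever shipped:

```cpp
// Minimal Bullet sketch: one dynamic box dropped into an empty world.
#include <btBulletDynamicsCommon.h>
#include <cstdio>

int main() {
    // Boilerplate every Bullet app needs: collision config, dispatcher,
    // broadphase, constraint solver, and the world that ties them together.
    btDefaultCollisionConfiguration config;
    btCollisionDispatcher dispatcher(&config);
    btDbvtBroadphase broadphase;
    btSequentialImpulseConstraintSolver solver;
    btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &config);
    world.setGravity(btVector3(0, -9.81f, 0));

    // A dynamic 1 kg box starting 10 m above the origin.
    btBoxShape box(btVector3(0.5f, 0.5f, 0.5f));
    btVector3 inertia(0, 0, 0);
    box.calculateLocalInertia(1.0f, inertia);
    btDefaultMotionState motion(btTransform(btQuaternion::getIdentity(),
                                            btVector3(0, 10, 0)));
    btRigidBody body(btRigidBody::btRigidBodyConstructionInfo(
        1.0f, &motion, &box, inertia));
    world.addRigidBody(&body);

    // Step the simulation at 60 Hz for one second and watch it fall.
    for (int i = 0; i < 60; ++i) {
        world.stepSimulation(1.0f / 60.0f);
    }
    btTransform t;
    body.getMotionState()->getWorldTransform(t);
    std::printf("box height after 1s: %.2f m\n", t.getOrigin().getY());

    world.removeRigidBody(&body);
    return 0;
}
```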

[quote="5150Joker, post: 37410005"]That's typical of AMD, they take the easiest and least risky path. Just look at FailSync, their 3D support, Mantle, their developer relations etc. Why people keep buying into AMD PR and purchasing their hardware is beyond me.[/QUOTE]


Because we aren't buying based on PR, like Nvidia owners who are now getting crap performance from their $600-and-up cards.
 
Last edited:

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Odd, I'm feeling the same way seeing the Witcher 3 results. For some odd reason my great card gets destroyed by a GTX 960.

Makes me wonder if Maxwell to Pascal will do something similar.

Pascal and Maxwell should be relatively similar uarch-wise. But since Pascal will use HBM2 as a baseline, it only takes some developer/benchmark utilizing it before you see the same thing. Remember the settings used on these benches in the first place.

This is the more important question for Nvidia users. Is Nvidia going to force product obsolescence of their previous gen by dropping performance optimizations for the latest games and thus trick consumers into upgrading more often? :thumbsup:

GCN 1.0 is pretty much suffering the same fate now. While it benefits on some weak points thanks to the consoles, the other weak points that GCN 1.1 and above solved will only make it worse.

Add a good bunch of tessellation and you can throw out the GCN 1.0 cards, while GCN 1.1 hangs by a thin thread vs GCN 1.2. Or take color/pixel fillrate, for example.
 
Last edited:

Genx87

Lifer
Apr 8, 2002
41,091
513
126
It would pretty much end in nothing. Remember this?

physics.jpg

MechaRampagePhysics.jpg


And it goes all the way back to this:
atinphsbndlgmg_l1.jpg


Yet ZERO results, like with so many other projects.

I think this is a great example of an open source physics API that allows a hardware vendor to take the high road while it amounts to nothing for end users. If you look at the list of games that use Bullet, GPU PhysX has seen more market penetration. Never mind the CPU side of PhysX and Havok.

And this is also another example of AMD talking the talk, then not walking the walk.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,800
1,528
136
It would either be dead, after which hopefully another company would come fill the void, or an open standard.

Both are probably preferable to what we have now, but the future is not ours to see.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Odd, I'm feeling the same way seeing the Witcher 3 results. For some odd reason my great card gets destroyed by a GTX 960.

Makes me wonder if Maxwell to Pascal would so something similar.

It's a fair point to raise, and I'm not pleased with the performance as of late with Kepler -- I was grateful for DSR for older generations, but I'm utterly surprised by the questionable Kepler performance -- maybe someone will ask nVidia.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
It's a fair point to raise, and I'm not pleased with the performance as of late with Kepler -- I was grateful for DSR for older generations, but I'm utterly surprised by the questionable Kepler performance -- maybe someone will ask nVidia.

It seems more likely that the problem is the Kepler cards simply don't have the compute power of newer cards; benchmarks have shown a large disparity there. Nvidia more likely just didn't guess right on what would be important in today's games, which they rectified with Maxwell.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,786
789
136
Difference here is, AMD/ATI didn't own that. The techs I listed AMD/ATI sort of "created."

Nvidia didn't create PhysX; they bought it and further developed it. AMD talked about Bullet/Havok, etc., and did nothing. They should have bought Bullet and pushed it as the open alternative to Nvidia PhysX and Intel Havok.

Personally, I think AMD would have done nothing with PhysX, as they'd have overpaid, then had write-downs, and eventually sold it for peanuts, only for it to then get some support.
 

SimianR

Senior member
Mar 10, 2011
609
16
81
That's typical of AMD, they take the easiest and least risky path. Just look at FailSync, their 3D support, Mantle, their developer relations etc. Why people keep buying into AMD PR and purchasing their hardware is beyond me.

One could also question why one would buy into NVIDIA cards when, a year later, their $200 GPU (960) outperforms their $699 high-end GPU (780 Ti). As for FailSync? Really? I'm glad there is an alternative for people purchasing monitors who might want a similar feature without having to drop an extra $200 for NVIDIA's add-in chip.

GCN 1.0 chips might be suffering the same performance hit, but AMD wasn't selling them for $700 last year.
 
Last edited:

SimianR

Senior member
Mar 10, 2011
609
16
81
You're right actually, the 780 Ti still holds the lead - but not a convincing one considering its price. NVIDIA is making a tough case for their high-end parts: $300 more at launch, yet the R9 290 edges it out here.

2nvv7es.jpg


It probably wouldn't sting so much if they hadn't priced it at $700. Had it been closer to the R9 290X, there really wouldn't be any argument.
 

bowler484

Member
Jan 5, 2014
26
0
0
We don't really have to imagine.

Look at TressFX and what they did, or rather didn't do.

If AMD put as much effort into TressFX as they do into badmouthing GameWorks, we'd have some pretty good AMD-based features.