Nvidia offers PhysX support to AMD / ATI

BFG10K

Lifer
Aug 14, 2000
22,709
2,972
126
Yes, this is all well and good, but ATi needs to implement a back-end for it on their hardware, and I can't see them doing that to support a competitor's tech.

This is what I meant in the other thread when I said a DirectX implementation would be better for the industry and for the consumer. With a DirectX implementation, any vendor that wanted to remain relevant would be forced to implement it. This would then standardize it for both developers and gamers.
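
To make the idea concrete, a standard front-end with vendor back-ends would look something like the hypothetical C++ sketch below. None of these names are real DirectX APIs - they're invented purely to illustrate the split between a common interface and each IHV's implementation:

Code:
// Hypothetical "DirectPhysics"-style interface - invented names, not a real API.
// Games would code against the standard interface; each vendor (Nvidia, ATi,
// Intel) would ship its own hardware-accelerated back-end, just as they ship
// Direct3D drivers today.
#include <memory>

struct Vec3 { float x, y, z; };

class IPhysicsDevice {
public:
    virtual ~IPhysicsDevice() = default;
    virtual int  CreateRigidBody(const Vec3& pos, float mass) = 0;
    virtual void ApplyForce(int body, const Vec3& force) = 0;
    virtual void Simulate(float dt) = 0;  // executed on the vendor's hardware
};

// Resolved by the runtime to whatever back-end the installed driver provides.
std::unique_ptr<IPhysicsDevice> CreatePhysicsDevice();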
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: BFG10K
Yes, this is all well and good, but ATi needs to implement a back-end for it on their hardware, and I can't see them doing that to support a competitor's tech.

I just did not want to derail the Havok thread. There was discussion over there about PhysX not being available on all hardware. The only hardware it does not run on is ATI's, and it's most likely their fault. So if any blame is to be placed, it would be on them.

The problem with a DX implementation is that there are no physics libraries for it. That would take a lot of time and money to establish and I have yet to hear that MS is going to do this.

PhysX currently runs on virtually everything, including the Wii. Using DX would limit physics to MS products, which seems like going backwards from what we have now.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,972
126
Originally posted by: Wreckage

I just did not want to derail the Havok thread. There was discussion over there about PhysX not being available on all hardware. The only hardware it does not run on is ATI's, and it's most likely their fault. So if any blame is to be placed, it would be on them.
Yes certainly, I didn't mean to imply I have issues with this thread, because I don't. :)

Using DX would limit physics to MS products, which seems like going backwards from what we have now.
That's one way of looking at it. The other is to look at where PC gaming would be without Direct3D and OpenGL. You'd go back to the days of Glide, where almost every vendor had their own proprietary API and was trying to sway developers to use it.

Once Direct3D and OpenGL became viable APIs that nonsense stopped and PC gaming started to move forward.

PhysX may be available on consoles, but unless ATi & Intel implement it, it'll splinter PC gaming. A DirectX implementation will stop this nonsense as it'll force everyone who wants to be a player to get on board. It'll also foster competition between the IHVs because they'll try to come up with a solution that works better than others'.
 

Dadofamunky

Platinum Member
Jan 4, 2005
2,184
0
0
If NVidia really wanted to do something, they'd license it to Microsoft for DX11 or DX12.
 

Creig

Diamond Member
Oct 9, 1999
5,171
13
81
Originally posted by: Wreckage
I just did not want to derail the Havok thread. There was discussion over there about PhysX not being available on all hardware. The only hardware it does not run on is ATI's, and it's most likely their fault. So if any blame is to be placed, it would be on them.

Nvidia offering PhysX to AMD was simply a publicity stunt. Nvidia knew full well AMD would never accept it. There is no way Nvidia would ever allow PhysX to run as well (or better) on AMD hardware as on their own. So why should AMD even bother?

The whole thing was a joke.
 

Creig

Diamond Member
Oct 9, 1999
5,171
13
81
Originally posted by: BFG10K
PhysX may be available on consoles, but unless ATi & Intel implement it, it'll splinter PC gaming. A DirectX implementation will stop this nonsense as it'll force everyone who wants to be a player to get on board. It'll also foster competition between the IHVs because they'll try to come up with a solution that works better than others'.

Exactly. It would be preferable to see someone in control who wouldn't have the urge to artificially limit the performance of one company's hardware to make another's look good. Having Microsoft in control would force each company simply to try to make its own solution the best, which would be to the end user's benefit.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Nvidia offering PhysX to AMD was simply a publicity stunt. Nvidia knew full well AMD would never accept it. There is no way Nvidia would ever allow PhysX to run as well (or better) on AMD hardware as on their own. So why should AMD even bother?

That is a rather short-sighted view of the situation. An executive from IBM once told a snot-nosed kid that he didn't care about software revenue because the real money was in hardware. If nVidia got themselves into a position of market dominance, the potential revenue stream from licensing fees alone could be rather huge. The nice thing about licensing is that it has almost no overhead; the profit margins on it are absolutely staggering. Besides those factors, though, as of right now I find it highly unlikely that ATi would be besting nV at this point, or even coming close. nV has clearly dedicated far more die space to GPGPU functions than ATi at this point.

BTW - that snot-nosed kid's name was Bill Gates.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: Creig


Nvidia offering PhysX to AMD was simply a publicity stunt. Nvidia knew full well AMD would never accept it. There is no way Nvidia would ever allow PhysX to run as well (or better) on AMD hardware as on their own. So why should AMD even bother?

The whole thing was a joke.

One could say the same thing about Havok.

PhysX already is an industry standard. If ATI chooses not to adopt that standard, it's their loss.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Creig
Nvidia offering PhysX to AMD was simply a publicity stunt. Nvidia knew full well AMD would never accept it. There is no way Nvidia would ever allow PhysX to run as well (or better) on AMD hardware as on their own. So why should AMD even bother?

The whole thing was a joke.

Originally posted by: Creig
Exactly. It would be preferable to see someone in control who wouldn't have the urge to artificially limit the performance of one company's hardware to make another's look good. Having Microsoft in control would force each company simply to try to make its own solution the best, which would be to the end user's benefit.

That's an interesting take on it. So by that logic, I suppose Intel is crippling Havok's performance on AMD CPUs, Adobe is crippling their performance on AMD CPUs, Autodesk is crippling their software's performance on AMD CPUs and GPUs, Stanford is crippling F@H's performance on AMD GPUs, game developers are crippling their games' performance on AMD GPUs, etc. etc. Or maybe the reason AMD parts run slower is that they are generally slower than the competition. Just a novel thought, I could be way off though. LOL.
 

Creig

Diamond Member
Oct 9, 1999
5,171
13
81
Originally posted by: chizow
That's an interesting take on it. So by that logic, I suppose Intel is crippling Havok's performance on AMD CPUs, Adobe is crippling their performance on AMD CPUs, Autodesk is crippling their software's performance on AMD CPUs and GPUs, Stanford is crippling F@H's performance on AMD GPUs, game developers are crippling their games' performance on AMD GPUs, etc. etc. Or maybe the reason AMD parts run slower is that they are generally slower than the competition. Just a novel thought, I could be way off though. LOL.
Don't be deliberately obtuse; you know exactly what I'm talking about. Adobe is not owned by Nvidia or AMD, Autodesk is not owned by Nvidia or AMD, F@H is not owned by Nvidia or AMD, etc. None of those companies would have any reason to artificially lower performance on either AMD's or Nvidia's hardware, whereas Nvidia would have a very real reason to see to it that AMD cards never outperformed its own in PhysX.



Originally posted by: chizow
Game developers are crippling their games' performance on AMD GPUs,
Interesting that you should mention this, however, because there was much talk that this exact scenario did in fact happen with the DX10.1 Assassin's Creed patch that was pulled and never reintroduced. Since Assassin's Creed is a TWIMTBP title and Nvidia does not have a DX10.1 GPU, there was a lot of speculation over whether Nvidia pressured Ubisoft to pull DX10.1 support simply because it put AMD in a brighter spotlight than Nvidia. Was there ever any proof? No. And we may never know for sure. But it is definitely within Nvidia's modus operandi to pull something like that.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Creig
Don't be deliberately obtuse; you know exactly what I'm talking about. Adobe is not owned by Nvidia or AMD, Autodesk is not owned by Nvidia or AMD, F@H is not owned by Nvidia or AMD, etc. None of those companies would have any reason to artificially lower performance on either AMD's or Nvidia's hardware, whereas Nvidia would have a very real reason to see to it that AMD cards never outperformed its own in PhysX.
I'm not being deliberately obtuse; by comparing Intel's Havok support on Intel and AMD CPUs, I've provided a very real example that shows your claims to be nothing more than unfounded fearmongering. I then extended this analogy to show there is no incentive for a software developer to deliberately cripple their software, particularly when they stand to gain more from a 100% adoption rate than from any deliberate crippling of their competitor's hardware, as Ben mentions above.

These extended examples also help establish the reality that there's no reason to believe performance on AMD parts is a result of anything other than AMD parts being generally slower than the competition as the products mentioned are not controlled by AMD or any of their direct competitors, meaning they have no incentive to cripple performance.

Interesting that you should mention this, however, because there was much talk that this exact scenario did in fact happen with the DX10.1 Assassin's Creed patch that was pulled and never reintroduced. Since Assassin's Creed is a TWIMTBP title and Nvidia does not have a DX10.1 GPU, there was a lot of speculation over whether Nvidia pressured Ubisoft to pull DX10.1 support simply because it put AMD in a brighter spotlight than Nvidia. Was there ever any proof? No. And we may never know for sure. But it is definitely within Nvidia's modus operandi to pull something like that.
Rofl, Assassin's Creed again. There's really no need for me to rehash the entire argument here; it's been well documented on public record for almost a year now:

Ubisoft comments on Assassin's Creed DX10.1 controversy - UPDATED - The Tech Report

There are direct quotes from both Ubisoft and Nvidia directly denying any influence whatsoever in the decision to pull DX10.1 support. Ubisoft claims they pulled support due to rendering errors, which were verified by numerous external sources. Given the numerous problems with ATI's Z/multisampling buffers in a wide variety of games, this really shouldn't be a surprise.

And don't forget, if you really, really want to run AC in DX10.1 in all its glory, sans a few render passes, you simply don't need to patch the game. It's perfectly playable start to finish unpatched.

As for Nvidia influencing the decision, what DX10.1 features do you think Nvidia parts are incapable of? Far Cry 2 showed Nvidia actually runs DX10.1 better than ATI does. ;) Not only do Nvidia parts take advantage of the performance gains from reading the multisampled depth buffer, a DX10.1 feature, their parts don't suffer from the stuttering that has plagued ATI parts in DX10 since launch.

Based on Derek's DX11 article, it seems as if the only thing that held Nvidia parts back from DX10.1 certification was hardware tessellation, which is not only unused, but apparently incompatible with the hardware on ATI's current DX10.1 parts.
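
For reference, whether a part exposes DX10.1 is easy to check programmatically: the 10.1 runtime lets you request a feature level at device creation. A minimal sketch using the real D3D10CreateDevice1 entry point (error handling and cleanup omitted for brevity):

Code:
// Probe for DX10.1 support by requesting a 10.1 device, falling back to 10.0.
// D3D10CreateDevice1 is the real entry point; the rest is a bare-bones sketch.
#include <d3d10_1.h>
#pragma comment(lib, "d3d10_1.lib")

ID3D10Device1* CreateBestDevice()
{
    const D3D10_FEATURE_LEVEL1 levels[] = {
        D3D10_FEATURE_LEVEL_10_1,  // full 10.1, e.g. shader reads of
        D3D10_FEATURE_LEVEL_10_0,  // multisampled depth buffers
    };
    for (D3D10_FEATURE_LEVEL1 level : levels) {
        ID3D10Device1* device = nullptr;
        if (SUCCEEDED(D3D10CreateDevice1(nullptr, D3D10_DRIVER_TYPE_HARDWARE,
                                         nullptr, 0, level,
                                         D3D10_1_SDK_VERSION, &device)))
            return device;  // caller can confirm via device->GetFeatureLevel()
    }
    return nullptr;  // no DX10-class hardware device available
}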

 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
I know this isn't the Havok thread, but if this thread is about why PhysX won't become the industry standard, the answer to that is Havok.

Havok has extensive physics libraries, and if Havok can be GPGPU-accelerated through OpenCL, PhysX does not stand a chance, because Havok will run on both ATI and Nvidia cards. Claiming that PhysX > Havok in terms of quality or breadth of libraries is pure speculation. Havok is a widely adopted physics engine, in both PC games and console games. So don't instantly rule it out. Just because it isn't marketed as much as PhysX doesn't mean it sucks.
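
To illustrate why OpenCL changes the picture: one kernel source runs on any GPU whose vendor ships an OpenCL driver, ATI and Nvidia alike. Below is a minimal sketch of a toy physics integration step - real OpenCL API calls, but emphatically not Havok code, just the general shape of GPGPU physics (error handling omitted):

Code:
// Minimal sketch of vendor-neutral GPGPU physics via OpenCL. This is NOT
// Havok code - just a toy Euler integration step showing that a single
// kernel source runs on both ATI and Nvidia GPUs.
#include <CL/cl.h>
#include <vector>
#include <cstdio>

static const char* kSrc =
    "__kernel void integrate(__global float4* pos, __global float4* vel,\n"
    "                        const float dt) {\n"
    "    size_t i = get_global_id(0);\n"
    "    vel[i].y -= 9.81f * dt;   /* gravity */\n"
    "    pos[i] += vel[i] * dt;    /* Euler step */\n"
    "}\n";

int main() {
    cl_platform_id platform; cl_device_id device;
    clGetPlatformIDs(1, &platform, nullptr);   // whichever vendor's driver is installed
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, nullptr);

    cl_context ctx = clCreateContext(nullptr, 1, &device, nullptr, nullptr, nullptr);
    cl_command_queue queue = clCreateCommandQueue(ctx, device, 0, nullptr);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSrc, nullptr, nullptr);
    clBuildProgram(prog, 1, &device, nullptr, nullptr, nullptr);
    cl_kernel kernel = clCreateKernel(prog, "integrate", nullptr);

    const size_t n = 4096;                      // particle count
    std::vector<float> zeros(n * 4, 0.0f);      // one float4 per particle
    cl_mem pos = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                sizeof(float) * 4 * n, zeros.data(), nullptr);
    cl_mem vel = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                sizeof(float) * 4 * n, zeros.data(), nullptr);
    float dt = 1.0f / 60.0f;

    clSetKernelArg(kernel, 0, sizeof(cl_mem), &pos);
    clSetKernelArg(kernel, 1, sizeof(cl_mem), &vel);
    clSetKernelArg(kernel, 2, sizeof(float), &dt);
    clEnqueueNDRangeKernel(queue, kernel, 1, nullptr, &n, nullptr, 0, nullptr, nullptr);
    clFinish(queue);
    std::puts("one simulation step ran on whatever GPU the driver exposed");
    return 0;
}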
 

solofly

Banned
May 25, 2003
1,421
0
0
If I were MS/Intel/AMD I would piss on Nvidia and their greed, I mean their PhysX... :D
 

Denithor

Diamond Member
Apr 11, 2004
6,300
23
81
Originally posted by: chizow
That's an interesting take on it. So by that logic, I suppose Intel is crippling Havok's performance on AMD CPUs, Adobe is crippling their performance on AMD CPUs, Autodesk is crippling their software's performance on AMD CPUs and GPUs, Stanford is crippling F@H's performance on AMD GPUs, game developers are crippling their games' performance on AMD GPUs, etc. etc. Or maybe the reason AMD parts run slower is that they are generally slower than the competition. Just a novel thought, I could be way off though. LOL.

Autodesk apps run smoother/faster on AMD GPUs than on nVidia's cards.

Wreckage - you pulled up an article over one year old to start a new thread? WTF?
Talk about old news...
 

spittledip

Diamond Member
Apr 23, 2005
4,480
1
81
Originally posted by: Wreckage
Originally posted by: Creig


Nvidia offering PhysX to AMD was simply a publicity stunt. Nvidia knew full well AMD would never accept it. There is no way Nvidia would ever allow PhysX to run as well (or better) on AMD hardware as on their own. So why should AMD even bother?

The whole thing was a joke.

One could say the same thing about Havok.

PhysX already is an industry standard. If ATI chooses not to adopt that standard, it's their loss.

I agree with this somewhat. There need be only one standard. Having multiple standards makes things difficult for the end user. Everyone should be able to enjoy physics no matter what manufacturer makes the cards. For the sake of the customer, these companies should come together and settle on a single universal standard.

Edit: Seeing that PhysX is more mature than Havok, it seems the obvious choice would be to go with PhysX... barring any technical deficiencies that I am unaware of.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: chizow

There are direct quotes from both Ubisoft and Nvidia directly denying any influence whatsoever in the decision to pull DX10.1 support.

I don't think people were denying that.

What I think people were/are questioning is whether or not those claims are true.

Time will tell.

Originally posted by: chizow

...what DX10.1 features do you think Nvidia parts are incapable of? Far Cry 2 showed Nvidia actually runs DX10.1 better than ATI does. ;) Not only do Nvidia parts take advantage of the performance gains from reading the multisampled depth buffer, a DX10.1 feature, their parts don't suffer from the stuttering that has plagued ATI parts in DX10 since launch.

I don't think that's accurate. NVidia's current GPUs are not fully capable of DX10.1, preventing any direct comparisons from reputable sources.

What you're doing is insinuating that it runs an entire API better (DX10.1) by comparing a mere extension of that API.

Originally posted by: chizow

Based on Derek's DX11 article, it seems as if the only thing that held Nvidia parts back from DX10.1 certification was hardware tessellation, which is not only unused, but apparently incompatible with the hardware on ATI's current DX10.1 parts.

From what I've read, outside of the multisampled depth buffer, nVidia hasn't disclosed what parts of the DX10.1 API their GPUs do and don't adhere to.

From: Tony Tamasi - Link

...I'd rather not say what [DX10.1] features we don't support.

Have they since then disclosed what they do and don't support from the DX10.1 API?

Also, if hardware tessellation is incompatible with ATi hardware as you say it is, but also required for 10.1 certification, why are they certified?
 

thilanliyan

Lifer
Jun 21, 2005
11,871
2,076
126
Originally posted by: josh6079
Also, if hardware tessellation is incompatible with ATi hardware as you say it is, but also required for 10.1 certification, why are they certified?

I think he means incompatible with DX11.
 

SunnyD

Belgian Waffler
Jan 2, 2001
32,674
145
106
www.neftastic.com
Originally posted by: chizow
Based on Derek's DX11 article, it seems as if the only thing that held Nvidia parts back from DX10.1 certification was hardware tessellation, which is not only unused, but apparently incompatible with the hardware on ATI's current DX10.1 parts.

Funny, the company I work for has software that uses hardware tessellation on AMD cards, and the developers absolutely love it.
 

nosfe

Senior member
Aug 8, 2007
424
0
0
I can't believe how easily Wreckage stirred up the hornet's nest with a one-year-old article :p
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: thilan29
I think he means incompatible with DX11.
Yep.

Originally posted by: SunnyD
Funny, the company I work for has software that uses hardware tessellation on AMD cards, and the developers absolutely love it.
Sweet, when does the game ship and what's it called?
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
SunnyD and Chizow, you guys should start another thread that discusses hardware tessellation on AMD cards. Chizow says hardware tessellation is apparently incompatible on ATI's current DX10.1 parts. SunnyD says the company he works for uses hardware tessellation on AMD cards and the devs absolutely love it.

One of you is wrong. Find out which one it is. In another thread. /2cents.
 

akugami

Diamond Member
Feb 14, 2005
5,666
1,856
136
Originally posted by: chizow
That's an interesting take on it. So by that logic, I suppose Intel is crippling Havok's performance on AMD CPUs, Adobe is crippling their performance on AMD CPUs, Autodesk is crippling their software's performance on AMD CPU and GPUs, Stanford is crippling F@H's performance on AMD GPUs, Game developers are crippling their game's performance on AMD GPUs, etc. etc. Or maybe the reason AMD parts run slower is because they are generally slower than the competition. Just a novel thought, I could be way off though. LOL.

I'm going to play devil's advocate here, but wasn't MS sued in court for exactly this reason, with their hidden APIs that accessed Windows XP in ways other software developers could not? The idea is not as far-fetched as you may have others believe.
 

SunnyD

Belgian Waffler
Jan 2, 2001
32,674
145
106
www.neftastic.com
Originally posted by: Keysplayr
SunnyD and Chizow, you guys should start another thread that discusses hardware tessellation on AMD cards. Chizow says hardware tessellation is apparently incompatible on ATI's current DX10.1 parts. SunnyD says the company he works for uses hardware tessellation on AMD cards and the devs absolutely love it.

One of you is wrong. Find out which one it is. In another thread. /2cents.

No, we're both right in a sense. AMD's hardware tessellator in their DX10.1 parts will not be completely compatible with the tessellator spec to be put forth in DX11. Score one for chizow.

On the other hand - chizow stated that AMD's hardware tessellator is unused, which I was pointing out is totally and unequivocally incorrect. Score one for me (as if we were keeping score).

Just putting things in perspective to set the record straight.