- Jul 1, 2005
http://www.custompc.co.uk/news...pport-to-amd--ati.html
This is an old article but I did not want to derail the Havok thread.
Originally posted by: BFG10K
Yes, this is all well and good, but ATi needs to implement a back-end for it on their hardware, and I can't see them doing that to support a competitor's tech.
Yes certainly, I didn't mean to imply I have issues with this thread, because I don't.
Originally posted by: Wreckage
I just did not want to derail the Havok thread. There was discussion over there about Physx not being available on all hardware. The only hardware it does not run on is ATI and it's most likely there fault. So if any blame is to be placed it would be on them.
That's one way of looking at it. The other is to look at where PC gaming would be without Direct3D and OpenGL. You'd go back to the days of Glide, where almost every vendor had their own proprietary API and was trying to sway developers to use it.
Using DX would limit physics to MS products, which seems like going backwards from what we have now.
Originally posted by: Wreckage
I just did not want to derail the Havok thread. There was discussion over there about Physx not being available on all hardware. The only hardware it does not run on is ATI and it's most likely there fault. So if any blame is to be placed it would be on them.
Originally posted by: BFG10K
PhysX may be available on consoles, but unless ATi & Intel implement it, it'll splinter PC gaming. A DirectX implementation will stop this nonsense as it'll force everyone who wants to be a player to get on board. It'll also foster competition between the IHVs because they'll try to come up with a solution that works better than others'.
Nvidia offering PhysX to AMD was simply a publicity stunt. Nvidia knew full well AMD would never accept it. There is no way Nvidia would ever allow PhysX to run as good (or better) on AMD hardware than their own. So why should AMD even bother?
Originally posted by: Creig
Nvidia offering PhysX to AMD was simply a publicity stunt. Nvidia knew full well AMD would never accept it. There is no way Nvidia would ever allow PhysX to run as good (or better) on AMD hardware than their own. So why should AMD even bother?
The whole thing was a joke.
Originally posted by: Creig
Exactly. It would be preferable to see someone in control who wouldn't have the urge to artificially limit the performance of one company's hardware to make another's look good. Having Microsoft in control would force each company simply to try to make their own solution look the best, which would be to the end user's benefit.
Don't be deliberately obtuse, you know exactly what I'm talking about. Adobe is not owned by Nvidia or AMD, Autodesk is not owned by Nvidia or AMD, F@H is not owned by Nvidia or AMD, etc. None of those companies would have any reason to artificially lower performance on either AMD's or Nvidia's hardware, whereas Nvidia would have a very real reason to see to it that AMD cards never outperformed those produced by themselves in PhysX.
Originally posted by: chizow
That's an interesting take on it. So by that logic, I suppose Intel is crippling Havok's performance on AMD CPUs, Adobe is crippling their performance on AMD CPUs, Autodesk is crippling their software's performance on AMD CPUs and GPUs, Stanford is crippling F@H's performance on AMD GPUs, Game developers are crippling their game's performance on AMD GPUs, etc. etc. Or maybe the reason AMD parts run slower is because they are generally slower than the competition. Just a novel thought, I could be way off though. LOL.
Interesting that you should mention this, however, because there was much talk that this exact scenario did in fact happen with the DX10.1 Assassin's Creed patch that was pulled and never reintroduced. Since Assassin's Creed is a TWIMTBP title and Nvidia does not have a DX10.1 GPU, there was a lot of speculation about whether Nvidia pressured Ubisoft to pull DX10.1 support simply because it put AMD in a brighter spotlight than Nvidia. Was there ever any proof? No. And we may never know for sure. But it is definitely within Nvidia's modus operandi to pull something like that.
Originally posted by: chizow
Game developers are crippling their game's performance on AMD GPUs,
I'm not being deliberately obtuse, I've provided a very real example that proves your claims to be nothing more than unfounded fearmongering by comparing Intel's Havok support on Intel and AMD CPUs. I then extended this analogy to show there is no incentive for a software developer to deliberately cripple their software, particularly when they stand to gain more from a 100% adoption rate over any deliberate crippling of their competitor's hardware as Ben mentions above.
Originally posted by: Creig
Don't be deliberately obtuse, you know exactly what I'm talking about. Adobe is not owned by Nvidia or AMD, Autodesk is not owned by Nvidia or AMD, F@H is not owned by Nvidia or AMD, etc. None of those companies would have any reason to artificially lower performance on either AMD's or Nvidia's hardware, whereas Nvidia would have a very real reason to see to it that AMD cards never outperformed those produced by themselves in PhysX.
Rofl, Assassin's Creed again. There's really no need for me to rehash the entire argument here, it's been well documented on public record for almost a year now:
Interesting that you should mention this, however, because there was much talk that this exact scenario did in fact happen with the DX10.1 Assassin's Creed patch that was pulled and never reintroduced. Since Assassin's Creed is a TWIMTBP title and Nvidia does not have a DX10.1 GPU, there was a lot of speculation about whether Nvidia pressured Ubisoft to pull DX10.1 support simply because it put AMD in a brighter spotlight than Nvidia. Was there ever any proof? No. And we may never know for sure. But it is definitely within Nvidia's modus operandi to pull something like that.
Originally posted by: chizow
That's an interesting take on it. So by that logic, I suppose Intel is crippling Havok's performance on AMD CPUs, Adobe is crippling their performance on AMD CPUs, Autodesk is crippling their software's performance on AMD CPUs and GPUs, Stanford is crippling F@H's performance on AMD GPUs, Game developers are crippling their game's performance on AMD GPUs, etc. etc. Or maybe the reason AMD parts run slower is because they are generally slower than the competition. Just a novel thought, I could be way off though. LOL.
Originally posted by: Wreckage
Originally posted by: Creig
Nvidia offering PhysX to AMD was simply a publicity stunt. Nvidia knew full well AMD would never accept it. There is no way Nvidia would ever allow PhysX to run as good (or better) on AMD hardware than their own. So why should AMD even bother?
The whole thing was a joke.
One could say the same thing about Havok.
PhysX already is an industry standard. If ATI chooses not to adopt that standard, it's their loss.
Originally posted by: chizow
There are direct quotes from both Ubisoft and Nvidia denying any influence whatsoever in the decision to pull DX10.1 support.
Originally posted by: chizow
...what DX10.1 features do you think Nvidia parts are incapable of? Far Cry 2 showed Nvidia actually runs DX10.1 better than ATI does. Not only do Nvidia parts take advantage of the performance gains from reading the multisampled depth buffer, a DX10.1 feature, their parts don't suffer from the stuttering that has plagued ATI parts in DX10 since launch.
Originally posted by: chizow
Based on Derek's DX11 article it seems as if the only thing that held Nvidia parts back from DX10.1 certification was hardware tessellation, which is not only unused, but apparently incompatible with the hardware on ATI's current DX10.1 parts.
Originally posted by: josh6079
Also, if hardware tessellation is incompatible on ATi hardware as you say it is, but also required for 10.1 certification, why are they certified?
Originally posted by: chizow
Based on Derek's DX11 article it seems as if the only thing that held Nvidia parts back from DX10.1 certification was hardware tessellation, which is not only unused, but apparently incompatible with the hardware on ATI's current DX10.1 parts.
Yep.
Originally posted by: thilan29
I think he means incompatible with DX11.
Sweet, when does the game ship and what's it called?
Originally posted by: SunnyD
Funny, the company I work for has software that uses hardware tessellation on AMD cards, and the developers absolutely love it.
Originally posted by: chizow
That's an interesting take on it. So by that logic, I suppose Intel is crippling Havok's performance on AMD CPUs, Adobe is crippling their performance on AMD CPUs, Autodesk is crippling their software's performance on AMD CPUs and GPUs, Stanford is crippling F@H's performance on AMD GPUs, Game developers are crippling their game's performance on AMD GPUs, etc. etc. Or maybe the reason AMD parts run slower is because they are generally slower than the competition. Just a novel thought, I could be way off though. LOL.
Originally posted by: Keysplayr
SunnyD and Chizow, you guys should start another thread that discusses hardware tessellation on AMD cards. Chizow says hardware tessellation is apparently incompatible on ATI's current DX10.1 parts. SunnyD says the company he works for uses hardware tessellation on AMD cards and the devs absolutely love it.
One of you is wrong. Find out which one it is. In another thread. /2cents.