**Thread name change** Nvidia and AMD moral and immoral business practices


Scali

Banned
Dec 3, 2004
2,495
0
0
I'm not going to go into the whole PhysX issue again because, as you said, it's been discussed to death and neither side seems willing to address the other's arguments candidly. By that I mean you acknowledge it's obvious why AMD can't go with Nvidia's standards, but don't mention that it's equally obvious why Nvidia can't bend to AMD's side. In the end, Nvidia doesn't make PhysX ATI-compliant for the same reason ATI doesn't make their cards PhysX-compliant: it's a textbook Mexican standoff, both financially and in terms of losing face. Neither camp can be justified with a precedent from another case.

Well I can think of a precedent:
How about AMD's CPU division? They rely completely on the x86 instruction set of Intel, their biggest competitor.
I think that's about the strongest argument you can possibly think of. Because of AMD's success with x86 CPUs, they were able to acquire ATi and get into the GPU business in the first place. So if there's any company that shouldn't be too bothered about licensing their competitor's technology, it's AMD. That's pretty much their entire business model.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
Well I can think of a precedent:
How about AMD's CPU division? They rely completely on the x86 instruction set of Intel, their biggest competitor.
I think that's about the strongest argument you can possibly think of. Because of AMD's success with x86 CPUs, they were able to acquire ATi and get into the GPU business in the first place. So if there's any company that shouldn't be too bothered about licensing their competitor's technology, it's AMD. That's pretty much their entire business model.
You fail to understand how a cross-licensing agreement works. AMD and Intel (especially after their new agreement) are free to use one another's tech without fear of legal action. Nvidia is in no way, shape or form offering a cross-licensing agreement with AMD. You are making the claim that AMD is "free" to use PhysX (this is a lie) if they want to. Even if that were true, AMD would have absolutely no control or freedom; they would be 100% at the mercy of Nvidia.

This is not the case with the Intel/AMD agreement. AMD does NOT rely completely on Intel for the x86 ISA, for example they developed AMD64, and are free to extend, modify, or develop new extensions etc. at their will.

Of note, Intel is free to use any of AMD's graphics tech if they wish to do so.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Nvidia is in no way, shape or form offering a cross-licensing agreement with AMD.

How do you know?

You are making the claim that AMD is "free" to use PhysX (this is a lie) if they want to.

You're accusing me of lies. Have any proof?

This is not the case with the Intel/AMD agreement. AMD does NOT rely completely on Intel for the x86 ISA, for example they developed AMD64, and are free to extend, modify, or develop new extensions etc. at their will.

AMD does rely on x86, because AMD64 is an extension. It cannot exist without x86.
Also, any extension that AMD develops has to be offered to Intel free of charge (while AMD pays Intel a percentage for every x86 CPU sold), so the x86 cross-license is not THAT friendly.
I don't see why nVidia could not have a similar license for PhysX.

Of note, Intel is free to use any of AMD's graphics tech if they wish to do so.

No they aren't. Heck, they wouldn't be licensing GPU technology from PowerVR or working on their own IGPs if they could use AMD's GPUs. It's not like any of Intel's IGPs were very competitive with AMD's offerings (except for their current ones integrated in the CPU).
 

zebrax2

Senior member
Nov 18, 2007
977
69
91
Well I can think of a precedent:
How about AMD's CPU division? They rely completely on the x86 instruction set of Intel, their biggest competitor.
I think that's about the strongest argument you can possibly think of. Because of AMD's success with x86 CPUs, they were able to acquire ATi and get into the GPU business in the first place. So if there's any company that shouldn't be too bothered about licensing their competitor's technology, it's AMD. That's pretty much their entire business model.

1.) Intel is in a much more dominant position over AMD than Nvidia is.
2.) You can't introduce hardware with little to no software written for it when its main competitor has thousands of titles written for it. Look at what happened to Itanium (correct me if I'm wrong). That is why they stick with x86, which isn't the case for GPU-accelerated physics.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
1.) Intel is in a much more dominant position over AMD than Nvidia is.

Agree. I've stated that at least twice in this thread already.

AMD has virtually no hope of dethroning Intel in x86, so AMD may as well cross-license with Intel (with AMD64).

By contrast, ATI is competitive with NV and doesn't have to kowtow to NV as much. ATI's consumer GPU market share actually rose after its rejection of PhysX, to the point where ATI actually has more market share than NV right now--and it will likely stay that way for at least another generation, if not more. ATI doesn't really need PhysX to compete right now, nor did it back then. Sure, collaborating with NV would have helped them, but it would have helped NV even more, since they control PhysX. ATI is now taking its sweet time developing an OpenCL alternative to PhysX.

This situation is NOT good for PC gaming, but I also can't really blame ATI for rejecting NV's PhysX offer.

Scali continues to think that ATI made a mistake in rejecting PhysX, though, and apparently this is something that you cannot change his mind on, so if I were you, I'd stop trying.

Note that by AMD I mean the CPU division, ATI means the GPU division.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
2.) You can't introduce hardware with little to no software written for it when its main competitor has thousands of titles written for it. Look at what happened to Itanium (correct me if I'm wrong). That is why they stick with x86, which isn't the case for GPU-accelerated physics.

Actually, it is.
The physics market on the PC is pretty much divided between Havok and PhysX, with PhysX slowly but surely taking the lead in market share.
Since AMD already tried to get into Havok, and failed... it would seem that the only alternative to get into the GPU accelerated physics game is by means of PhysX.
Bullet is not going to do much, since hardly any game developer uses it. And I doubt that AMD has the leverage to change that (devrel is not exactly their forte).
 

zebrax2

Senior member
Nov 18, 2007
977
69
91
Actually, it is.
The physics market on the PC is pretty much divided between Havok and PhysX, with PhysX slowly but surely taking the lead in market share.
Since AMD already tried to get into Havok, and failed... it would seem that the only alternative to get into the GPU accelerated physics game is by means of PhysX.
Bullet is not going to do much, since hardly any game developer uses it. And I doubt that AMD has the leverage to change that (devrel is not exactly their forte).

My post is more about GPU-accelerated PhysX, Havok and Bullet. That market is barely even 2-3 years old, and developer adoption isn't that stellar either, so who knows what will happen with it over the next 5 years as it matures; Bullet may yet make it big, or someone else could enter and become the dominant player. My point is: why would you surrender the market to your competitor (in gaming hardware) at such an early, unpredictable stage?
 

Scali

Banned
Dec 3, 2004
2,495
0
0
I'm sure someone will come here talking about JHH and wood screws and such, but I agree that insider trading > wood screws in severity.

People complaining about wood screws haven't been to many product launches or trade shows, I bet.
I think it's perfectly acceptable to have a mock-up of a product, if the final product is not finished yet, as long as the mock-up tries to represent the final product accurately.

My favourite example is the Commodore Amiga.
They went to trade shows with a prototype, nicknamed Lorraine, which was basically a collection of circuit boards handwired together:
[Image: lorrainechips.jpg]


They hid this under a table, and showed a mock-up of a regular Amiga 1000, while Lorraine was doing the actual work.
They hadn't gone into production yet, so they could not fit their Amiga circuitry into the actual case yet.
But since the prototype was identical to the production model, aside from the form factor, I don't see anything wrong with that.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
My post is more about GPU-accelerated PhysX, Havok and Bullet. That market is barely even 2-3 years old, and developer adoption isn't that stellar either, so who knows what will happen with it over the next 5 years as it matures; Bullet may yet make it big, or someone else could enter and become the dominant player. My point is: why would you surrender the market to your competitor (in gaming hardware) at such an early, unpredictable stage?

My point is that GPU acceleration is being used because developers were using PhysX already.
I don't think just because Bullet now has some basic GPU acceleration, developers will flock to using Bullet. They generally stick with what they know. The physics API is also integrated quite deeply into the game engine, so it's rather difficult to switch physics APIs without writing a new game engine from scratch (most developers just license a third-party engine anyway, and don't write their own... so it's really only up to the handful of popular engines used by virtually all games. Currently PhysX is mainly riding on the success of the popularity of the Unreal Engine).

So yes, we do know what would happen, that's my point. PhysX is facing the same problem in competing with Havok. Developers generally don't switch.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Which hardware-accelerated physics engine wins has more to do with what the Xbox 720 and PS4 support than anything that happens in the PC market. Bullet should work properly by then, and both it and PhysX should be supported in hardware on those consoles (i.e. GPU-accelerated). Havok looks likely to die unless Intel does a u-turn.

Then it comes down to which physics library major game engines decide to use - that comes down to cost, politics, and level of support.

PhysX almost certainly has a much bigger dev team and a lot more historical experience, so I'd expect it to win the most deals. Bullet's independence might net it the odd win, and if they get lucky (e.g. UE4 chooses Bullet) that might count for a lot. Some engines might develop their own physics libraries too.

If I were AMD I'd wait for that to happen - at that point PhysX will have been ported to OpenCL and DirectX Compute too (required for the PS4 and Xbox 720). Then they just need to spruce up their OpenCL/DX Compute drivers and they are in the game.

Until then, hardware physics will stay marginal - a plus for Nvidia users, but really only an afterthought for devs, who, because they primarily develop for the consoles, won't be interested in putting much effort in.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Which hardware-accelerated physics engine wins has more to do with what the Xbox 720 and PS4 support than anything that happens in the PC market.

I'm not too sure about that. For example, PS3 already supports physics acceleration because of its Cell processor. But that doesn't do the PC market any good. PCs don't have Cell processors (you generally code consoles on the bare metal, not through some inefficient layer like OpenCL).
Bullet is one of the most popular options on the PS3 platform, but hardly any PC games use it. Doesn't look like there's much of an influence of consoles on the PC market.

And what if it is the other way around? Since nVidia already has a physics solution, perhaps that could weigh in on the decision to go with nVidia hardware for next-gen consoles?

Or, if AMD is chosen instead, and a physics solution is developed, is there any guarantee that, even if they port it to PC (which Microsoft/Sony might not want, as it would take away one of the advantages of their consoles... like how Microsoft likes to keep Halo Xbox-only), they're going to let nVidia use it?
 

Dark Shroud

Golden Member
Mar 26, 2010
1,576
1
0
And what if it is the other way around? Since nVidia already has a physics solution, perhaps that could weigh in on the decision to go with nVidia hardware for next-gen consoles?

Or, if AMD is chosen instead, and a physics solution is developed, is there any guarantee that, even if they port it to PC (which Microsoft/Sony might not want, as it would take away one of the advantages of their consoles... like how Microsoft likes to keep Halo Xbox-only), they're going to let nVidia use it?

Microsoft wasn't happy with Nvidia after the first Xbox, which is why they went with AMD for the 360. There is little chance they're going to throw away all their R&D and partnerships with IBM & GloFo just to go back to the problems they had with Nvidia. Nintendo normally goes with AMD.

The only console contract Nvidia has gained was the PS3 & the next PSP. The current hybrid CPU-with-GPU-core setup in the new 360 is similar to what Sony originally wanted for the PS3. So there is a good chance Sony could flip & go AMD with the PS4.

Microsoft is pretty much the only big name that even gave Tegra a chance, with the Zune series. People (analysts/journalists) say that is in part due to Nvidia's arrogant attitude.

In my opinion Nvidia is getting a humbling that they've needed for some time. They're not going to die from this current slump, nor do I want them to. But they need to change their public attitude & perception.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,574
252
126
Microsoft wasn't happy with Nvidia after the first Xbox, which is why they went with AMD for the 360. There is little chance they're going to throw away all their R&D and partnerships with IBM & GloFo just to go back to the problems they had with Nvidia. Nintendo normally goes with AMD.

The only console contract Nvidia has gained was the PS3 & the next PSP. The current hybrid CPU-with-GPU-core setup in the new 360 is similar to what Sony originally wanted for the PS3. So there is a good chance Sony could flip & go AMD with the PS4.

Microsoft is pretty much the only big name that even gave Tegra a chance, with the Zune series. People (analysts/journalists) say that is in part due to Nvidia's arrogant attitude.

In my opinion Nvidia is getting a humbling that they've needed for some time. They're not going to die from this current slump, nor do I want them to. But they need to change their public attitude & perception.

The only "problem" MS had with Nvidia for the original Xbox was that they didn't get better pricing once the die shrinks came to the GPU, IIRC. It had nothing to do with "problems" of quality or performance, as the Xbox was the most powerful console of its generation.

The current "hybrid" in the revised 360 is simply the result of a 5-year-old GPU being die-shrunk time and again. It's feasible because the Xenos is a joke compared to high-end GPUs now (as is the RSX; not picking on ATI here, just saying, because of how old it is). New consoles are not going to have a combo CPU/GPU at launch, so that won't be a factor in Sony's decision either way. If you had to call it one way or the other, one would think NV would have a slight edge in getting the PS4 because they already have the PS3, but of course it could go either way.

I don't believe any complaints levied against the Zune HD were because of its hardware. Most agreed that it performed well and was a quality unit, but it simply doesn't have the ecosystem and support to compete with Apple.

Certainly Nvidia isn't in the position (on top of the world) that it was when the original Xbox came out, so they would probably be more willing to give MS terms that it likes to get a CHANCE at the Xbox 720, because they are not as dominant as they were in 2001 and consoles are more important than ever. Odds are MS sticks with AMD, but not for your reasons: the Xbox 1 disputes are water under the bridge, and certainly they will (or already have) let NV make their pitch for the 720.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I remember reading an article that ATI was already contracted to create the next Xbox chip. I'll have to dig it up.

I doubt Nintendo will stray away from ATI since it's been working indirectly with ATI for the last decade+ (SGI/ArtX in the N64, ArtX/ATI in the GC, ATI in the Wii.)

There are rumors suggesting that Sony will license ATI also. But the rumors aren't substantiated, and one of the arguments was to remove the bottleneck developers had when coding for their system. In the end, Sony is the real mystery; as far as I'm concerned, ATI has Nintendo and Microsoft secured for their next consoles.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,574
252
126
I remember reading an article that ATI was already contracted to create the next Xbox chip. I'll have to dig it up.

I doubt Nintendo will stray away from ATI since it's been working indirectly with ATI for the last decade+ (SGI/ArtX in the N64, ArtX/ATI in the GC, ATI in the Wii.)

There are rumors suggesting that Sony will license ATI also. But the rumors aren't substantiated, and one of the arguments was to remove the bottleneck developers had when coding for their system. In the end, Sony is the real mystery; as far as I'm concerned, ATI has Nintendo and Microsoft secured for their next consoles.

Nintendo would seem like a done deal, no doubt. Looking at their history as you have shown, it would be unlikely they would make a change; I agree with you there. I also agree that Sony is the unknown.

As for MS, I also remember reading that they signed with AMD for 720 as well, but I don't believe it was confirmed by a reputable source. I think it was just a rumor. I haven't seen anything suggesting Sony will go with AMD, so please show us this one if you can find it.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
As for MS, I also remember reading that they signed with AMD for 720 as well, but I don't believe it was confirmed by a reputable source. I think it was just a rumor. I haven't seen anything suggesting Sony will go with AMD, so please show us this one if you can find it.

I tried to find the original source of the stories covering the ATI+MS deal for the "Xbox 720" and they just keep pointing at each other, with some pointing at FUDzilla. Unfortunately I can't find the FUDzilla one on their website. So, I guess you can consider it a rumor - until I can read that FUDzilla article, haha.

As for the Sony stuff, it's just rumors circulating forums such as NG and the like. Sometimes those guys are dead on, sometimes they are far off; in the end it's just rumors. Their arguments are understandable but, at the same time, just speculative. I'd like to know who Sony signs with, though. If it is true that Tegra 2 is in the PSP2, maybe a combination deal was done for a GPU for the PS4. Wait and see, I guess.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,574
252
126
I tried to find the original source of the stories covering the ATI+MS deal for the "Xbox 720" and they just keep pointing at each other, with some pointing at FUDzilla. Unfortunately I can't find the FUDzilla one on their website. So, I guess you can consider it a rumor - until I can read that FUDzilla article, haha.

As for the Sony stuff, it's just rumors circulating forums such as NG and the like. Sometimes those guys are dead on, sometimes they are far off; in the end it's just rumors. Their arguments are understandable but, at the same time, just speculative. I'd like to know who Sony signs with, though. If it is true that Tegra 2 is in the PSP2, maybe a combination deal was done for a GPU for the PS4. Wait and see, I guess.

Thanks for the update! I remember some of that from Fudo now that you mention it. It will be interesting to see the way things turn out!
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I don't think it's some conspiracy, just that some people are true believers in their respective camps, and NV true believers are miffed that AMD seemingly gets less flak than NV--even if NV deserves it.

I have to agree with the NV true believers that some of the mudslinging is reaching. Yeah, NV is annoying when it comes to disabling PhysX when rival cards are present, or not helping with AA on AMD hardware in Batman, etc. But that's not that big of a deal in my book, compared to other stuff. Plus, wouldn't AMD do the same if they had the advantage? I said on another thread that it's possible that AMD would try to lock NV out of PhysX if AMD were the one with the rights and the larger historical market share, only for someone to tell me that AMD would never do that because it's all about open standards. (I don't know about that. Historically, companies with large market shares feel emboldened to set standards their way, whereas smaller-share competitors cry for open standards, in part because that would mitigate the large-share rival's ability to leverage its market share.)

Other stuff is more serious. NV true believers should understand that things like bumpgate, regardless of exactly how evil it was or wasn't, can seriously undermine consumer trust in a brand. Ditto with viral marketers that don't state who they are. Heck, even JHH acting like a jerk can be a serious detriment to the company's popularity despite it not being technically evil for him to be a jerk--I know that I for one was very turned off by JHH's "can of whoop ass" comments and intelsinsides.com and such. NV's price-fixing offer doesn't make the company look good, either, even if ATI went along with it, because it was NV making the offer, not the other way around, apparently.

For a long time I was a NV true believer, mainly due to bugs in Warcraft III on a Radeon chip* that made me feel that NV's drivers were superior and less buggy than ATI drivers. (When it came time to get a card again, I refused to even consider ATI, figuring that I'd rather pay extra to get a NV card just to avoid another driver mess like that. And in fact my NV GeForce 6800XT only glitched up once, ever, in the opening scene of NWN. So at least back then, NV had the better drivers for me, though ATI has caught up in single-GPU drivers since then.)

I left PC gaming for a couple of years, then returned and was going to buy a used 8800GT when I learned about bumpgate. Despite my loathing for ATI drivers, I was more concerned about prematurely dead hardware than the occasional driver glitch, so I held my nose and got a HD4670 to hold the fort till I got a Fermi card. To my surprise, it worked pretty well. Then Fermi was obviously going to be late, so I wound up getting a HD5850 which worked great.

Let me repeat that last part, because it's important: due to bumpgate, I reluctantly felt compelled to get a HD4670. My experience with that helped put me at ease with ATI's drivers, enough so that instead of waiting for Fermi I got a Cypress. It all goes back to bumpgate for me. If bumpgate didn't happen, I would probably STILL be insisting that NV drivers were better than ATI drivers and would STILL never consider an ATI card. In other words, I'd still be a true believer.

I'm sharing this story because if it was possible for me, a onetime true believer in NV, to de-convert, that should give true believers in NV some food for thought: some people don't like NV much, not because of merely annoying stuff like PhysX being locked out, but because of more serious things like the price-fixing attempt, NV's slow reaction to bumpgate, and JHH's arrogance driving them into the arms of the competition--just like how ATI's driver glitch drove me into the arms of NV in the first place.

I don't speak for everyone, but if NV wants to regain its standing, maybe the CEO ought to be more humble and make GPUs that don't run so hot. I have a long memory when it comes to hardware failure, and thanks to bumpgate, I would never buy a new or used GTX4xx unless it were watercooled or something. I'm glad that NV cleaned up its viral marketing though; at least I'm assuming that they did, as I hadn't even heard about that till this thread. And I haven't heard of NV price-fixing attempts since the settlement.

Anyway, neither NV or ATI is the devil, and as PC gamers we need both to be healthy and competitive in order to keep a lid on prices. In a way, though, I am happy there is some NV-ATI antagonism at the companies themselves, because I'd rather they hated each other than if they price-fixed their cards so that, say, GTX480s and HD5870s cost twice what they do today.

Scali decried me as AMD-biased but I think if he actually listened to people, he might understand better just why NV has lost a lot of popularity in recent times. It's not necessarily just about the small stuff like PhysX lockouts, which never factored into MY buying decisions, for instance.

Some of NV's current doghouse status has to do with speed, too. If NV had a de-flawed halo part and a price/perf lead up and down the entire product range, I bet NV would be forgiven really fast for its past transgressions. ;)

* Certain effects such as the blizzard spell were invisible. Meaning, I could not see where those storms occurred and thus walked into them a lot. It was very annoying, as you can imagine. Sometimes the will-o-wisp things would disappear, too. Ugh.

I don't understand the logic of the true believer; what I do believe in is what the products offer and what they do as a whole. That is the most important aspect for me.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
I'm not too sure about that. For example, PS3 already supports physics acceleration because of its Cell processor. But that doesn't do the PC market any good. PCs don't have Cell processors (you generally code consoles on the bare metal, not through some inefficient layer like OpenCL).
Bullet is one of the most popular options on the PS3 platform, but hardly any PC games use it. Doesn't look like there's much of an influence of consoles on the PC market.
Well, the PC ports are mostly from the Xbox version of the game, as that's the closest console to a PC. They generally have a single-threaded CPU physics implementation. The Cell, for all its hype, doesn't seem to be able to out-process the Xbox as far as games are concerned, so its effective physics power is about the same. As you say, neither uses their GPUs for physics. Hence the consoles effectively give PCs games with single-threaded CPU physics implementations.

And what if it is the other way around? Since nVidia already has a physics solution, perhaps that could weigh in on the decision to go with nVidia hardware for next-gen consoles?
I am assuming that the next-gen consoles will support GPGPU compute in hardware. The choices are obviously OpenCL, DirectX Compute, and Nvidia's CUDA. If they wanted to use CUDA that would obviously suit Nvidia, but really, will MS ever go with anything but DirectX Compute, or Sony with OpenCL?

Or, if AMD is chosen instead, and a physics solution is developed, is there any guarantee that even if the port it to PC (which Microsoft/Sony might not want, as it would take away one of the advantages of their consoles... like how Microsoft likes to keep Halo XBox-only), that they're going to let nVidia use it?
PhysX is just middleware that could run on top of any low-level GPGPU library if Nvidia wanted it to. I don't think the console makers ever have much control over middleware - they can *buy* exclusivity for some games but that's about it - e.g. GOW is Xbox-only, but it uses UE3 (which supports PhysX), and UE3 is available for both consoles and the PC.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
To me, it offers insight that there were talks about PhysX -- that AMD has real interest in PhysX or they wouldn't have any talks.

Well, aside from the fact that AMD contradicts itself in public communications, I would argue that they don't have real interest in PhysX, or else they would have come to a deal. I would say: they didn't try hard enough.
For all we know, the PhysX talks (if there were any; as I said, I'm not convinced, because Cheng denied them, and Cheng is less unreliable than Huddy) were just a scheme to put more pressure on Havok and get a better deal out of it.
Certainly AMD has never said anything even remotely positive about PhysX in the media.
If AMD was really interested, they'd go back to negotiating with nVidia after Havok fell through.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
I am assuming that the next-gen consoles will support GPGPU compute in hardware. The choices are obviously OpenCL, DirectX Compute, and Nvidia's CUDA. If they wanted to use CUDA that would obviously suit Nvidia, but really, will MS ever go with anything but DirectX Compute, or Sony with OpenCL?

MS doesn't really have a say in that. Just like they cannot stop Cuda on Windows. If they go with nVidia, then the hardware will support Cuda, and every developer is then free to link to Cuda libraries, and will probably opt for this, as it is a more advanced solution (easier to develop with) and will deliver better performance, because it is tailor-made for the hardware.

PhysX is just middleware that could run on top of any low-level GPGPU library if Nvidia wanted it to. I don't think the console makers ever have much control over middleware - they can *buy* exclusivity for some games but that's about it - e.g. GOW is Xbox-only, but it uses UE3 (which supports PhysX), and UE3 is available for both consoles and the PC.

They can also negotiate exclusivity of certain middleware as part of the hardware deal.
They probably cannot do that with PhysX anymore, because it is already an established standard on the PC. But for something new on AMD hardware? Heck, perhaps Microsoft would even have to develop it themselves (we know that AMD hasn't been able to produce any physics middleware even though they've been trying for years... Ever looked at the Bullet repository? There's basically one AMD guy contributing (Lee Howes), and he hasn't exactly done much... AMD also has not fixed the point-sprite bugs reported by Erwin Coumans in March(!)).
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
MS doesn't really have a say in that. Just like they cannot stop Cuda on Windows. If they go with nVidia, then the hardware will support Cuda, and every developer is then free to link to Cuda libraries, and will probably opt for this, as it is a more advanced solution (easier to develop with) and will deliver better performance, because it is tailor-made for the hardware.

I'd believe MS has a say in what hardware goes into the next Xbox. If they were to use an ATI-based GPU (which I believe is highly likely), they won't have access to CUDA.

Maybe, after taking a poll of developers on what CUDA/PhysX has delivered to the platform (very little at this point), they might not opt to support GPU acceleration in consoles at all. And if this is the case, I'd wager Nintendo and Sony will follow suit.

If these opinions were to come true, we'd face yet another generation of stagnation, with nVidia/ATI having to deliver extra polish to console ports.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
I'd believe MS has a say in what hardware goes into the next Xbox. If they were to use an ATI-based GPU (which I believe is highly likely), they won't have access to CUDA.

You might want to read my post more clearly.
"If they go with nVidia, then the hardware will support Cuda, and every developer is then free to link to Cuda libraries"
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
You might want to read my post more clearly.
"If they go with nVidia, then the hardware will support Cuda, and every developer is then free to link to Cuda libraries"

You might want to read your own post clearly.

Scali said:
MS doesn't really have a say in that.