*thread name change* Nvidia and AMD moral and immoral business practices


Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,250
136
They can also negotiate exclusivity of certain middleware as part of the hardware deal.
They probably cannot do that with PhysX anymore, because it is already an established standard on PC. But something new for AMD hardware, heck, perhaps Microsoft would even have to develop it themselves (we know that AMD hasn't been able to produce any physics middleware even though they've been trying for years... Ever looked at the Bullet repository? There's basically one AMD guy contributing (Lee Howes), and he hasn't exactly done much... AMD also has not fixed the point sprite bugs reported by Erwin Coumans in March(!)).

Disclaimer: This is not a personal attack on you, so don't take it that way. I'm just asking you as you seem to be an active participant in this thread. You seem knowledgeable and might have insight others want to read.

Just because AMD hasn't publicly announced anything yet doesn't mean they don't have something in the works. It's possible they are looking for a solution to the problem/equation that doesn't involve the massive framerate drops or the need for a dedicated card to get the benefits of PhysX-style physics eye candy. Maybe they are looking for a way to make physics truly change gameplay without the penalty, and feel that with a substantial performance hit, its time isn't now.

So what is your take on the news that Manju Hegde, Nvidia VP for PhysX and CUDA marketing and previously CEO and co-founder of PhysX inventor Ageia prior to its Nvidia buyout, has gone to AMD? http://www.bit-tech.net/news/hardware/2010/05/26/physx-founder-leaves-nvidia-for-amd/1

I was just wondering if you thought he did it for the money? We have been told time and time again that AMD doesn't have the funding that Nvidia has.

Do you think he did it because Nvidia killed his dream?

Do you think that Nvidia nuking the drivers, disabling PhysX when an AMD card is the primary GPU, was the last straw for him?

Do you think with his help AMD can jump into physics and make an impact?

Do you think that he knows PhysX is a thing of the past that has no future?

I know it's a lot of questions, but others can jump in and share their insight on what they think the answers would or could be, why they think he went to AMD, etc.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Well, aside from the part that AMD contradicts itself in public communications, I would argue that they don't have real interest in PhysX, or else they would have come to a deal. I would say: They didn't try hard enough.
For all we know, the PhysX talks (if any, as I said, I'm not convinced, because of Cheng denying them, and Cheng is less unreliable than Huddy) were just a scheme to put more pressure on Havok and get a better deal out of it.
Certainly AMD has never said anything even remotely positive about PhysX in the media.
If AMD was really interested, they'd go back to negotiating with nVidia after Havok fell through.

It's just that when there are features or abilities that may give one company a competitive advantage, the other may at times loudly try to fuel negative opinion and rip them apart. That is why these discussions were quiet, so to speak, to me.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
AMD does rely on x86, because AMD64 is an extension. It cannot exist without x86.
And Intel would be dead in the water without AMD64. It works both ways.
Also, any extension that AMD develops has to be offered to Intel free of charge (while AMD pays Intel a percentage for every x86-CPU sold). So the x86 cross-license is not THAT friendly.
I don't see why nVidia could not have a similar license for PhysX.
Could is far different than would, or does. Get this through your head: Nvidia does NOT offer any kind of cross-licensing agreement for PhysX, and absolutely does NOT offer ANY agreement to AMD so their GPUs can accelerate PhysX. Period. Done.
No they aren't. Heck, they wouldn't be licensing GPU technology from PowerVR or working on their own IGPs if they could use AMD's GPUs.
Dirk Meyer was asked directly during a conference call if the new agreement between AMD and Intel also included ATI's IP. His answer was a simple "yes". So again you are wrong, and did not bother to check the facts.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
The current hybrid CPU with a GPU core setup in the new 360 is similar to what Sony originally wanted with the PS3.

Just wanted to point out that this isn't in the realm of being close to accurate. What Sony was originally planning on doing with the PS3 is what Intel wanted to do with Larrabee. Fairly late in development they came to the realization that it was utterly moronic to a fairly shocking degree, and decided to go with a fast solution, which is how they ended up with a modified 7800GT instead of something more tailor-made for their system (Sony owns the IP on the GS, so they could have had nV make something like a G80+GS if they had given them more time).

The Cell, for all its hype, doesn't seem to be able to out-process the Xbox as far as games are concerned, so its effective physics power is about the same.

Compare the particle system in KZ2 to anything on the 360: it isn't even close. Cross-platform ports aren't going to see the Cell pushed; the platform-exclusive titles show something rather different.

I am assuming that the next-gen consoles will support GPGPU compute in hardware. The choices are obviously OpenCL, DirectCompute, and Nvidia's CUDA. If they wanted to use CUDA, that would obviously suit Nvidia, but really, will MS ever go with anything but DirectCompute, or Sony with OpenCL?

Sony will go with whatever gets them closest 'to the metal' in terms of code optimizations. I'm sure OpenCL will be ported to the PS4, but Sony will encourage devs to use whatever is going to maximize performance. It doesn't matter if this is AMD, nVidia, PowerVR or Sony in house, they will push devs to use whatever gets them the optimal results. MS is the company that is going to push for ease of development.
 

zebrax2

Senior member
Nov 18, 2007
977
69
91
MS doesn't really have a say in that. Just like they cannot stop Cuda on Windows. If they go with nVidia, then the hardware will support Cuda, and every developer is then free to link to Cuda libraries, and will probably opt for this, as it is a more advanced solution (easier to develop with) and will deliver better performance, because it is tailor-made for the hardware.

This is different though. The Xbox is MS's closed system, and if MS hires nVidia to create a GPU and is presented with a GPU containing a feature they don't want, they will surely not approve the design unless said feature is disabled.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
I have a question for the people in this thread who are on the AMD side of the argument. What if the 6xxx series supports PhysX, or something very similar to PhysX? Will the hate towards GPU-accelerated physics suddenly disappear because AMD now supports it?

Rumors state that the 6xxx series will support 3D. If that is true, did Nvidia suddenly change from only making proprietary stuff to having done good deeds for gaming since the beginning, since all games/apps that support 3D will work on AMD's 3D, and many of them were actually aided by TWIMTBP on this? If it is true, then it will become easy to spot who the little red riding hood is, as lots of people here in this forum have claimed that 3D is only a gimmick. I can't wait to see their sudden change in opinion about it, very soon.

Maybe I am blind because I don't see the things others do. Is it really that hard for AMD to support PhysX? Is it hard for AMD to contribute more to the GPU offload ideas instead of just letting people figure it out for them? On one hand, some claim that it is due to Nvidia's evil proprietary business scheme, yet on the other hand, any explanation of why AMD can't do it themselves is completely missing.
It is true that "AMD can't support PhysX because it is proprietary to Nvidia." It isn't true that "AMD can't do something as good as, or even better than, Nvidia on GPU-accelerated physics because Nvidia is evil."
 

zebrax2

Senior member
Nov 18, 2007
977
69
91
I have a question for the people in this thread who are on the AMD side of the argument. What if the 6xxx series supports PhysX, or something very similar to PhysX? Will the hate towards GPU-accelerated physics suddenly disappear because AMD now supports it?

Rumors state that the 6xxx series will support 3D. If that is true, did Nvidia suddenly change from only making proprietary stuff to having done good deeds for gaming since the beginning, since all games/apps that support 3D will work on AMD's 3D, and many of them were actually aided by TWIMTBP on this? If it is true, then it will become easy to spot who the little red riding hood is, as lots of people here in this forum have claimed that 3D is only a gimmick. I can't wait to see their sudden change in opinion about it, very soon.

Maybe I am blind because I don't see the things others do. Is it really that hard for AMD to support PhysX? Is it hard for AMD to contribute more to the GPU offload ideas instead of just letting people figure it out for them? On one hand, some claim that it is due to Nvidia's evil proprietary business scheme, yet on the other hand, any explanation of why AMD can't do it themselves is completely missing.
It is true that "AMD can't support PhysX because it is proprietary to Nvidia." It isn't true that "AMD can't do something as good as, or even better than, Nvidia on GPU-accelerated physics because Nvidia is evil."

No one really dislikes GPU-accelerated physics; what many dislike is the segregation created by GPU-accelerated PhysX working only with, or in tandem with, nVidia GPUs.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
No one really dislikes GPU-accelerated physics; what many dislike is the segregation created by GPU-accelerated PhysX working only with, or in tandem with, nVidia GPUs.
No, that is not the case. PhysX is attractive to game developers: it is free to implement, and it is not difficult to get support from Nvidia on implementation. However, people need to be paid, and therefore gamers need to pay for this one way or the other. If Nvidia put license fees on the implementation of PhysX, the cost would be embedded in the cost of the game itself. And since AMD does not support PhysX, it would unfortunately not be attractive to developers that way.

Many see licensing PhysX from Nvidia as meaning being under Nvidia's control. That is false. Let's say you bought a video card from AMD that supports PhysX but you have problems with games: would you believe it is due to AMD's failed implementation, or to Nvidia's dirty little tricks?

It should be clear that, although video cards all more or less do the same thing, there are key differences in GPU design which can affect performance in different fields. AMD's shader processors are physically smaller than CUDA cores. Is it really hard to believe that they don't handle every task the same way?

The key question in whether AMD should acquire a license from Nvidia isn't "oh, then AMD will be under Nvidia." In fact, the "bumpgate" incident clearly showed that if something causes problems, its creator is held responsible. If there were problems after AMD acquired PhysX from Nvidia, AMD would sue Nvidia without a second thought. More importantly, people would automatically believe it was some trick of Nvidia's. As you can see, the risk to Nvidia would far exceed the risk to AMD should AMD acquire PhysX from Nvidia.

On the surface, AMD would only stand to gain from it. However, the shader processors would have to be modified to support PhysX, which would cost far more than the license fee. Note that PhysX support was one of the fundamental considerations in designing the CUDA cores. That is not the case for AMD's design, which therefore might not perform equally well. Should AMD pay for something, and modify its core design at high cost, only to end up underperforming its competitor? As of now, AMD chooses not to. The reason may be, as some have pointed out, a lack of funding.

The other way to achieve the same effect is to hire people who know PhysX and recreate it as, say, "PhysY". That way, PhysY would be engineered around what shader processors do best, minimizing the cost of modifying the design and maximizing performance. The downside of this approach is that it won't work with games that currently support PhysX. The programming changes might be very small, but they would have to be built in at the game design phase and would require direct support from AMD. And even if AMD does this, it will be known as a copy of PhysX, which is no better than acquiring PhysX directly in terms of perception.

It is just a business decision. It depends on whether it will benefit sales. There is no point in supporting PhysX if it does not add to the value of the video card. That doesn't mean AMD or Intel should simply ignore it should they choose not to support it; something must be done, and what should be done is the key to this problem.

If people could take off their colored glasses for a second, they would realize that "gaming" really isn't worth mentioning when it comes to such decision making.

The whole idea behind Fusion isn't that it will one day take over the discrete video card, but that it will take over the console box. That is logical, and far more beneficial than supporting PhysX from a competitor. As long as PhysX runs on Fusion, which it will, it really doesn't matter whether the GPU is accelerating it or not. It would be worth a talk, but not a game-breaking deal. Again, this isn't about gaming, it is about sales.

We have spoken a lot about rival advertising. Both vendors, as well as other companies, use it. The downside of such advertising is that it can backfire, and when it does, it does the company far more damage than good. Even if the advertiser doesn't reveal its identity, it will be spotted as a fanboy/fangirl very quickly. Unlike individuals, who can jump ship at any time and say whatever we want, what they say is bound by a cheque, which makes them easy to spot. Of course they can work in groups to further hide their identities, but they cannot avoid being seen as a fan group.

The talk in this thread is directed more at Scali and Keys than at the issue itself.
 

zebrax2

Senior member
Nov 18, 2007
977
69
91
1.) If AMD licensed PhysX, I'm pretty sure they would rather port it to OpenCL or Stream than copy nVidia's design to get CUDA to work.
2.) They can't just point the finger at nVidia and then sue them without any proof.
3.) They have another option, which is to support another API; this is what they are doing right now. It's not like every developer out there is already so set on using PhysX that their only option is to use it or copy it.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
In the license it says changes can't be made unless nVidia approves. PhysX is using Cuda --- nVidia would have to approve or offer permission for AMD to port to OpenCL, from my understanding.
 

Madcatatlas

Golden Member
Feb 22, 2010
1,155
0
0
It seems to me that Nvidia is starting to look like a sinking ship. And I say this because of the point I want to make, not because of its status as of today. Who wants to get aboard a sinking ship?

In my opinion, Intel and AMD have a "we both need each other" approach in their market segments.
This will include graphics. If AMD needs or wants to push physics, it will likely be in cooperation with Intel. And both will benefit from it.
AMD may also go with Bullet to entice the public and get some more people interested in their hardware, and then ditch it completely.

Games like Crysis 2 are going to have an influence on which direction AMD goes.
 

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
No, that is not the case. PhysX is attractive to game developers: it is free to implement, and it is not difficult to get support from Nvidia on implementation. However, people need to be paid, and therefore gamers need to pay for this one way or the other. If Nvidia put license fees on the implementation of PhysX, the cost would be embedded in the cost of the game itself. And since AMD does not support PhysX, it would unfortunately not be attractive to developers that way.

You seemed to ask a simple question of whether people want GPU physics just so you can go into the business practices of why you think AMD doesn't have it. I disagree with most of your assessment of why AMD doesn't have PhysX but none of that really addresses the question you asked.


"What if 6xxx supports PhysX, or something very similar to PhysX, will the hate towards GPU accelerated physics suddenly disappear because AMD now supports it?"

You clearly don't understand where anti-PhysX people are coming from, but I'll answer this as someone who, in your eyes, 'hates GPU-accelerated physics'. I do think GPU-accelerated PhysX is mostly useless as it stands now, but I do not hate GPU physics or GPU PhysX. It has nothing to do with AMD fanboy-ism and everything to do with being a gamer.

Developers themselves have little interest in writing much more difficult GPU-accelerated physics code right now, because IP limitations keep the market base too small to demand it.

There's a reason nVidia engineers are the ones who have been adding the GPU accelerated PhysX effects to games after the fact. They already understand how to do it and it's in their interest to add it since they have the sole capability to process it. Unfortunately we end up with effects that often don't feel organic to the game because the game was already complete with physics before they were added.

When the nVidia engineers come in they look for a few effects they can add to showcase physics. We usually get a short checklist of the same effects (shells, dirt, flags) added while being incredibly inefficient codewise because they have the convenience of having 1.4 billion transistors with nothing to do but process some shells hitting the ground.

This makes it impossible to run these few effects as written on a CPU, but they could be replicated convincingly enough on a multi-core CPU (using approximation shortcuts, the way the 0x5f3759df inverse square root trick does for lighting) if nVidia were trying to make efficient effects. And because nVidia doesn't have the capability to integrate the GPU effects more deeply into the game, we will only ever see tacked-on effects.
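
[Editor's note: for anyone who doesn't recognize the constant, 0x5f3759df is the magic number from the classic fast inverse square root trick, used to cheaply normalize lighting vectors. A minimal C sketch of that kind of approximation shortcut:]

    #include <stdint.h>
    #include <string.h>

    /* Classic fast inverse square root (the 0x5f3759df trick referenced
       above): trades exactness for speed, the kind of shortcut a CPU
       implementation of these effects would lean on. */
    static float fast_rsqrt(float x)
    {
        float half = 0.5f * x;
        uint32_t i;
        memcpy(&i, &x, sizeof i);         /* reinterpret the float's bits */
        i = 0x5f3759df - (i >> 1);        /* magic initial estimate */
        memcpy(&x, &i, sizeof x);
        return x * (1.5f - half * x * x); /* one Newton step, ~0.2% error */
    }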


I would be pleased if AMD supported a physics solution, and it has nothing to do with AMD vs nVidia. It has to do with market uniformity, so developers will have a platform worth spending the extra effort on. Just like I wanted nVidia to get DX11 after AMD hit the market with it. Now, whether it's worth GPU processing time when multi-core CPUs have more and more free resources to use for said effects is something I'm still unsure about.

The whole idea behind Fusion isn't that it will one day take over the discrete video card, but that it will take over the console box. That is logical, and far more beneficial than supporting PhysX from a competitor. As long as PhysX runs on Fusion, which it will, it really doesn't matter whether the GPU is accelerating it or not.

When PhysX code is written to take advantage of parallel processing (GPU acceleration), it is a whole other set of code. You can't just run the CPU code on a GPU or on the GPU core of Fusion. Even if you port it to CUDA or OpenCL so that it can run, you still have to rewrite it to take advantage of the parallelism.

So PhysX code running on Fusion will just be CPU PhysX unless PhysX is ported to OpenCL and the code is written to take advantage of parallel processing. If it is written that way, it will be able to run on discrete AMD Radeons as well as on Fusion. So the issue is going to be licensing and control of IP, not capability of hardware.
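
[Editor's note: a toy illustration of the rewrite being described, using a hypothetical particle integrator; the names and struct are made up for the sketch. The CPU version is one sequential loop, while the GPU version is a separate OpenCL C kernel in which the loop disappears and each body becomes one work-item. Host setup (context, buffers, kernel launch) is omitted.]

    #include <stddef.h>

    typedef struct { float pos[3], vel[3]; } Body;  /* hypothetical state */

    /* CPU-style physics: one thread walks every body in sequence. */
    void integrate_cpu(Body *b, size_t n, float dt)
    {
        for (size_t i = 0; i < n; ++i) {
            b[i].vel[1] -= 9.81f * dt;             /* gravity */
            for (int k = 0; k < 3; ++k)
                b[i].pos[k] += b[i].vel[k] * dt;   /* integrate position */
        }
    }

    /* GPU-style physics: OpenCL C kernel source the host would compile
       and launch over n work-items. The outer loop is gone; each body
       is one work-item. This is a rewrite, not a recompile. */
    const char *integrate_kernel_src =
        "typedef struct { float pos[3], vel[3]; } Body;        \n"
        "__kernel void integrate(__global Body *b, float dt) { \n"
        "    size_t i = get_global_id(0);                      \n"
        "    b[i].vel[1] -= 9.81f * dt;                        \n"
        "    for (int k = 0; k < 3; ++k)                       \n"
        "        b[i].pos[k] += b[i].vel[k] * dt;              \n"
        "}                                                     \n";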


Of course it would be ridiculous for there *not* to be an OpenCL port of PhysX to use the extra resources available on Fusion but it remains to be seen if anyone is going to do that.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
...
You clearly don't understand where anti-PhysX people are coming from, but I'll answer this as someone who, in your eyes, 'hates GPU-accelerated physics'. I do think GPU-accelerated PhysX is mostly useless as it stands now, but I do not hate GPU physics or GPU PhysX. It has nothing to do with AMD fanboy-ism and everything to do with being a gamer.
...
Anti-PhysX is a good word. You basically have three points. The first is that PhysX is mostly useless. The second is that it runs better on a GPU than on a CPU, and that the actual PhysX coding always gets retrofitted at the end stage of the development cycle. The third is that game developers have no interest in implementing PhysX, as it isn't a generic solution that will run on all platforms. In other words, you don't like PhysX because it is proprietary.

Funny how you mention that PhysX takes advantage of the GPU; how is that useless, when the video card often doesn't have much to do when playing console ports? The facts show that PhysX doesn't bring FPS down much, considering the video card is actually doing graphics plus PhysX. Another fact is that when PhysX runs on the GPU, its performance is better than when it doesn't. So how did you come to the conclusion that it is mostly useless? Are you trying to say that all GPU-accelerated programs are useless?

Your second claim is a direct contradiction of your first point. You know that there are things a GPU does better than a CPU, and PhysX shows that. You said that developers have little interest in spending much work on PhysX because of IP, which limits its market potential. However, you didn't mention that everything from the console box up to the high-end PC is capable of running PhysX code. The only difference is that, with Nvidia video cards, most of the load is moved to the video card instead of the CPU. That means not a single byte is wasted when an Nvidia video card is not present; non-Nvidia video cards simply will not be utilized. In other words, PhysX works on most platforms, with or without hardware from Nvidia.

Check out Dragon Age from EA; the way it uses PhysX is different from the others. No flying papers, but the way spells are displayed. You can freeze time and move the camera just to look at how beautiful the spells are. In 3D, you can clearly see the spells are built from little particles surrounding the caster. This is the kind of thing PhysX can do if the developers put hard work into it. Again, people who are not using an Nvidia video card will see the same effect, but with an Nvidia video card, a weaker CPU can perform as well as a better CPU. Either way, PhysX shows cutting-edge graphics to everyone. So what is with the IP limitation again?

In short, PhysX is free and works on most platforms, and with an Nvidia video card it actually utilizes those unused CUDA cores to maximize the potential of the video card and offload the CPU. I didn't make this up; this has been the fundamental idea of the PPU since Ageia's time.
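
[Editor's note: the model described in the last two paragraphs, one code path for the game with GPU offload only when supported hardware is present, can be sketched roughly as below. The function names are hypothetical, not the real PhysX SDK API.]

    #include <stdbool.h>

    /* Hypothetical capability probe and backends, for illustration only. */
    static bool cuda_gpu_present(void)       { return false; } /* stub: probe the driver here */
    static void simulate_step_gpu(float dt)  { (void)dt; }     /* batched work on CUDA cores */
    static void simulate_step_cpu(float dt)  { (void)dt; }     /* same math on CPU threads */

    /* The game calls one function; the runtime picks the backend. The
       title runs everywhere, and Nvidia hardware simply offloads the CPU. */
    void simulate_step(float dt)
    {
        if (cuda_gpu_present())
            simulate_step_gpu(dt);
        else
            simulate_step_cpu(dt);
    }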

I mentioned Fusion and claimed that it is a good move because if it can dominate the game console, then no one with a console will see the game on an Nvidia video card unless they play the PC version, which is what happens now. Did you ever ask why the PS3 doesn't have GPU acceleration and how it would be better off if it did? No, because that option is not open. It is the manufacturer of the PS3 who decides which vendor to use, and they chose ATI.

Again, it is a business decision.
 

Madcatatlas

Golden Member
Feb 22, 2010
1,155
0
0
OK Seero, you have your mind set on your own conclusion. I'm happy for you.

As NoQuarter said, and it needs to be said again: PhysX "haters" are not physics "haters". "Haters" is your word for it, Seero; I'm just using it to help you follow the argument.

Physics will never be the be-all and end-all of a game. Gameplay is and will be.

If you can make a great game with great gameplay and at the same time incorporate physics effects which everybody can enjoy, you simply have a winning product.

PhysX will not take off. Physics might.
 

amenx

Diamond Member
Dec 17, 2004
4,521
2,857
136
Originally Posted by Aristotelian
"Do we think that a thread like "Nvidia and AMD moral and immoral business practices" deserves its own thread? I think it does. A number of people have a lot to say about that, and perhaps we can put the halos in place, or bury them as is necessary after a topic like that."

I think this was a great idea and since we are knee deep in this discussion, I thought I'd just keep it going.

Special thanks to Aristotelian for the great idea.
I think Nvidia by far employs the most immoral, pathetic business practices to date in the GPU business. Buuttt... my last 5 or 6 cards have all been Nvidia, and I will continue to buy from them as long as I see they have the product that appeals to me most.

Moral or ethical issues in hardware choices amount to the biggest zero of a non-consideration in my book. What's funny is that I've seen this argument thrown around as justification for owning a particular hardware component (let's say a GPU) while the person making the argument might have an Intel CPU. And as we all know, no one has engaged in dirtier business practices than Intel; they've even had to dish out millions to make amends to the competition. But it's OK to own an Intel while drawing the line on Nvidia for moral or ethical considerations. Seriously, I have seen such people making such arguments.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
OK Seero, you have your mind set on your own conclusion. I'm happy for you.

As NoQuarter said, and it needs to be said again: PhysX "haters" are not physics "haters". "Haters" is your word for it, Seero; I'm just using it to help you follow the argument.

Physics will never be the be-all and end-all of a game. Gameplay is and will be.

If you can make a great game with great gameplay and at the same time incorporate physics effects which everybody can enjoy, you simply have a winning product.

PhysX will not take off. Physics might.
PhysX has been taking off for a while now.
Statistics from 2009
 

busydude

Diamond Member
Feb 5, 2010
8,793
5
76
PhysX has been taking off for a while now. Statistics from 2009

Titles are divided into categories according to their metascore: Third-rate or specific (metascore <50, or not listed in the metacritic.com database at all), Decent (metascore 50-70), Good (metascore 71-85) and Excellent (metascore >85).
titles_rating_graph.jpg
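
[Editor's note: restating the quoted cutoffs as code, purely to make the bucket boundaries explicit; the bucket names are from the quoted article.]

    /* Maps a metascore to the article's rating buckets. */
    const char *metascore_category(int score, int listed_on_metacritic)
    {
        if (!listed_on_metacritic || score < 50) return "Third rate or specific";
        if (score <= 70) return "Decent";
        if (score <= 85) return "Good";
        return "Excellent";
    }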
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
Titles are divided into categories according to their metascore: Third-rate or specific (metascore <50, or not listed in the metacritic.com database at all), Decent (metascore 50-70), Good (metascore 71-85) and Excellent (metascore >85).

That doesn't contradict what I said. PhysX has taken off.
numb_released_titles.jpg

titles_release_dynamics_graph_year.png

None of the Xbox 360, PS3 or Wii contain an Nvidia video card, yet the platform distribution shows that PhysX is used on those platforms.
platform_distribution_graph.jpg


platform_distribution.png

Based on these graphs, can you say that PhysX is as proprietary as people say it is?
I won't say that PhysX will eventually win out, but it is certainly taking off.
 

busydude

Diamond Member
Feb 5, 2010
8,793
5
76
I won't say that PhysX will eventually win out, but it is certainly taking off.

I was not contradicting you. In fact, I agree with you that PhysX adoption has seen good growth in the past two years, but most of the games that use PhysX are third-rate... and those do not have high-volume sales.

The article you posted has some very good info regarding the adoption of Physics technologies and they even explained why PhysX is gaining popularity, but we have no idea how many users playing the games actually USE these PhysX features.

Thanks to its free license and rich feature set, the PhysX SDK, preferred by small teams, is dominating the PC market. Currently the PhysX SDK is widely adopted by Russian (mostly trash games) and Korean (mostly specific MMOs) developers. Not to mention that the PhysX SDK is the default physics solution for Unreal Engine 3, used in the majority of UE3-based titles (Gears of War, Mass Effect, etc). The year 2009 brought some popular games, like Dragon Age: Origins, Overlord 2 or Risen, into the PhysX library.
It is clear that the PhysX SDK is free for developers to use, but that is not the case for end users, who will be locked out just because they do not have an Nvidia-based card.

I am also not happy with the performance hit users get when PhysX is enabled. The effects are evident, but to experience these effects one has to sacrifice smooth playability.

I choose a better gameplay experience over the dismal performance when PhysX is enabled.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
You might want to read your own post clearly.

You might want to stop pulling things out of context; it's poor form.
Unless you want to argue that Microsoft would use Cuda as the criterion on which they choose a GPU (as in choosing against nVidia BECAUSE they have Cuda)?
But that's nonsensical enough to not even go there, so really I don't know where you are even trying to go with this, other than just trolling me yet again.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
So what is your take on the news that Manju Hegde, Nvidia VP for PhysX and CUDA marketing and previously CEO and co-founder of PhysX inventor Ageia prior to its Nvidia buyout, has gone to AMD? http://www.bit-tech.net/news/hardware/2010/05/26/physx-founder-leaves-nvidia-for-amd/1

I have no idea why, and I think it's useless to speculate on it. Not sure where this would lead anyway (other than the fact that you seem to mostly try to interpret this as being a sign of nVidia treating him badly and such).
 

Scali

Banned
Dec 3, 2004
2,495
0
0
And Intel would be dead in the water without AMD64. It works both ways.

Not really, Intel still has regular x86 and Itanium. AMD literally has nothing if they don't have x86.

Could is far different than would, or does. Get this through your head: Nvidia does NOT offer any kind of cross-licensing agreement for PhysX, and absolutely does NOT offer ANY agreement to AMD so their GPUs can accelerate PhysX. Period. Done.

Stop being so aggressive. I said could, and obviously I know the difference between could, would or does.

Dirk Meyer was asked directly during a conference call if the new agreement between AMD and Intel also included ATI's IP. His answer was a simple "yes". So again you are wrong, and did not bother to check the facts.

It includes some of ATi's IP (which Intel had already been licensing before ATi was part of AMD, just as Intel and pretty much all other IGP/GPU makers license technology from nVidia), but it doesn't give Intel full use of AMD's GPU technology. You will agree with me that this is not the same thing, right, mister could-would-does?
I know my facts, don't worry.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
This is different though. The Xbox is MS's closed system, and if MS hires nVidia to create a GPU and is presented with a GPU containing a feature they don't want, they will surely not approve the design unless said feature is disabled.

You cannot disable Cuda, as Cuda *is* the GPU architecture.
Without Cuda, there is no OpenCL, no DirectCompute, and no SM5.0.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
3.) They have another option, which is to support another API; this is what they are doing right now. It's not like every developer out there is already so set on using PhysX that their only option is to use it or copy it.

Thing is, nVidia acquired Ageia in February 2008. They then released the first working end-user runtime for GeForce cards in August 2008. So it took them about 6 months to add GPU-acceleration to PhysX from scratch.

AMD has now been 'working' with Havok for about 2 years, and they have nothing to show for it. Not even any beta drivers.
Common sense would dictate that it shouldn't take this long to pull off. Even if nVidia's development team is better, AMD has already taken four times as long and still has basically nothing. There must be something else going on.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
It includes some of ATi's IP (which Intel had already been licensing before ATi was part of AMD, just as Intel and pretty much all other IGP/GPU makers license technology from nVidia), but it doesn't give Intel full use of AMD's GPU technology.
I know my facts, don't worry.
That's not what Dirk said, and no, you don't know your facts. What you consider facts is what fits your agenda. Dirk talked about sharing IP, which included ATI tech. This is part of the new agreement between AMD and Intel. It has nothing to do with the IP Intel licensed from ATI before they were acquired by AMD.

Comprehension fail again on your part.