nVidia disables PhysX when ATI card present in Win7

Page 5 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: Qbah
Yes, it is a vendor lock-in. But it's their technology, they own it, it's not an open standard. They can do whatever they want with it.

That's not what they're claiming in that letter:

Hello JC,

I'll explain why this function was disabled.

Physx is an open software standard any company can freely develop hardware or software that supports it.

But you're right, it doesn't sound very open to me either.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
One day PhysX is irrelevant, the next people are up in arms at the minuscule % of people who run an ATI + Nvidia card not being able to use the Nvidia card for PhysX.

 

AmberClad

Diamond Member
Jul 23, 2005
4,914
0
0
Originally posted by: BenSkywalker
nVidia's standalone physics cards are reportedly still working with ATi boards so yes, it is very much exactly the same thing.
Where would I go to buy a standalone physics card these days, if I had an ATI video card and wanted that option? The ATI standalone tuners are a current product, and still very much in production and widely available.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Originally posted by: BenSkywalker

It worked with V2s without issue, nothing more recent.
V2s? You mean a Voodoo 2? LMFAO. You must be joking.

That's a 3D-only video card; it doesn't even have hooks into 2D, and hence has a completely different driver model to a card with a 2D engine. You simply cannot infer that just because something worked with a Voodoo 2, it'll work with another card. And just because something doesn't work, that doesn't mean the driver is actively blocking it.

Let's try it another way: please provide evidence that ATi's drivers are actively blocking features when they detect a competitor's card in your system.

nV GPU for PhysX with an ATi GPU has never been supported by nVidia officially.
No one was claiming it was supported officially. The claim was that it worked before, and now it's being actively blocked when a competitor's card is in the system.

This is exactly the same thing that happened with SLI on non-nVidia chipsets, and CF on nVidia chipsets. It once worked, and then it stopped working because nVidia started blocking it.

That would be BIOS level, and up to the vendors.
No, it's not up to the vendors. ULi was shut down by nVidia because they did precisely what you claim and were releasing motherboards that supported SLI and CF. nVidia is controlling things here, not the OEM.

Again, show me recent benchmarks of CF running on nVidia chipsets and I'll be happy to look at them. The fact is you can't, even though an nVidia chipset that supports CF would be a tremendous advantage over other nVidia chipsets that can't, and any OEM would jump at the chance to implement such a feature if they could.

Also:

http://arstechnica.com/hardwar...tel-x58-mainboards.ars

NVIDIA has agreed to authorize support for its SLI technology specifically on Intel X58 motherboards. This authorization does not require the use of NVIDIA's nForce 200 chip, reports Digitimes.
Why do we need authorization from nVidia if all we need is the right BIOS from vendors like you claim?

Where are all those SLI setups running on Intel chipsets prior to nVidia authorizing it? Or what, are you claiming no OEM had an interest in releasing a BIOS that allowed SLI on Intel chipsets before this announcement? If you are, that's simply nonsensical.

Intel also had issues with CF boards despite following specs, they worked out the issues through the BIOS and got it working however.
And?

nVidia's standalone physics cards are reportedly still working with ATi boards so yes, it is very much exactly the same thing.
So? What does any of this have to do with Amber's example disproving your claim that ATi's drivers are disabling functionality on TV tuner cards?
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Originally posted by: Genx87
One day PhysX is irrelevant, the next people are up in arms at the minuscule % of people who run an ATI + Nvidia card not being able to use the Nvidia card for PhysX.
PhysX is currently as irrelevant to me as CF and SLI. But that doesn't mean that I don't recognize the importance of competition and choice being good for the consumer.

Again, what would you do if Intel did this with Havok? Build two separate rigs, one to run Havok and the other to run PhysX? That's exactly what you had to do in the past if you wanted to go from SLI to CF, or vice versa.

Artificial vendor lock-in, when done for no good reason other than to restrict the competition, is bad for the customer.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Originally posted by: BFG10K
Originally posted by: Genx87
One day PhysX is irrelevant, the next people are up in arms at the minuscule % of people who run an ATI + Nvidia card not being able to use the Nvidia card for PhysX.
PhysX is currently as irrelevant to me as CF and SLI. But that doesn't mean that I don't recognize the importance of competition and choice being good for the consumer.

Again, what would you do if Intel did this with Havok? Build two separate rigs, one to run Havok and the other to run PhysX? That's exactly what you had to do in the past if you wanted to go from SLI to CF, or vice versa.

Artificial vendor lock-in, when done for no good reason other than to restrict the competition, is bad for the customer.

There are lock-ins in every market. It is a tool manufacturers use to keep you as a customer. I can't buy an AMD chip and use an Intel chipset. I can't buy an Audi and get OnStar. My wife uses SanDisk Rhapsody, which only works with SanDisk MP3 players. Apple's iTunes works best with Apple products. If we got up in arms about every lock-in we would lose our minds. This isn't uncommon and it won't be the last time a vendor does this to keep customers.

The number of people this affects is so small that only on this msgboard can it generate 6 pages of faux outrage. I got the impression early on that physics on the GPU was going to be a lock-in program, back when ATI and Nvidia hyped it up over the PPU in ~06. Both companies made it very clear they wanted you to use a second video card from the same manufacturer dedicated to physics. It is a marketing tool right now to drive second video card sales for the same box.

It sucks for people using an ATI card, but it doesn't surprise me in the least. What does Nvidia get out of you paying ATI top dollar for a graphics card and then buying the sub-$100 Nvidia card to generate physics?

And I won't be surprised in the least when/if Intel delivers Havok on a GPU and it locks out the competition as well, possibly even ATI.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: Genx87
There are lock-ins in every market. It is a tool manufacturers use to keep you as a customer. I can't buy an AMD chip and use an Intel chipset. I can't buy an Audi and get OnStar. My wife uses SanDisk Rhapsody, which only works with SanDisk MP3 players. Apple's iTunes works best with Apple products. If we got up in arms about every lock-in we would lose our minds. This isn't uncommon and it won't be the last time a vendor does this to keep customers.

Your examples are flawed. A closer analogy would be if you had purchased an Audi and paid for OnStar, but then one day Audi said, "We've had a falling out with OnStar, so we're going to disable OnStar in all our new and used vehicles. No, there's nothing wrong with the hardware itself, we just don't want you using it anymore."


Originally posted by: Genx87
It sucks for people using an ATI card, but it doesn't surprise me in the least. What does Nvidia get out of you paying ATI top dollar for a graphics card and then buying the sub-$100 Nvidia card to generate physics?

A higher PhysX adoption rate, for one.


I swear, Nvidia is run by temperamental 12 year olds. I've never heard of a company that makes this many controversial marketing decisions.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Your examples are flawed. A closer analogy would be if you had purchased an Audi and paid for OnStar, but then one day Audi said, "We've had a falling out with OnStar, so we're going to disable OnStar in all our new and used vehicles. No, there's nothing wrong with the hardware itself, we just don't want you using it anymore."

My analogy is fine. OnStar is a value-add lock-in for GM products. I have no way to purchase OnStar for an Audi. Lock-ins are a given in any market.

A higher PhysX adoption rate, for one.

Considering their share of the standalone GPU market, the number of people who purchase an ATI card alongside an Nvidia card won't make a whole lot of difference.

I swear, Nvidia is run by temperamental 12 year olds. I've never heard of a company that makes this many controversial marketing decisions.

You ever take a business or marketing class? Lock-ins are part of the curriculum for advancing your business. From a business perspective, right now it makes sense for Nvidia to disable the ability to have PhysX with a competitor's card whose maker has no plans to support your standard. Until Microsoft gets off their ass or Nvidia's market share plummets to ATI levels, we will see these types of marketing games.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Originally posted by: BFG10K
Originally posted by: SirPauly

What do we really know? Ben Skywalker did bring up a solid point.
What are you talking about? We know PhysX has absolutely nothing to do with ATi cards, given there's no current mechanism to run it on them.

It may be technical in nature, and nVidia may feel they can't invest the resources to make sure current and future PhysX content runs smoothly on other IHVs' GPUs.
There's no such requirement, given PhysX doesn't run on other IHVs' GPUs. You're arguing a scenario that doesn't even exist and trying to propose it as the "reason".

And even if it was (which it isn't), we're talking about running PhysX on nVidia GPUs anyway.

Here is Ben's quote:

Originally posted by: BenSkywalker
Are the Ageia PPUs still working? Can you dedicate one card from a SLI setup to PhysX anymore?

I'm looking over some things coming down the pipe for PhysX; seems like it would be a nightmare to have cross-chip communication involved.

There may be examples where future PhysX content takes development expense, and nVidia may not desire to spend resources here to make sure it runs fine with ATI products in general.

I'm not going to simply assume a blanket agenda and pure leverage without knowing more data. I'm not smart enough to do that. Before I point fingers I personally need more data, sorry. There may be many layers to this decision.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: Genx87
My analogy is fine. OnStar is a value add lock-in for GM products. I have no way to purchase OnStar for an Audi. Lock-ins are a given in any market.

Yes, but PhysX was already working in Win7 with an ATI card installed in the system. Nvidia introduced an artificial limitation. Substitute "GM" for "Audi" in my previous analogy if you like. The hardware is there, it was previously working, but one company decided it didn't want you using it anymore.


Originally posted by: Genx87
Considering their market in stand alone GPU's. The amount of people who purchase an ATI card with an Nvidia card wont make a whole lot of difference.

So why did Nvidia spend the time and money purchasing Ageia and developing PhysX if they don't care whether it gets adopted or not? PhysX hasn't exactly lived up to the hype it was released with. Nvidia hardly seems to be in a position to be cutting off potential users if they expect it to amount to anything.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
So why did Nvidia spend the time and money purchasing Ageia and developing PhysX if they don't care whether it gets adopted or not? PhysX hasn't exactly lived up to the hype it was released with. Nvidia hardly seems to be in a position to be cutting off potential users if they expect it to amount to anything.

Do you really believe this combo of cards represents a sizeable enough market to affect PhysX adoption? If so, why? Nvidia holds the top crown 2:1 in standalone device market share. The only thing holding them back right now is game developers pushing out code.
 

Mem

Lifer
Apr 23, 2000
21,476
13
81
Originally posted by: Genx87
Your examples are flawed. A closer analogy would be if you had purchased an Audi and paid for OnStar, but then one day Audi said, "We've had a falling out with OnStar, so we're going to disable OnStar in all our new and used vehicles. No, there's nothing wrong with the hardware itself, we just don't want you using it anymore."

My analogy is fine. OnStar is a value-add lock-in for GM products. I have no way to purchase OnStar for an Audi. Lock-ins are a given in any market.

A higher PhysX adoption rate, for one.

Considering their share of the standalone GPU market, the number of people who purchase an ATI card alongside an Nvidia card won't make a whole lot of difference.


I swear, Nvidia is run by temperamental 12 year olds. I've never heard of a company that makes this many controversial marketing decisions.

You ever take a business or marketing class? Lock-ins are part of the curriculum for advancing your business. From a business perspective, right now it makes sense for Nvidia to disable the ability to have PhysX with a competitor's card whose maker has no plans to support your standard. Until Microsoft gets off their ass or Nvidia's market share plummets to ATI levels, we will see these types of marketing games.


All companies are there to make a profit, so opening up PhysX support to non-Nvidia cards can't be a bad thing, since in the long run it would help Nvidia: sales would increase thanks to Nvidia/ATI combos for PhysX, and adoption of PhysX would probably improve. As I stated in an earlier post, Nvidia doesn't lose anything, and it can only improve PhysX support in the long run; they still get their money, since you still need to buy (or have previously purchased) a PhysX-capable card. By restricting PhysX support, all they are doing is giving their competitors more reason to go with their own version, while gamers are left wondering which way to turn. How important does Nvidia want PhysX to become in the gaming industry? Sure, there are loads of game titles with PhysX, but they're useless if you don't use Nvidia cards exclusively. What is Nvidia afraid of? Are they really trying to protect PhysX from the rest of the market and their competitors?

Why don't they just release a PhysX card that works with any brand of video card, like the old days? At the end of the day they are there to make money, and more PhysX cards sold can only be good for them. So open up the market, Nvidia, if you want PhysX to get some sort of momentum with all the gamers out there.
As a gamer, I like to have choice with no restrictions.

 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Originally posted by: Creig
Originally posted by: Genx87
My analogy is fine. OnStar is a value add lock-in for GM products. I have no way to purchase OnStar for an Audi. Lock-ins are a given in any market.

Yes, but PhysX was already working in Win7 with an ATI card installed in the system. Nvidia introduced an artificial limitation. Substitute "GM" for "Audi" in my previous analogy if you like. The hardware is there, it was previously working, but one company decided it didn't want you using it anymore.


Originally posted by: Genx87
Considering their market in stand alone GPU's. The amount of people who purchase an ATI card with an Nvidia card wont make a whole lot of difference.

So why did Nvidia spend the time and money purchasing Ageia and developing PhysX if they don't care whether it gets adopted or not? PhysX hasn't exactly lived up to the hype it was released with. Nvidia hardly seems to be in a position to be cutting off potential users if they expect it to amount to anything.

How do you know factually, without doubt, end-all-be-all, that it's just an artificial limitation?

I'm not saying it isn't, because I really don't know for sure; there is limited data. On the surface I can certainly understand the "leverage" point, but these decisions may have many layers to them, and I'm open-minded enough to try to learn and listen first.

It takes incredible gifts to talk facts without knowing all the facts.

 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: Genx87
The only thing holding them back right now is game developers pushing out code.

So if they're already having difficulty getting game developers to use PhysX, how does this help?
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Originally posted by: Genx87

I cant buy an AMD chip and use an Intel chipset.
That's because there are very good reasons why you can't, the main one being that the physical sockets aren't compatible. Also, AMD chips never worked on Intel motherboards to begin with, so it's not like a BIOS update came along and purposefully disabled them for no good reason.

I can't buy an Audi and get OnStar. My wife uses SanDisk Rhapsody, which only works with SanDisk MP3 players. Apple's iTunes works best with Apple products.
Again, did any of these scenarios work before, only to be artificially blocked later?

If we got up in arms about every lock-in we would lose our minds.
We aren't talking about every lock-in; we're talking about previously working scenarios being purposefully disabled for no good reason other than to restrict the consumer. Like when the ULi SLI patch was disabled at nVidia's driver level.

PCIe is an open standard. There's nothing inherently special about nVidia's PCIe slots that made them suitable for SLI while other PCIe slots weren't. ULi's solution proved that. nVidia's actions were artificial vendor lock-in, plain and simple.

What does Nvidia get out of you paying ATI top dollar for a graphics card and then buying the sub 100 dollar Nvidia card to generate physics?
$100 more than they will now get from someone buying the same ATi card.

And I wont be surprised in the least when\if Intel delivers Havok on a GPU it locks out the competition as well, possibly even ATI.
That will be a very sad day for consumers if that happens. nVidia's current action impacts a small number of customers, but locking down Havok like that would have a hugely detrimental impact.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: SirPauly
How do you know factually, without doubt, end-all-be-all, that it's just an artificial limitation?

I'm not saying it isn't, because I really don't know for sure; there is limited data. On the surface I can certainly understand the "leverage" point, but these decisions may have many layers to them, and I'm open-minded enough to try to learn and listen first.

It takes incredible gifts to talk facts without knowing all the facts.

I don't know without a doubt, I never said I did. I'm basing my opinion on the same piece of evidence that is the basis for this entire thread, the letter to JC from Troy in NVIDIA Customer Care. If it turns out to be false, then obviously none of this discussion will matter whatsoever.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Originally posted by: Creig
Originally posted by: SirPauly
How do you know factually, without doubt, end-all-be-all, that it's just an artificial limitation?

I'm not saying it isn't, because I really don't know for sure; there is limited data. On the surface I can certainly understand the "leverage" point, but these decisions may have many layers to them, and I'm open-minded enough to try to learn and listen first.

It takes incredible gifts to talk facts without knowing all the facts.

I don't know without a doubt, I never said I did. I'm basing my opinion on the same piece of evidence that is the basis for this entire thread, the letter to JC from Troy in NVIDIA Customer Care. If it turns out to be false, then obviously none of this discussion will matter whatsoever.

But if you read it, he clearly offers developer-expense and customer-assurance issues, and yet these are quickly dismissed and people lock on to just "leverage!"

 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Originally posted by: SirPauly

There may be examples where future PhysX content takes development expense, and nVidia may not desire to spend resources here to make sure it runs fine with ATI products in general.
But we've established that a PPU and a dedicated SLI card both work, so I really don't see where you're going with this.

And again, PhysX doesn't "run with ATi products" because it has absolutely nothing to do with them. The hardware acceleration is done on nVidia's video card.

Tell me, do you also expect a system with GMA + nVidia to disable PhysX? Because that's a pretty large chunk of the market, and it's the same situation of two GPUs from two different vendors.

I actually have such a configuration (GMA + GTX285), so would you expect PhysX to stop working for me since there's a non-nVidia GPU in my system?
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Originally posted by: BFG10K
Originally posted by: SirPauly

There may be examples where future PhysX content takes development expense, and nVidia may not desire to spend resources here to make sure it runs fine with ATI products in general.
But we've established that a PPU and a dedicated SLI card both work, so I really don't see where you're going with this.

And again, PhysX doesn't "run with ATi products" because it has absolutely nothing to do with them. The hardware acceleration is done on nVidia's video card.

Tell me, do you also expect a system with GMA + nVidia to disable PhysX? Because that's a pretty large chunk of the market, and it's the same situation of two GPUs from two different vendors.

I actually have such a configuration (GMA + GTX285), so would you expect PhysX to stop working for me since there's a non-nVidia GPU in my system?

Personally, I'd like nVidia to expand on the development, assurance and business points raised. I didn't invent them -- an nVidia spokesperson offered them. The key for my mindset is not to dismiss any of them until there is further data.

I don't know how nVidia handles the drivers for their entire family of products. I don't know what nVidia is testing to arrive at the development and assurance points. I don't know how future PhysX content is affected.

nVidia already has strong leverage with GPU PhysX and the ability to offer it with a single GPU -- ATI doesn't care about it and enjoys the word "death", hehe! It seems unwilling to lift a finger. It would make more sense to leverage PhysX for ATI platforms with a discrete PhysX card, to make more end-users aware and get it into more systems that could offer it for games. But how much would it cost nVidia to do this? Is it easy to do or difficult? Does anyone really know? I have many questions.

 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Originally posted by: Creig
Originally posted by: Genx87
The only thing holding them back right now is game developers pushing out code.

So if they're already having difficulty getting game developers to use PhysX, how does this help?

Who said they are having difficulty?
 

HurleyBird

Platinum Member
Apr 22, 2003
2,812
1,550
136
Originally posted by: Creig
I don't know without a doubt, I never said I did. I'm basing my opinion on the same piece of evidence that is the basis for this entire thread, the letter to JC from Troy in NVIDIA Customer Care. If it turns out to be false, then obviously none of this discussion will matter whatsoever.

Pretty sure it's true; this just appeared on the Nvidia PhysX FAQ page:

Can I use an NVIDIA GPU as a PhysX processor and a non-NVIDIA GPU for regular display graphics? No. There are multiple technical connections between PhysX processing and graphics that require tight collaboration between the two technologies. To deliver a good experience for users, NVIDIA PhysX technology has been fully verified and enabled using only NVIDIA GPUs for graphics.

Really Nvidia? There are multiple technical connections between PhysX processing and graphics? You aren't the ministry of truth; people aren't going to forget about standalone PhysX cards where these 'multiple technical connections' didn't exist, or that ATI for rendering + Nvidia for PhysX used to work.
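For what it's worth, the behavior people are describing amounts to a simple vendor gate. Here's a purely illustrative sketch (nothing from NVIDIA's actual driver code, and the function and tuple format are made up for the example) of the kind of check that would produce exactly what this thread reports, using well-known PCI vendor IDs:

```python
# Hypothetical illustration of a vendor-gate policy -- NOT real driver code.
# Well-known PCI vendor IDs: 0x10DE = NVIDIA, 0x1002 = ATI/AMD, 0x8086 = Intel.
NVIDIA = 0x10DE

def physx_allowed(gpus):
    """gpus: list of (vendor_id, role) tuples, where role is 'display' or 'physx'.

    Returns True only if an NVIDIA card is available for PhysX AND every
    display adapter is also NVIDIA -- the policy this thread is describing.
    """
    has_nvidia_physx = any(v == NVIDIA and role == "physx" for v, role in gpus)
    displays_all_nvidia = all(v == NVIDIA for v, role in gpus if role == "display")
    return has_nvidia_physx and displays_all_nvidia

# The combo this thread is about: ATI for display, NVIDIA card dedicated to
# PhysX. The PhysX hardware is present, but the policy still rejects it.
print(physx_allowed([(0x1002, "display"), (NVIDIA, "physx")]))   # False
print(physx_allowed([(NVIDIA, "display"), (NVIDIA, "physx")]))   # True
```

Under a check like this, the capability isn't missing; it's the display adapter's vendor ID that flips the answer, which is why people call the limitation artificial.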
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Where would I go to buy a standalone physics card these days, if I had an ATI video card and wanted that option?

Call up ATi and ask them about that. Ultimately it was their choice not to support the API on their GPUs in the first place. Besides that, though, it isn't hard at all to find them.

That's a 3D-only video card; it doesn't even have hooks into 2D, and hence has a completely different driver model to a card with a 2D engine. You simply cannot infer that just because something worked with a Voodoo 2, it'll work with another card. And just because something doesn't work, that doesn't mean the driver is actively blocking it.

It's the same hardware that is on their standalone TV tuner boards, which works just fine with other vendors' GPUs. Given we are talking about mixing differing functionality across multiple GPUs, we are talking about the same thing here.

No one was claiming it was supported officially. The claim was that it worked before, and now it's being actively blocked when a competitor's card is in the system.

At this point it is working with PPUs.

No, it's not up to the vendors. ULi was shut down by nVidia because they did precisely what you claim and were releasing motherboards that supported SLI and CF. nVidia is controlling things here, not the OEM.

You have a good point: nV stopping other hardware from running on their platform would be rather underhanded if they were to exploit legal pressures and pull licensing agreements out from under their partners. Kind of exactly like what AMD did to nVidia. You are heading in a different direction on this topic than the original claims, which had to do with following PCI-E specs.

Why do we need authorization from nVidia if all we need is the right BIOS from vendors like you claim?

Where did all the nV AMD chipsets go? Seriously man, double standard much? On a technical basis it can work, as the chipset and the hardware both support it; it is a limitation of the BIOS.

But that doesn't mean that I don't recognize the importance of competition and choice being good for the consumer.

AMD shutting nV out of their chipset business is OK, and ATi refusing to support the standard when given the option is OK, but nV not allowing a potential nightmare on the driver side to function isn't OK? Hypocrisy running a bit on the steep side today?
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Originally posted by: HurleyBird
Originally posted by: Creig
I don't know without a doubt, I never said I did. I'm basing my opinion on the same piece of evidence that is the basis for this entire thread, the letter to JC from Troy in NVIDIA Customer Care. If it turns out to be false, then obviously none of this discussion will matter whatsoever.

Pretty sure it's true; this just appeared on the Nvidia PhysX FAQ page:

Can I use an NVIDIA GPU as a PhysX processor and a non-NVIDIA GPU for regular display graphics? No. There are multiple technical connections between PhysX processing and graphics that require tight collaboration between the two technologies. To deliver a good experience for users, NVIDIA PhysX technology has been fully verified and enabled using only NVIDIA GPUs for graphics.

Really Nvidia? There are multiple technical connections between PhysX processing and graphics? You aren't the ministry of truth; people aren't going to forget about standalone PhysX cards where these 'multiple technical connections' didn't exist, or that ATI for rendering + Nvidia for PhysX used to work.

Now, if only we could get a definition of those "multiple technical connections" between PhysX processing and graphics.
 

AmberClad

Diamond Member
Jul 23, 2005
4,914
0
0
Originally posted by: BenSkywalker
Besides that though it isn't hard at all to find them.
I meant a reputable, mainstream eTailer. Not a few cards scattered here and there at sketchy places like CompuVest.

I consider it a de facto lockout when someone in search of an alternative has to scrounge around for a used PhysX card or hunt around at questionable vendors for leftovers. At that point, it's almost moot whether or not those EOL parts are locked out as well.

So, back to that point you were attempting to make about how ATI is supposedly trying to block their tuners from working with Nvidia video cards?