nVidia disables PhysX when ATI card present in Win7


yusux

Banned
Aug 17, 2008
331
0
0
Originally posted by: thilan29
http://www.ngohq.com/graphic-c...i-card-is-present.html

That sucks...the most they should have done is say "we won't support it" and just leave it to people to hack n slash physx if they really wanted to. I had it working with a spare 8800GT in Mirror's Edge and some tech demos so it does work sometimes in Win7. Some posters are actually questioning the legality of this...could this actually be illegal?

I wonder if this will affect people who run multiple cards for multiple monitors (ie. someone who wants to run PhysX that has multiple monitors)?



When you bought the 8800 GT, did they advertise it as PhysX capable anywhere on the box or on the website you bought it from? If they did, then this is probably illegal to some extent, as it is a form of false advertising.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: Elfear
Originally posted by: akugami
Originally posted by: Keysplayr
I think Qbah hit it right on the head. A dedicated Nvidia GPU running PhysX in a primary ATI system just "happened" to work. It was never intended to work nor was it intended to be supported. This next step just simply forced it off.

Very very possible but I think that nVidia caught wind of it and seems to be actively making sure that it doesn't work instead of just giving it a nod and wink and saying that officially it's not supported.

Again, it's only my opinion but I think that was a stupid move on their part to actively exclude an ATI+nVidia combo.

+1

The whole thing just leaves a sour taste in my mouth with the way Nvidia does business. I've had just as many Nvidia cards over the years (maybe more) than I have ATI cards since bang for the buck trumps brand loyalty in my book. However, with the way I see Nvidia running things lately, if both Nvidia and ATI offered cards that performed and cost the same, I'd probably pick the ATI card. I'm probably not alone in that sentiment either and it just strikes me that Nvidia isn't winning many customers with the business strategy they've taken.

Hehe, I see Intel's business practices are cool with you. Nice i7 you have there. :) :evil:

 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: yusux
Originally posted by: thilan29
http://www.ngohq.com/graphic-c...i-card-is-present.html

That sucks...the most they should have done is say "we won't support it" and just leave it to people to hack n slash physx if they really wanted to. I had it working with a spare 8800GT in Mirror's Edge and some tech demos so it does work sometimes in Win7. Some posters are actually questioning the legality of this...could this actually be illegal?

I wonder if this will affect people who run multiple cards for multiple monitors (ie. someone who wants to run PhysX that has multiple monitors)?



When you bought the 8800 GT, did they advertise it as PhysX capable anywhere on the box or on the website you bought it from? If they did, then this is probably illegal to some extent, as it is a form of false advertising.

It's also a DirectX-capable card. But try running it in a Linux system.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: HurleyBird
Now you're just trolling, Keyes.

In your eyes I'm trolling. I know I'm not. He is basically saying that the business practices of a company have an effect on the product he would buy. I don't buy it. Everyone here pretty much knows what Intel has done as a company and what kind of business practices they have employed over the years. Yet he still found a way to forgive all that and go with an i7 setup. Am I still trolling here, HurleyBird? Or is what I'm saying here something you have not considered, or have but overlooked?

Contribute or don't.

 

lopri

Elite Member
Jul 27, 2002
13,314
690
126
Originally posted by: SlowSpyder
Maybe if you uninstalled the Catalyst IGP drivers (I've never used an AMD chipset with IGP; are the drivers part of the chipset install, or the regular Catalyst install?) and then disabled the 'unknown' device in Device Manager, would the Nvidia drivers be smart enough to still know the AMD device is there?

Or could you turn off the IGP in the bios?

I guess it looks like losing that IGP (say you had multiple monitors) is just the cost of using Physx in that scenario..?
Of course, a person who builds a system on her/his own would know how to get into the BIOS and shut the IGP off. I'm just imagining a theoretical situation.

Windows 7 comes with WDDM 1.1 compliant drivers from both AMD and NV. Last time I checked they were drivers published in Apr 2009, Catalyst 9.4 for AMD and ForceWare 181.72 for NVIDIA, respectively. Any card that's supported by those sets of drivers will be automatically recognized by Windows.

So when you have cards from both AMD and NV, chances are both cards are supported by vendor-specific WDDM 1.1 drivers. And if you install one vendor's stand-alone driver package, it can certainly tell whether the other vendor's GPU exists (instead of showing up as a 'generic VGA').

Or imagine a situation where a user has 3 GPUs on an X58 board and 2 monitors: 2 NV cards for a primary monitor, and 1 AMD card for a secondary monitor. And of course the user's intention for the 2nd NV card is PhysX acceleration. Possible?

Again, just a thought experiment..
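The kind of check being speculated about here could be as simple as scanning the PCI vendor IDs of the installed display adapters. A minimal sketch of that hypothetical policy (the vendor IDs are the real PCI-SIG assignments; the function name and logic are invented for illustration, not anything from NV's actual driver):

```python
# Hypothetical sketch of the policy this thread describes: GPU PhysX stays
# enabled only if every display adapter in the system is an NVIDIA one.
# The vendor IDs are the standard PCI-SIG assignments; the function and
# data layout are made up for illustration.

NVIDIA = 0x10DE  # NVIDIA's PCI vendor ID
ATI = 0x1002     # ATI/AMD's PCI vendor ID

def physx_enabled(adapter_vendor_ids):
    """Return True if PhysX would stay enabled under the
    'no foreign display adapter present' policy."""
    has_nvidia = NVIDIA in adapter_vendor_ids
    has_foreign = any(v != NVIDIA for v in adapter_vendor_ids)
    return has_nvidia and not has_foreign

print(physx_enabled([NVIDIA]))       # lone 8800 GT: True
print(physx_enabled([ATI, NVIDIA]))  # ATI renderer + NV PhysX card: False
```

Under a rule like that, the multi-monitor scenarios above (an ATI card driving a second display) would knock PhysX out even though PhysX itself runs entirely on the NV GPU.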
 

HurleyBird

Platinum Member
Apr 22, 2003
2,812
1,550
136
Originally posted by: Keysplayr
Originally posted by: HurleyBird
Now you're just trolling, Keyes.

In your eyes I'm trolling. I know I'm not. He is basically saying that the business practices of a company have an effect on the product he would buy.

Uh huh...

Originally posted by: Elfear if both Nvidia and ATI offered cards that performed and cost the same, I'd probably pick the ATI card.

He's saying that, everything else being equal (equal is the operative word), he would rather buy an ATI card because of Nvidia's business practices. Again, you decided to build up a very weak straw man argument: he has an i7 (from the oh-so-evil Intel corporation), therefore he really doesn't care about business practices after all! The only problem is that AMD doesn't have anything that even comes close to performing like an i7, does it? How is that exactly "everything else being equal"? That's right, it's not!

You're obviously trolling. And you're being dishonest in telling us that you aren't trolling when you know you are. And you're some kind of mod, huh?
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Originally posted by: Qbah

In the email the nhq-something guy got from nVidia it was said that since they've been doing all the research, testing, driver writing and all related actions with PhysX, they have the right to select what it can work with - and since they're also the biggest GPU maker, they made it so that only when an nV GPU is doing the rendering, PhysX can be utilized. It's a very direct but understandable stance.
Exactly the same thing applies to Havok. So then, would everyone be happy if Intel disabled it if a non-Intel CPU or a non-Intel GPU is detected?

PhysX already runs on nVidia video cards. Artificially disabling it when competition is present is simply vendor lock-in. This is exactly what was happening with SLI back in the day. There was absolutely nothing special about nVidia's PCIe or northbridge that made SLI impossible on other chipsets.

Does CF even work on nVidia motherboards, or is it still blocked (honest question as I'm not sure)? If it's blocked then again, what's the point of having standards like PCIe in the first place if nVidia aren't going to follow them?

Also, nobody ever said it will be supported, it just happened to work.
If the card's box says it supports PhysX then it supports PhysX. Having other components in the system shouldn't make a difference.

What's next, disabling a NIC's functionality if it detects another NIC in the system?
 

MegaWorks

Diamond Member
Jan 26, 2004
3,819
1
0
I'm with Keys on this one; we call this business, fellas. What Nvidia is doing might be pathetic, but they're getting a lot of heat from AMD's current offerings and I can understand their concern. No, I'm not an nVidia fanboy apologist; I've been an ATI fanboy forever, and Keys and Rollo can vouch for me. :laugh:
 

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
8,231
3,132
146
This is most unfortunate news. I think I will try to stay away from nvidia for a while because of this. Pity EVGA isn't making Radeon cards. Yet.

Of course, I just MIGHT get one more 260 for SLI :D
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
It's easy to rush to judgement on this and say nVidia is trying to leverage PhysX, but if a company is trying to push and showcase GPU physics, this doesn't make a lot of sense. They already have strong leverage by having the ability to offer GPU physics with a single GPU.

It may be technical in nature, and nVidia may feel they can't invest the resources to make sure current and future PhysX content runs smoothly on other IHVs' GPUs. I'm going to be open-minded about this possibility. A rush to judgement with limited data is easy to make on forums. There could be many layers to this decision.

It's also easy to point fingers at just nVidia -- ATI could adopt PhysX, but we know how they feel about it. So there are at least slices to the blame pie.

I don't know what the facts and details truly are and the real motivation behind this move.

Even so, it does suck that an ATI user can't enjoy PhysX if they desired to.

 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
If the card's box says it supports PhysX then it supports PhysX. Having other components in the system shouldn't make a difference.

I have an All In Wonder here that has a TV tuner and video capture hardware on it; it never would work with any other vid card in any PC. Same thing?

Does CF even work on nVidia motherboards, or is it still blocked (honest question as I'm not sure)?

When did it get blocked and by whom? I know Crossfire worked on NF4SLI mobos back in the x1800xt days. ASRock was demoing an NFi740 board with Crossfire running also, although I honestly don't know if that ever got official support or not. If it is blocked by the drivers, it would be ATi doing it, not nVidia. The BIOS would need to have support added, which would be up to the OEM.
 

lopri

Elite Member
Jul 27, 2002
13,314
690
126
I'm personally not passing any judgment on this because 1) I do not know the exact reason (or even a highly probable reason) why NV made this decision, and 2) it's probably not a big deal at least for the near future.

Both reasons have been mentioned in this thread, and I think it's definitely possible that there is a technical reason for NV to make such a decision. Unlike SLI, where we knew all it takes is two PEG slots, we're talking about a brand-new driver model where different drivers co-exist in the Windows environment. Until we gather more knowledge of WDDM's inner workings, we don't know for sure whether AMD rendering + NV physics is a stable configuration.

And as many have commented, we don't know how GPU physics will pan out yet. It's in its infancy, and it's entirely possible NV's PhysX might not even become popular enough. Not that I hope so, but I'm pointing out its unpredictable future.

Of course, the above reasons won't stop me from crying foul on NV in the future if strong evidence surfaces that this was purely a business tactic, but right now I think this issue is way overblown.
 

Blazer7

Golden Member
Jun 26, 2007
1,136
12
81
Originally posted by: Keysplayr
Originally posted by: HurleyBird
Now you're just trolling, Keyes.

In your eyes I'm trolling. I know I'm not. He is basically saying that the business practices of a company has an effect on the product he would buy. I don't buy it. Everyone here pretty much knows what Intel has done as a company and what kind of business practices they employed over the years. Yet he still found a way to forgive all that and go with an i7 setup. Am I still trolling here HurleyBird? Or is what I'm saying here something you have not considered, or have but overlooked?

Contribute or don't.


Corporations can be ruthless, no question about that, but that's not always enough. Every now and then they have to act smart too, and nV right now is playing dumb. PhysX is a long way from being considered a must-have by gamers, and cutting off a portion of nV HW owners will only create ire among them (which is the case right now).

What can really make things worse is competition. Right now nV has no real competition in physics, but cutting out AMD systems may just do the trick of pushing AMD and "others" in the direction of getting their own tech off the ground.

Talk about a shot in the foot.
 

thilanliyan

Lifer
Jun 21, 2005
12,062
2,275
126
Originally posted by: SirPauly
It may be technical in nature and nVidia may feel they can't invest the resources to make sure current and future PhysX content runs smoothly on other IHVs' GPUs.

It isn't running on another IHV's GPU. It's still running on an nVidia GPU. Also, the Ageia cards worked fine regardless of the GPU in the system so we know it works...it just seems like it's being blocked artificially like SLI was.
 

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
Originally posted by: Keysplayr
Originally posted by: Elfear
Originally posted by: akugami
Originally posted by: Keysplayr
I think Qbah hit it right on the head. A dedicated Nvidia GPU running PhysX in a primary ATI system just "happened" to work. It was never intended to work nor was it intended to be supported. This next step just simply forced it off.

Very very possible but I think that nVidia caught wind of it and seems to be actively making sure that it doesn't work instead of just giving it a nod and wink and saying that officially it's not supported.

Again, it's only my opinion but I think that was a stupid move on their part to actively exclude an ATI+nVidia combo.

+1

The whole thing just leaves a sour taste in my mouth with the way Nvidia does business. I've had just as many Nvidia cards over the years (maybe more) than I have ATI cards since bang for the buck trumps brand loyalty in my book. However, with the way I see Nvidia running things lately, if both Nvidia and ATI offered cards that performed and cost the same, I'd probably pick the ATI card. I'm probably not alone in that sentiment either and it just strikes me that Nvidia isn't winning many customers with the business strategy they've taken.

Hehe, I see Intels business practices are cool with you. Nice i7 you have there. :) :evil:

What has Intel got to do with a topic about PhysX? You can run SLI on Intel boards, and CF too; if anything, nVidia should learn from them! This was such blatant trolling, and such a bad attempt at a rebound, that it's sad.

Originally posted by: Keysplayr
It's also a DirectX capable card as well. But run it in a Linux system.

And what has this to do with PhysX? DirectX is a standard running on the Windows gaming platform. Are you now trying to start the same sh!t Wreckage was doing? It's as relevant as buying a gasoline car and then being upset that it doesn't run on diesel! Totally irrelevant and, frankly speaking, silly.


EDIT: So that we're clear, I understand the decision behind this and don't really have a problem with it. But please, you're a spokesman for them; use some good arguments and not the things I quoted...
 

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
Originally posted by: BFG10K
Originally posted by: Qbah

In the email the nhq-something guy got from nVidia it was said that since they've been doing all the research, testing, driver writing and all related actions with PhysX, they have the right to select what it can work with - and since they're also the biggest GPU maker, they made it so that only when an nV GPU is doing the rendering, PhysX can be utilized. It's a very direct but understandable stance.
Exactly the same thing applies to Havok. So then, would everyone be happy if Intel disabled it if a non-Intel CPU or a non-Intel GPU is detected?

Well, those are assumptions. There would be a discussion if Intel did it. They didn't, and it doesn't seem they will. So that's kind of a moot point right now :)

PhysX already runs on nVidia video cards. Artificially disabling it when competition is present is simply vendor lock-in. This is exactly what was happening with SLI back in the day. There was absolutely nothing special about nVidia's PCIe or northbridge that made SLI impossible on other chipsets.

Yes, it is a vendor lock-in. But it's their technology, they own it, it's not an open standard. They can do whatever they want with it. And people will follow (or in this case, won't :p) with their wallets.

Also, nobody ever said it will be supported, it just happened to work.
If the card's box says it supports PhysX then it supports PhysX. Having other components in the system shouldn't make a difference.

What's next, disabling a NIC's functionality if it detects another NIC in the system?

It supports PhysX in an nVidia-rendered scenario. You didn't buy a PhysX card, you bought a graphics card with PhysX support. From what I know, the Ageia PPU still works with ATi cards.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: Qbah
Originally posted by: Keysplayr
Originally posted by: Elfear
Originally posted by: akugami
Originally posted by: Keysplayr
I think Qbah hit it right on the head. A dedicated Nvidia GPU running PhysX in a primary ATI system just "happened" to work. It was never intended to work nor was it intended to be supported. This next step just simply forced it off.

Very very possible but I think that nVidia caught wind of it and seems to be actively making sure that it doesn't work instead of just giving it a nod and wink and saying that officially it's not supported.

Again, it's only my opinion but I think that was a stupid move on their part to actively exclude an ATI+nVidia combo.

+1

The whole thing just leaves a sour taste in my mouth with the way Nvidia does business. I've had just as many Nvidia cards over the years (maybe more) than I have ATI cards since bang for the buck trumps brand loyalty in my book. However, with the way I see Nvidia running things lately, if both Nvidia and ATI offered cards that performed and cost the same, I'd probably pick the ATI card. I'm probably not alone in that sentiment either and it just strikes me that Nvidia isn't winning many customers with the business strategy they've taken.

Hehe, I see Intel's business practices are cool with you. Nice i7 you have there. :) :evil:

What has Intel got to do with a topic about PhysX? You can run SLI on Intel boards, and CF too; if anything, nVidia should learn from them! This was such blatant trolling, and such a bad attempt at a rebound, that it's sad.

Originally posted by: Keysplayr
It's also a DirectX capable card as well. But run it in a Linux system.

And what has this to do with PhysX? DirectX is a standard running on the Windows gaming platform. Are you now trying to start the same sh!t Wreckage was doing? It's as relevant as buying a gasoline car and then being upset that it doesn't run on diesel! Totally irrelevant and, frankly speaking, silly.


EDIT: So that we're clear, I understand the decision behind this and don't really have a problem with it. But please, you're a spokesman for them; use some good arguments and not the things I quoted...

Why do I have to explain to you why the above references were used? Can't you see that for yourself? It was a reference. A comparison. Good Lord: if they don't understand something, it's trolling.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Originally posted by: thilan29
Originally posted by: SirPauly
It may be technical in nature and nVidia may feel they can't invest the resources to make sure current and future PhysX content runs smoothly on other IHVs' GPUs.

It isn't running on another IHV's GPU. It's still running on an nVidia GPU. Also, the Ageia cards worked fine regardless of the GPU in the system so we know it works...it just seems like it's being blocked artificially like SLI was.

What do we really know? Ben Skywalker did bring up a solid point.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Originally posted by: BenSkywalker

I have an All In Wonder here that has a TV tuner and video capture hardware on it, never would work with any other vid card in any PC, same thing?
It depends; do ATi's drivers take previously working functionality and specifically disable it if they detect a competitor's card in the system? If not then no, because a driver that never worked with multiple GPUs in the system is not the same thing as actively blocking features that once worked.

When did it get blocked and by whom? I know Crossfire worked on NF4SLI mobos back in the x1800xt days. ASRock was demoing an NFi740 board with Crossfire running also, although I honestly don't know if that ever got official support or not.
Yes, CF unofficially once worked on NF4, given it was used during CF's testing phase. However, that soon stopped after nVidia blocked it, and as of right now there's not a single nVidia chipset listed on ATi's site as supported. Now, if you have legitimate and recent benchmarks of CF working on nVidia chipsets then I'll be happy to look at them.

Also ULi boards were quickly shut down by nVidia and while it might be possible to hack some kind of CF support, the active vendor blocking by nVidia makes it impossible to simply pick up an nVidia chipset off the shelf and expect CF to work.

If it is blocked by the drivers, it would be ATi doing it, not nVidia. The BIOS would need to have support added, which would be up the OEM.
No, it's blocked at the BIOS/chipset driver level; something about peer-to-peer writes being disabled on PCIe if two non-nVidia video cards are detected at boot.

Also, I remember an interview about CF where ATi were asked why it doesn't work on nVidia chipsets, to which the answer was "ask nVidia".
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Originally posted by: Qbah

Well, those are assumptions.
No, because I'm not assuming anything. I'm proposing a hypothetical (and rather similar) situation to the intentional blocking of PhysX we have now.

Again, everything you said equally applies to Havok or any other situation where a vendor artificially disables features simply because competition is detected in the system. This is no different to SLI being locked to nVidia chipsets in the past.

Yes, it is a vendor lock-in. But it's their technology, they own it, it's not an open standard. They can do whatever they want with it. And people will follow (or in this case, won't :p) with their wallets.
Right, but again that doesn't change the fact that it's bad for the customer because it limits choices, which is what I'm arguing. There's no good reason why PhysX should be disabled in this situation given it worked before, and because it's not related to the competing video card.

It supports PhysX in an nVidia rendered scenario. You didn't buy a PhysX card, you bought a Graphics Card with PhysX support.
Right, but where does it state on the box that PhysX will be disabled if a competitor's card is in the system? Where does it state that the minimum requirements for PhysX are a system filled only with nVidia video cards?

From what I know, the Ageia PPU still works with ATi cards.
More evidence that illustrates there's no good reason to be disabling PhysX on nVidia hardware in said situation.
 

AmberClad

Diamond Member
Jul 23, 2005
4,914
0
0
Originally posted by: BenSkywalker
I have an All In Wonder here that has a TV tuner and video capture hardware on it, never would work with any other vid card in any PC, same thing?
ATI's standalone tuners work fine with either NV or ATI graphics cards from my personal experience (ATI 650), so no, it's not really the same thing. They're not locking you into buying an ATI card to go along with the tuner, and there is no difference functionality-wise if you choose to use an NV card instead.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Originally posted by: SirPauly

What do we really know? Ben Skywalker did bring up a solid point.
What are you talking about? We know PhysX has absolutely nothing to do with ATi cards, given there's no current mechanism to run it on them.

It may be technical in nature and nVidia may feel they can't invest the resources to make sure current and future PhysX content runs smoothly on other IHVs' GPUs.
There's no such requirement, given PhysX doesn't run on other IHVs' GPUs. You're arguing a scenario that doesn't even exist and trying to propose it's the "reason".

And even if it were (which it isn't), we're talking about running PhysX on nVidia GPUs anyway.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
It depends; do ATi's drivers take previously working functionality and specifically disable it if they detect a competitor's card in the system? If not then no, because a driver that never worked with multiple GPUs in the system is not the same thing as actively blocking features that once worked.

It worked with V2s without issue, nothing more recent.

However, that soon stopped after nVidia blocked it, and as of right now there's not a single nVidia chipset listed on ATi's site as supported.

An nV GPU for PhysX with an ATi GPU has never been officially supported by nVidia. It just happened to work with the games that were out. I'd be interested to see how Arkham Asylum works using the older drivers with this setup; it seems to me there would be some rather severe issues (not saying for certain there would be, but I'm not seeing how they would handle having separate boards at reasonable performance levels).

No, it's blocked at the BIOS/chipset driver level; something about peer-to-peer writes being disabled on PCIe if two non-nVidia video cards are detected at boot.

That would be at the BIOS level, and up to the vendors. Intel also had issues with CF boards despite following the specs, but they worked out the issues through the BIOS and got it working.

Honestly this whole thing would have been much simpler if ATi just had PhysX running on their hardware to start with.

ATI's standalone tuners work fine with either NV or ATI graphics cards from my personal experience (ATI 650), so no, it's not really the same thing. They're not locking you into buying an ATI card to go along with the tuner, and there is no difference functionality-wise if you choose to use an NV card instead.

nVidia's stand-alone physics cards are reportedly still working with ATi boards, so yes, it is very much exactly the same thing.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: BenSkywalker
If the card's box says it supports PhysX then it supports PhysX. Having other components in the system shouldn't make a difference.

I have an All In Wonder here that has a TV tuner and video capture hardware on it; it never would work with any other vid card in any PC. Same thing?

I doubt it. All In Wonder cards were always extremely temperamental regarding drivers. Just getting them working properly in a system with no other add-in cards could sometimes be a challenge. It was a nice piece of hardware let down by crappy drivers.

This PhysX situation is a deliberate driver lockout, the same thing they did after they bought out ULi and purposely disabled the SLI patch in Forceware drivers v81.98 and later.

http://www.nforcershq.com/nvid...sables-ulis-sli-patch/