PhysX on ATI

Page 3 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Keysplayr

Then what are we talking about here?
If your concerns are not about PhysX capability, then nothing should "irk" you. You can go right out and get two NV cards for SLI or Two ATI cards for Crossfire. Or whatever you want.

We're talking about the lockout, which clearly isn't the most popular move. You (again) are trying to downplay the issue by trivializing my concerns.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: nitromullet
Originally posted by: Keysplayr

Then what are we talking about here?
If your concerns are not about PhysX capability, then nothing should "irk" you. You can go right out and get two NV cards for SLI or Two ATI cards for Crossfire. Or whatever you want.

We're talking about the lockout, which clearly isn't the most popular move. You (again) are trying to downplay the issue by trivializing my concerns.

Whatever you say, Nitro.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: bryanW1995
why are you complaining? I would be upset (and have been before) when TWIMTBP titles DIDN'T perform better on nvidia hardware. What's the freakin' point of that if there's no difference? If the dev takes nvidia's money to slap the logo up there, the least they could do is throw nvidia end-users a few bones.

Because the best TWIMTBP should do is make games run faster on nVidia hardware, not exclusively hamper ATi's performance badly, as happened a few times before in games like Doom 3, Mass Effect, Lost Planet, etc. ATi's GIG games usually run fast on nVidia hardware, but faster on ATi hardware; that's fairer and will not affect the sales of the game.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: nitromullet
Originally posted by: Keysplayr

Then what are we talking about here?
If your concerns are not about PhysX capability, then nothing should "irk" you. You can go right out and get two NV cards for SLI or Two ATI cards for Crossfire. Or whatever you want.

We're talking about the lockout, which clearly isn't the most popular move. You (again) are trying to downplay the issue by trivializing my concerns.

Don't worry about it, nitro. Everybody here understands where you're coming from and why you would be upset about it. And most probably agree with your sentiments. It was a very unwise move by Nvidia that will only serve to alienate some of their clientele, such as yourself.

Nvidia is shooting themselves in the foot with this, especially with the 5870 supposedly already running hardware based physics. I imagine it will probably turn out to be Havok. Apparently nobody told Huang that deliberately reducing the number of people who can run PhysX probably isn't the best way to increase its adoption rate with developers and the public.
 

Pantalaimon

Senior member
Feb 6, 2006
341
40
91
Originally posted by: Creig
Nvidia is shooting themselves in the foot with this, especially with the 5870 supposedly already running hardware based physics. I imagine it will probably turn out to be Havok. Apparently nobody told Huang that deliberately reducing the number of people who can run PhysX probably isn't the best way to increase its adoption rate with developers and the public.

They probably want to have their cake and eat it too. They don't just want to increase adoption of PhysX; they want it to happen solely with their cards and not anyone else's. Time will tell if it was a good move or not.
 

Mem

Lifer
Apr 23, 2000
21,476
13
81
Originally posted by: Pantalaimon
Nvidia is shooting themselves in the foot with this, especially with the 5870 supposedly already running hardware based physics. I imagine it will probably turn out to be Havok. Apparently nobody told Huang that deliberately reducing the number of people who can run PhysX probably isn't the best way to increase its adoption rate with developers and the public.

They probably want to have their cake and eat it too. They don't just want to increase adoption of PhysX; they want it to happen solely with their cards and not anyone else's. Time will tell if it was a good move or not.

Problem is, gamers hate being forced to one side, so until an open standard format is available to both sides (in the real gaming world) the battle will continue. You can argue Nvidia has the right to protect its interests with PhysX, etc. However, gamers have the right to vent or show disapproval if they feel Nvidia is handling the situation badly, i.e. restricting PhysX support for a gamer with an Nvidia PhysX/AMD video card combo system.

At the end of the day, consumers/gamers are the bread and butter for their cards. If Nvidia wants to hurt its own sales, so be it.
 

Modular

Diamond Member
Jul 1, 2005
5,027
67
91
Originally posted by: Keysplayr
System with a primary Nvidia card and a dedicated PhysX card. What is this card?

System with a primary ATI card and a Nvidia card. What is this card?

Call it what you want, but when you have an all-Nvidia setup, for example a GTX275 in the primary slot and, say, any Nvidia GPU 8600GT or greater in the secondary slot for PhysX, that card is a discrete PhysX card because that is its intended use. You can call it a GeForce GPU that also happens to do PhysX. But that would be incorrect. The only function that second card could have, when it pertains to gaming, is PhysX. Of course, it can also be used in multi-monitor situations.

Not really interested in the semantics thereafter.

Wow Keys, you're way off the deep end here pal.

Of course you aren't interested in the "semantics", because they are grounded in logic and put the consumer first (see above). Here's the reality, bud:

In your example, you claim that the 8600GT is a discrete PhysX card because that's the only purpose it serves, right? But if I have an ATi GPU as my primary and an 8600GT, then it's no longer usable for PhysX. Therefore, there's no logical way one could conclude that it's a discrete card whatsoever, and if it's not discrete in situation "x", what makes you believe it counts as discrete in situation "y"?

We all know the answer to that question, so there's really no need for you to reply.

 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: Modular
Originally posted by: Keysplayr
System with a primary Nvidia card and a dedicated PhysX card. What is this card?

System with a primary ATI card and a Nvidia card. What is this card?

Call it what you want, but when you have an all-Nvidia setup, for example a GTX275 in the primary slot and, say, any Nvidia GPU 8600GT or greater in the secondary slot for PhysX, that card is a discrete PhysX card because that is its intended use. You can call it a GeForce GPU that also happens to do PhysX. But that would be incorrect. The only function that second card could have, when it pertains to gaming, is PhysX. Of course, it can also be used in multi-monitor situations.

Not really interested in the semantics thereafter.

Wow Keys, you're way off the deep end here pal.

Of course you aren't interested in the "semantics", because they are grounded in logic and put the consumer first (see above). Here's the reality, bud:

In your example, you claim that the 8600GT is a discrete PhysX card because that's the only purpose it serves, right? But if I have an ATi GPU as my primary and an 8600GT, then it's no longer usable for PhysX. Therefore, there's no logical way one could conclude that it's a discrete card whatsoever, and if it's not discrete in situation "x", what makes you believe it counts as discrete in situation "y"?

We all know the answer to that question, so there's really no need for you to reply.

Whatever floats your boat there, sparky. This big debate over whether or not a dedicated PhysX GPU is considered a discrete card is beyond fail and extremely unimportant. Giving it a name or title, or stripping it of one, is so........pointless. Like I said about 12 posts back, this is all about venting. So have at it...... bud.

/peace.
 

Pelu

Golden Member
Mar 3, 2008
1,208
0
0
Every time i see a PhysX logo, or something about PhysX, which is happening on a regular basis... I get so pissed off!!!!! ATi with their stupid Havok crap; it's like ATi got your cash and gave u a card and now there's nothing new for u.. and nvidia got all the toys....
 

dflynchimp

Senior member
Apr 11, 2007
468
0
71
lol. Strictly business, folks, no hard feelings. Hasn't Intel also gotten on the Havok boat?

The logical mindset will see that both PhysX and Havok are middle-of-the-road products and fledglings in the field of gaming physics. Eventually they will converge and the coding will match up because inevitably, there's only one set of Newton's laws.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Keysplayr
Whatever floats your boat there, sparky. This big debate over whether or not a dedicated PhysX GPU is considered a discrete card is beyond fail and extremely unimportant. Giving it a name or title, or stripping it of one, is so........pointless. Like I said about 12 posts back, this is all about venting. So have at it...... bud.

/peace.

The issue may be unimportant to you. I think you've established this. It clearly does matter to some of us though.

 

Modular

Diamond Member
Jul 1, 2005
5,027
67
91
Originally posted by: Keysplayr


Whatever floats your boat there, sparky. This big debate over whether or not a dedicated PhysX GPU is considered a discrete card is beyond fail and extremely unimportant. Giving it a name or title, or stripping it of one, is so........pointless. Like I said about 12 posts back, this is all about venting. So have at it...... bud.

/peace.

It's interesting, because you were the one who took the time to defend the idea that it was a discrete PhysX chip... hmmm, now that you've been called out, you're backpedaling. Awesome!

The only thing that makes me upset about this thread is that you used to be a useful contributor to the Video Card forum and now you've become blinded and useless. It's sad I guess...

/<3

 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: dflynchimp
lol. Strictly business, folks, no hard feelings. Hasn't Intel also gotten on the Havok boat?

The logical mindset will see that both PhysX and Havok are middle-of-the-road products and fledglings in the field of gaming physics. Eventually they will converge and the coding will match up because inevitably, there's only one set of Newton's laws.

Intel didn't just get on the Havok boat, they bought the whole boat and crew two years ago.

http://www.joystiq.com/2007/09...icks-up-havok-for-21m/

 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: nitromullet
Originally posted by: Keysplayr
Whatever floats your boat there, sparky. This big debate over whether or not a dedicated PhysX GPU is considered a discrete card is beyond fail and extremely unimportant. Giving it a name or title, or stripping it of one, is so........pointless. Like I said about 12 posts back, this is all about venting. So have at it...... bud.

/peace.

The issue may be unimportant to you. I think you've established this. It clearly does matter to some of us though.

Continuing to debate whether or not to label an Nvidia GPU discrete is unimportant to me. Yes. Someone actually tried to call it semantics based on logic.

@ Modular: Your definition of blinded and useless is somebody who doesn't see things your way? If this is incorrect, please clear it up for me.

 

Kakkoii

Senior member
Jun 5, 2009
379
0
0
Uhh.. I was under the impression that ALL graphics cards were considered "discrete". That's the term the whole industry uses.
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
Discrete would mean it should work independently of other hardware in the system. A printer is discrete because it will print whether you have competing company printers plugged in or not, the drivers don't even begin to interfere with one another. A monitor is also discrete. Using an Nvidia card as a GPU would be the only way one could call it discrete. If you use it as a PhysX card you have to meet certain other requirements. It isn't discrete because it doesn't work with an agnostic view towards the rest of your system.
 

Forumpanda

Member
Apr 8, 2009
181
0
0
I don't get it; being able to use the card later as a PhysX/CUDA card would be a big reason for me to buy nVidia if all else was equal, and now they've removed that incentive.

Having more PhysX-capable systems would be a reason to make more games with PhysX support.

If Intel's new card is any good, not having PhysX work with it in your system would further reduce PhysX's market share, so I really don't see what they are trying to achieve with their move.
 

Modular

Diamond Member
Jul 1, 2005
5,027
67
91
Originally posted by: Forumpanda
I don't get it; being able to use the card later as a PhysX/CUDA card would be a big reason for me to buy nVidia if all else was equal, and now they've removed that incentive.

Having more PhysX-capable systems would be a reason to make more games with PhysX support.

If Intel's new card is any good, not having PhysX work with it in your system would further reduce PhysX's market share, so I really don't see what they are trying to achieve with their move.

Have you ever seen a kid throw a temper tantrum? That's basically what is happening here. They aren't getting their way, are being childish about it, and in the end are hurting themselves and not forwarding their agenda one bit, all the while hiding behind the smoke-and-mirrors idea that it's "someone else's fault" they had to make the decision (that someone else being ATI/Intel, but specifically ATI in this thread).

What we are seeing here is the epitome of a corporation acting like an entitled child. It's kind of amusing to watch, but annoying to see the repercussions for those of us who would like to enjoy and use this new tech. But what do we matter as consumers anyway...
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Keysplayr
Originally posted by: nitromullet
Originally posted by: Keysplayr
Whatever floats your boat there, sparky. This big debate over whether or not a dedicated PhysX GPU is considered a discrete card is beyond fail and extremely unimportant. Giving it a name or title, or stripping it of one, is so........pointless. Like I said about 12 posts back, this is all about venting. So have at it...... bud.

/peace.

The issue may be unimportant to you. I think you've established this. It clearly does matter to some of us though.

Continuing to debate whether or not to label an Nvidia GPU discrete is unimportant to me. Yes. Someone actually tried to call it semantics based on logic.

@ Modular: Your definition of blinded and useless is somebody who doesn't see things your way? If this is incorrect, please clear it up for me.

I meant the entire issue of the lockout. The discrete/non-discrete debate is just an aside to the lockout, which is really the central issue of concern. I think I mentioned in passing that it wasn't a discrete PhysX card, and you challenged me on that. So, I responded. It seemed to be important to you at the time, but seems to have lost importance the less you are able to support your position.

While I can't speak for Modular, I for one have noticed that since you've become a "NVIDIA Focus Group" member, your posts have taken on a decidedly NVIDIA slant. I think it stands to reason that, given a debate between ATI and NVIDIA, individuals more closely involved with one company will probably tend to lean in its direction. Given that, I don't really fault you for siding with NV when the issue is between NV and ATI. I do, however, take into consideration that you get free stuff from NVIDIA and apply the requisite grain of salt to your posts.

The thing about this debate is that it really isn't about ATI vs. NV, but about NVIDIA vs. its own customers. Why anyone would side with the company essentially screwing over its existing customers is beyond me.

Originally posted by: Kakkoii
Uhh.. I was under the impression that ALL graphics cards were considered "discrete". That's the term the whole industry uses.

NVIDIA graphics cards are discrete graphics cards; they just aren't discrete PhysX cards. "Discrete" isn't just a term the industry uses; it does actually mean something. Since an NVIDIA video card used solely for PhysX requires an NVIDIA card as the primary GPU, I do not think it qualifies as a discrete card.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,731
426
126
There is really no point arguing - nVidia decided not to allow their cards to do PhysX if ATI cards are present, i.e., "don't u dare buy stuff from the other camp!".

Sincerely, it's a short-sighted move from nV, because in the end it will be pointless.

ATI hardware is capable of generating physics effects - I won't be surprised if the 4xxx series also gets some physics effects when the 5xxx series is released, but even if not, you can count on the fact that when physics becomes mainstream in gaming, AMD/ATI will be able to sell you products that can do those effects.

 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
i think that it's a great move by nvidia. They are soon to be eclipsed by ati (or, as idc would say, the graphics brand that amd uses to peddle their gpus) and won't have an answer for somewhere between 2 and 6 months. The only thing nvidia has to hang their hat on right now that amd can't do is...PhysX and CUDA. PhysX could possibly matter to even a low-end gpu purchaser and could in the long run go a long way toward keeping nvidia customers loyal, keeping devs in their pocket, and in general causing fits for amd and, later, intel. It's very telling that they used a PhysX argument to combat the latest ati air assault from the USS Hornet. Don't get me wrong, I'm not happy about this, but I might end up getting a gt300 instead of a 5xxx because Dragon Age is going to be my big video game purchase this year and it uses...PhysX.

Speaking of which, shouldn't they have used a Canadian ship, or at least one that used to belong to Canada but is in the US now? :confused:
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,731
426
126
Originally posted by: bryanW1995
i think that it's a great move by nvidia. They are soon to be eclipsed by ati (or, as idc would say, the graphics brand that amd uses to peddle their gpus) and won't have an answer for somewhere between 2 and 6 months. The only thing nvidia has to hang their hat on right now that amd can't do is...PhysX and CUDA. PhysX could possibly matter to even a low-end gpu purchaser and could in the long run go a long way toward keeping nvidia customers loyal, keeping devs in their pocket, and in general causing fits for amd and, later, intel.

Thing is, AMD/ATI GPU hardware can accelerate physics (or do you believe AMD/ATI don't have the capability, despite the fact that when Ageia first debuted they answered with demos of 2 X19xx in CrossFire + 1 X19xx card for physics?). They just don't use PhysX - actually, they could use it if they had agreed with nVidia on terms, but they didn't want to.

With DX11 including OpenCL support, plus Microsoft's interest both in keeping DX as a standard and in having physics in their Xbox games regardless of GPU provider, do you really believe that nVidia will be able to restrict physics to PhysX?

The most they get is a couple of months - when in-game physics becomes something other than a tech curiosity in a small number of games, you can bet developers will at least offer a path for AMD/ATI GPUs to accelerate physics at the same time they offer PhysX. At worst they will drop PhysX, since any DX11-compliant nVidia card will also be able to use OpenCL.

And imagine they can restrict it.

What happens then if Intel and AMD decide their CPUs will shut down when PhysX starts to work?
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
don't get me wrong, I'm pissed about this as a consumer; I just think it was a good business move for them. Even if GT300 and other future models don't offer PhysX at all, it was still a good talking point for a while. It still gave them something to talk about when they appeared to be on a path to GPU annihilation. It will still keep some people from buying a faster amd card in hopes that they can get a GT300 and use their 8800/9800 as a PPU. Shit, I might end up doing that, though I'll probably end up grabbing a 5xxx first if the deals are good.