PhysX on ATI

thilanliyan

Lifer
Jun 21, 2005
Originally posted by: bryanW1995
Dragon Age is going to be my big video game purchase this year and it uses...PhysX.

Does it actually use GPU-accelerated physics? If it doesn't, then it doesn't matter which card you get, since the physics will run on the CPU anyway. There are already a lot of games using PhysX but VERY FEW that have GPU-accelerated PhysX effects.
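For anyone wondering what the GPU/CPU split actually looks like to a developer: in the 2.8-era PhysX SDK it largely comes down to a per-scene simulation type, with games probing for acceleration hardware and falling back to the software path. A minimal C++ sketch, assuming the PhysX 2.8 API (illustrative only, not code from any shipping game):

    #include "NxPhysics.h"

    // Entry point to PhysX 2.x: create the SDK object.
    NxPhysicsSDK* sdk = NxCreatePhysicsSDK(NX_PHYSICS_SDK_VERSION);

    // Probe for acceleration hardware; fall back to the CPU path,
    // which runs regardless of which vendor's card is rendering.
    bool hw = (sdk->getHWVersion() != NX_HW_VERSION_NONE);

    NxSceneDesc sceneDesc;
    sceneDesc.gravity = NxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.simType = hw ? NX_SIMULATION_HW : NX_SIMULATION_SW;
    NxScene* scene = sdk->createScene(sceneDesc);

    // Typical per-frame loop: kick off the step, then collect results.
    scene->simulate(1.0f / 60.0f);
    scene->flushStream();
    scene->fetchResults(NX_RIGID_BODY_FINISHED, true);

If a title never creates a hardware scene (and most don't), the effects run on the CPU no matter whose GPU is installed, which is exactly the point above: only the handful of titles with hardware-simulated content care which card you own.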
 

GaiaHunter

Diamond Member
Jul 13, 2008
Originally posted by: bryanW1995
Don't get me wrong, I'm pissed about this as a consumer; I just think it was a good business move for them. Even if the GT300 and other future models don't offer PhysX at all, it was still a good talking point for a while. It gave them something to talk about when they appeared to be on a path to GPU annihilation. It will still keep some people from buying a faster AMD card in hopes that they can get a GT300 and use their 8800/9800 as a PPU. Shit, I might end up doing that, though I'll probably end up grabbing a 5xxx first if the deals are good.

I understand that.

I'm just not sure they will get enough in return to justify the hassle (more a bit of a grudge from some customers than anything else).

Maybe they will.

What I really think is that this is irrelevant one way or the other. I hope the best nVidia has to counter AMD/ATI isn't PhysX (speaking as a consumer who likes to pay as little as possible). That hope was slim before, and it's even slimmer now.
 

Modular

Diamond Member
Jul 1, 2005
Originally posted by: nitromullet
I meant the entire issue of the lockout. The discrete/non-discrete debate is just an aside to the lockout, which is really the central issue of concern. I think I mentioned in passing that it wasn't a discrete PhysX card, and you challenged me on that. So, I responded. It seemed to be important to you at the time, but seems to have lost importance the less you are able to support your position.

While I can't speak for Modular, I for one have noticed that since you've become a "NVIDIA Focus Group" member, your posts have taken on a decidedly NVIDIA slant. I think it stands to reason that, given a debate between ATI and NVIDIA, individuals more closely involved with one company would tend to lean in that direction. Given that, I don't really fault you for siding with NV when the issue is between NV and ATI. I do, however, take into consideration that you get free stuff from NVIDIA and apply the requisite grain of salt to your posts.

The thing about this debate is that it really isn't about ATI vs. NV, but about NVIDIA vs. its own customers. Why anyone would side with the company essentially screwing over its existing customers is beyond me.

That sums up my stance pretty well, actually. And bryanW1995 is right; from a business standpoint this is a good decision in a way, but it poses some serious business problems as well. The problem is that since I currently have an ATI card, I no longer have a reason to buy an nVidia card. If nVidia were smart, they would have made PhysX work with an ATI card, for the sole reason that they would have kept sales from those already using an nVidia card, as well as from those who wanted to purchase a new nVidia card to use alongside their ATI primary card.

The fact that they didn't care that they were going to lose sales as a result of this decision makes me believe it's a company either running scared or acting like a 4-year-old, as I said before. Hell, it's probably a combination of both in the end. They have nothing to counter the new ATI cards, after all...

 

Mem

Lifer
Apr 23, 2000
David Hoff, ex-Nvidia and now Director of AMD's Advanced Technology Initiatives team, says it all here:


As you heard me describe [at the AMD VISION event], in the meantime, we've been particularly excited about what Pixelux can do. Their physics effects are amazingly realistic compared to anyone else. And their tools are great.

Their commitment to integrating with the free, open source Bullet Physics engine and doing OpenCL acceleration fits great with our commitment to OpenCL work on Bullet. Both Bullet Physics and Pixelux's DMM engine are already available and used in games and films, so developers can start right now and pick up the GPU acceleration as we roll that out.

On the other hand, as I think you've seen from the PhysX side of things, while they seem to talk about support for openness when they're backed into a corner, apparently in a recent driver update they've actually disabled PhysX running on their GPU if an ATI card is used for rendering, in order to pressure users into an all-Nvidia configuration.

The contrast should be fairly stark here: we're intentionally enabling physics to run on all platforms - this is all about developer adoption. Of course we're confident enough in our ability to bring compelling new GPUs to market that we don't need to try to lock anyone in.
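Since Hoff's pitch is developer adoption, it's worth seeing how small the developer-facing surface of Bullet actually is; the OpenCL acceleration he describes was meant to slot in underneath this same API. A minimal CPU-side sketch using the Bullet 2.x C++ API (illustrative only, not AMD's accelerated path):

    #include <btBulletDynamicsCommon.h>

    int main() {
        // Standard Bullet 2.x world setup: collision configuration,
        // dispatcher, broadphase, constraint solver, dynamics world.
        btDefaultCollisionConfiguration config;
        btCollisionDispatcher dispatcher(&config);
        btDbvtBroadphase broadphase;
        btSequentialImpulseConstraintSolver solver;
        btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &config);
        world.setGravity(btVector3(0.0f, -9.81f, 0.0f));

        // Drop a 1 kg sphere from 10 m and step one second of simulation.
        btSphereShape sphere(0.5f);
        btVector3 inertia(0.0f, 0.0f, 0.0f);
        sphere.calculateLocalInertia(1.0f, inertia);
        btDefaultMotionState motion(
            btTransform(btQuaternion::getIdentity(), btVector3(0.0f, 10.0f, 0.0f)));
        btRigidBody body(
            btRigidBody::btRigidBodyConstructionInfo(1.0f, &motion, &sphere, inertia));
        world.addRigidBody(&body);

        for (int i = 0; i < 60; ++i)
            world.stepSimulation(1.0f / 60.0f);

        world.removeRigidBody(&body);
        return 0;
    }

Note there is nothing vendor-specific anywhere in that code, which is the contrast Hoff is drawing with PhysX's driver-level lockout.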

 

Qbah

Diamond Member
Oct 18, 2005
Originally posted by: thilan29
Originally posted by: bryanW1995
Dragon Age is going to be my big video game purchase this year and it uses...PhysX.

Does it actually use GPU-accelerated physics? If it doesn't, then it doesn't matter which card you get, since the physics will run on the CPU anyway. There are already a lot of games using PhysX but VERY FEW that have GPU-accelerated PhysX effects.

Damn... is this true? Will it be GPU PhysX? DAO is the only reason I'm considering a new graphics card, since I enjoy my Xbox a lot and don't really want to bother with PC issues. I wanted to make an exception for DAO... Damn... Let's hope it's CPU PhysX so I can choose a card based on what's fastest in my price range.


EDIT: w00t! From the DAO forums, posted by Ross Gardner, Lead Programmer at Bioware:

Nvidia now owns PhysX and has put full hardware support for it into all their latest cards - and some games have added hardware-acceleration-specific features (that you'd only see on Nvidia cards). We didn't do any of those on DAO, so you're fine with an ATI card. Technically there would be some improvement from having a hardware PhysX card - but it might be so minor you'd never notice.

Looks like we're safe :)
 

dadach

Senior member
Nov 27, 2005
Originally posted by: Keysplayr


This soap opera you see playing out is a result of ATI denying its own users this (for now completely useless) technology, or being unable to provide it.

fixed it for you
 

Mr Fox

Senior member
Sep 24, 2006
Originally posted by: Mem
David Hoff, ex-Nvidia and now Director of AMD's Advanced Technology Initiatives team, says it all here:


As you heard me describe [at the AMD VISION event], in the meantime, we've been particularly excited about what Pixelux can do. Their physics effects are amazingly realistic compared to anyone else. And their tools are great.

Their commitment to integrating with the free, open source Bullet Physics engine and doing OpenCL acceleration fits great with our commitment to OpenCL work on Bullet. Both Bullet Physics and Pixelux's DMM engine are already available and used in games and films, so developers can start right now and pick up the GPU acceleration as we roll that out.

On the other hand, as I think you've seen from the PhysX side of things, while they seem to talk about support for openness when they're backed into a corner, apparently in a recent driver update they've actually disabled PhysX running on their GPU if an ATI card is used for rendering, in order to pressure users into an all-Nvidia configuration.

The contrast should be fairly stark here: we're intentionally enabling physics to run on all platforms - this is all about developer adoption. Of course we're confident enough in our ability to bring compelling new GPUs to market that we don't need to try to lock anyone in.




The Drama Continues, and now the Red Channel Has Spoken...


AMD respond to NVIDIA's tough Radeon HD 5800 questions

http://www.tweaktown.com/news/...00_questions/index.htm