"Inevitable Bleak Outcome for nVidia's Cuda + Physx Strategy"

Page 10

Spike

Diamond Member
Aug 27, 2001
6,770
1
81
Originally posted by: Wreckage
I think the ATI fans are worried that PhysX could be the final nail in the ATI coffin. That's why they are so concerned with it. It's free and it enhances games, yet they are rallying against it.

Whatever. Good luck with that.

I'm happy with it. Game developers keep signing up for it. 67% of the video cards sold last quarter support it. Pure success if you ask me.

Check out this PhysX video for Sacred 2. Awesome! What's not to love?
http://www.youtube.com/watch?v=jTrEnFCoYNE

You and your pocketbook had better hope this is not the last nail in the coffin for ATi. If it is, your much-mentioned GTX 260 deal will be a thing of the past...

As for me, as a person with an ATi card (and 3 nVidia cards as well), I'm not rallying against PhysX; I'm rallying against a technology that does not work on all video cards. I'm all for physics in games, just not for something that not everyone can use. If this is truly the future of physics and ATi cards are never able to utilize them, then we will all be hurt by it. I don't know about you, but I'm loving the deals that can be had right now; I would hate to go back to the days of $300+ mid-range cards.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
We all look at things differently.

PhysX is a vehicle to provide advanced physics content to many platforms, and scalable tools like APEX may make this job easier for developers. I don't think this is marketing but reality.

PhysX GPU physics is a vehicle to raise the bar for dynamic content in games. Of course, game-changing content that redefines the experience takes time, and in the meantime PhysX may offer visual features that enhance atmosphere, are visually pleasing, and to some degree add interaction while we wait for that game-changing content.

You're asking me, as a gamer, to throw away all the positives of what PhysX may offer and just ignore it. Just wait for OpenCL and Compute Shader content only, or until AMD or Intel are ready?



 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Some of these arguments against physx are interesting. Imagine if all 3D games were rendered in grayscale only but then one of the two GPU makers figured out a way to render in color.

Would you turn it down just because it (color) wasn't necessary to play out the games at the time? Would you refuse to accept that playing a game in color just simply looks cooler and feels more awesome than playing in BW?

I'll admit I don't play any of the games you all talk about here, I play games about 2-3 yrs from the leading edge. NWN2 was my most recent game just to give you an idea. But when I look at the videos of these games and I see the special effects involved from physx I can't help but be a little bit in awe at just how darn cool it looks to me.

To debate whether flittering cloth or funky looking raindrops are neat or not just seems weird to me in an enthusiast forum. You don't have to be excited about it, but to go to a forum where most people are enthusiasts meaning they are eager and excited to see where the next-gen stuff is headed and then to try and proceed to convince those people they are wrong to care about where the next-gen stuff is headed...just seems weird.

For example I could not care less about the current marketscape for pickup trucks, but I own one. Further I am not about to join a pickup truck forum where the vocal crowd is excited about the features of the 2010 models and features, etc, only to proceed to argue with those folks that just because I could not care less about the features of a 2010 model pickup that they too should stop caring about it.

If physx is nothing of value then history will bear that out as the number of games to incorporate it will be negligible. I won't know for another 3 yrs or so when I start to play these games of hot debate around here. :laugh:
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Originally posted by: Spike
Originally posted by: Wreckage
I think the ATI fans are worried that PhysX could be the final nail in the ATI coffin. That's why they are so concerned with it. It's free and it enhances games, yet they are rallying against it.

Whatever. Good luck with that.

I'm happy with it. Game developers keep signing up for it. 67% of the video cards sold last quarter support it. Pure success if you ask me.

Check out this PhysX video for Sacred 2. Awesome! What's not to love?
http://www.youtube.com/watch?v=jTrEnFCoYNE

You and your pocketbook had better hope this is not the last nail in the coffin for ATi. If it is, your much-mentioned GTX 260 deal will be a thing of the past...

As for me, as a person with an ATi card (and 3 nVidia cards as well), I'm not rallying against PhysX; I'm rallying against a technology that does not work on all video cards. I'm all for physics in games, just not for something that not everyone can use. If this is truly the future of physics and ATi cards are never able to utilize them, then we will all be hurt by it. I don't know about you, but I'm loving the deals that can be had right now; I would hate to go back to the days of $300+ mid-range cards.

Spike, the reality of it is, everyone "can" use it. Nothing is stopping you from purchasing a PhysX-supported video card next round. Nothing at all. You make it sound as if a substantial portion of the gaming population will be left out in the cold with no PhysX capability. So, it doesn't have to work on all video cards. It just has to work on the one that you are free to buy. Nothing is stopping you but yourself.
It might sound silly to you, and I might agree, but that's just the way it is. All those who support the underdog (in this case ATI) and go against the grain of what is underway will lose out. Even if the PhysX effects are minimal, you still won't have them. Would you buy a card that didn't do AA? AA barely does anything except improve image quality; it offers no effects or physical change to the game. Would you buy a card without it? I don't think you would.
I know this sounds like marketing, and I'm actually skeeving myself out, but forget that crap. I'm not really being serious here, and you will buy what you want of course; I'm just trying to make a point. There is nothing wrong with buying an Nvidia card, unless you wish to keep ATI in business and have to buy their cards. I do understand that. I want them to stay around also, but not like this. Not in their current stagnant state. Innovate, for Christ's sake. Do something everyone will ooohh and aaahhhh about. Even CPU maker Intel has something very cool cooking. INTEL!!
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Originally posted by: Idontcare
Some of these arguments against physx are interesting. Imagine if all 3D games were rendered in grayscale only but then one of the two GPU makers figured out a way to render in color.

Would you turn it down just because it (color) wasn't necessary to play out the games at the time? Would you refuse to accept that playing a game in color just simply looks cooler and feels more awesome than playing in BW?

I'll admit I don't play any of the games you all talk about here, I play games about 2-3 yrs from the leading edge. NWN2 was my most recent game just to give you an idea. But when I look at the videos of these games and I see the special effects involved from physx I can't help but be a little bit in awe at just how darn cool it looks to me.

To debate whether flittering cloth or funky looking raindrops are neat or not just seems weird to me in an enthusiast forum. You don't have to be excited about it, but to go to a forum where most people are enthusiasts meaning they are eager and excited to see where the next-gen stuff is headed and then to try and proceed to convince those people they are wrong to care about where the next-gen stuff is headed...just seems weird.

For example I could not care less about the current marketscape for pickup trucks, but I own one. Further I am not about to join a pickup truck forum where the vocal crowd is excited about the features of the 2010 models and features, etc, only to proceed to argue with those folks that just because I could not care less about the features of a 2010 model pickup that they too should stop caring about it.

If physx is nothing of value then history will bear that out as the number of games to incorporate it will be negligible. I won't know for another 3 yrs or so when I start to play these games of hot debate around here. :laugh:

It's entirely weird. Goes against a gamer's soul if you ask me. New technology? "Give it to me NOW" is what I'd expect to hear from every single solitary enthusiast gamer.

Not happening here. But they won't have a choice soon. ATI is still continuing with their same architecture and throwing tons of shaders at the problem. But they don't understand (or maybe they do, and they're working on a new arch for 3 years down the road, HOPEFULLY) that the architecture IS the problem. Even throwing double the shaders they have now into a core will only net them 320 effective shaders if Vec5 isn't properly coded for. And it won't be if it hasn't been by now.
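For readers following the arithmetic behind the "320" figure: it falls out of simple division, sketched below under the assumption of an RV770-class part (800 stream processors arranged as 160 five-wide Vec5 units) where poorly packed code fills only one slot per unit. This is a back-of-the-envelope illustration of the post's claim, not a real performance model:

```python
def effective_scalar_rate(stream_processors, vliw_width=5, slots_used=1):
    """Scalar ops per clock when each VLIW unit issues only
    `slots_used` of its `vliw_width` slots per cycle."""
    units = stream_processors // vliw_width   # e.g. 800 SPs -> 160 vec5 units
    return units * slots_used

# Purely scalar code on 800 SPs uses 1 of 5 slots: 160 effective shaders.
print(effective_scalar_rate(800))                # 160
# Doubling the shader count without better packing only yields 320.
print(effective_scalar_rate(1600))               # 320
# Perfectly packed vec5 code would use all 800.
print(effective_scalar_rate(800, slots_used=5))  # 800
```

So the doubled-shader scenario only helps if the compiler can actually fill the wide units, which is exactly the point under dispute in the replies below.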
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,552
136
Originally posted by: Keysplayr
Quote from Scali: "I can also see reasons why nVidia's architecture would run better, as OpenCL closely matches Cuda's design, and Cuda's design is based around the nVidia architecture. ATi has a completely different architecture, and has had to add local memory to the 4000-series just to get the featureset right for OpenCL. I doubt that their 'afterthought' design is anywhere near as efficient as nVidia's is."

This is pretty much the reason I am a little confused as to why ATI fans spurn CUDA/PhysX, and are embracing OpenCL. If OpenCL is closely matched to CUDA's design, how well do you think DirectX Compute will run on ATI GPU's (if it does at all, meaning offloads to CPU) compared to Nvidia's current architecture, not to mention GT300?

I think that most people are somewhat agnostic towards PhysX. I personally don't care too much one way or another. I think most people (excepting fanboys and anti-fanboys) can see the potential but also see that PhysX is not in any way guaranteed to win out. If it does, it does and we'll buy hardware that supports it. If it doesn't, then it doesn't and we'll buy whatever hardware supports the proper physics acceleration (and other GPGPU) standards.

This also doesn't discount the fact that for ATI, as a business decision, it was the right move to not support PhysX. Technologies live and die on not only whether they are good or not but on business reasons as well.

Therein lies the rub: nVidia fanboys (not saying you) simply refuse to acknowledge that there are valid points to ATI not using PhysX, and that there are valid reasons for gamers (and others dependent on GPUs) not to fawn over PhysX like it's the best thing since sliced bread, so to speak.

PhysX is currently the best physics solution but it's still too early in the game to crown a winner. PhysX is a bit underwhelming at the moment, contrary to what the fanboys say, but the potential is there. If one is totally unbiased though, one has to see that there is the potential for PhysX to rule the roost but there is also a great chance that it just falls flat in a year or two when Havok hits back. Same thing with CUDA. It might fall to something developed by Microsoft. Again, business dictates the success of technologies as much as how good the technology is. There have been many potentially great technologies that have been beaten by seemingly inferior rivals for business reasons. The fanboys refuse to believe that.

I'm actually more interested in nVidia GPU's for video encoding than physics acceleration. However that too is still early. Unlike physics acceleration however, it is much more mature and likely to succeed IMHO.

And as an aside, I don't believe nVidia has really designed a GPU for PhysX yet. I think some of their GPU design was meant for their GPGPU uses which also helped PhysX. nVidia didn't buy Ageia until early 2008 and likely most of the design work on what would be put into the GT200 GPU cores was already set in stone. I think PhysX will get a kick in the rear in the next iteration of nVidia's GPU (not the GT300) as they truly start to integrate what they bought from Ageia into their GPU designs.

Originally posted by: Scali
nVidia doesn't NEED PhysX to sell their products. nVidia's products are successful enough on their own. And that's where the 'danger' lies. PhysX will 'sneak into' the market because it piggy-backs onto the sales of nVidia GPUs. Which is why more than 50% of all gamers already have support for PhysX. Since PhysX is free for use unlike Havok, it's very tempting for developers to use it in their games. And since they can then add extra effects with little extra effort for the 50+% of their audience that owns nVidia hardware (and through TWIMTBP nVidia will actually help you add these effects to your games), it is tempting for developers to do so. It can give them a bit of extra flash over competing games and boost sales.

I beg to differ. nVidia's products are wildly successful now, but the landscape is set to change dramatically in the next two years. First, Intel is heading into the market, and while it would be extremely hard for them to gain market share from hardcore gamers, they can easily use their CPU business for their GPUs to piggyback on. And we all know what physics product Intel will be supporting. Second, both Intel and AMD will be moving toward integrated CPU/GPUs, in which the multi-core processor contains not only two or more CPU cores but likely at least one GPU core. As processes get smaller, one can even imagine multiple CPU and GPU cores in one package. This cuts nVidia out completely.

From the above perspective, I'd say nVidia might not need PhysX now but they definitely want and need nVidia owned technologies in the market if they wish to stay relevant long term. That is assuming they don't go the Transmeta route and put out an x86 emulated CPU/GPU. Multiple x86 emulated cores along with a GPU.
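Scali's point about OpenCL closely matching CUDA's design (quoted at the top of this post) is easy to make concrete: the core abstractions of the two programming models correspond almost one-to-one. Here is a rough summary of that mapping, drawn from the public CUDA and OpenCL terminology, expressed as a lookup table:

```python
# CUDA-to-OpenCL correspondence: the core abstractions map almost 1:1,
# which is why code designed for one model ports readily to the other.
CUDA_TO_OPENCL = {
    "thread":          "work-item",
    "thread block":    "work-group",
    "grid":            "NDRange",
    "__global__":      "__kernel",
    # __shared__ maps to the on-chip local memory ATi had to add
    # to the 4000 series to get the OpenCL feature set right.
    "__shared__":      "__local",
    "threadIdx.x":     "get_local_id(0)",
    "blockIdx.x":      "get_group_id(0)",
    "blockDim.x":      "get_local_size(0)",
    "__syncthreads()": "barrier(CLK_LOCAL_MEM_FENCE)",
}

for cuda_term, ocl_term in CUDA_TO_OPENCL.items():
    print(f"{cuda_term:16} -> {ocl_term}")
```

The `__shared__`/`__local` row is the one Scali is pointing at: OpenCL's required local memory is native to nVidia's design and a retrofit on ATi's.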
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Originally posted by: Idontcare
Some of these arguments against physx are interesting. Imagine if all 3D games were rendered in grayscale only but then one of the two GPU makers figured out a way to render in color.

Would you turn it down just because it (color) wasn't necessary to play out the games at the time? Would you refuse to accept that playing a game in color just simply looks cooler and feels more awesome than playing in BW?

I'll admit I don't play any of the games you all talk about here, I play games about 2-3 yrs from the leading edge. NWN2 was my most recent game just to give you an idea. But when I look at the videos of these games and I see the special effects involved from physx I can't help but be a little bit in awe at just how darn cool it looks to me.

To debate whether flittering cloth or funky looking raindrops are neat or not just seems weird to me in an enthusiast forum. You don't have to be excited about it, but to go to a forum where most people are enthusiasts meaning they are eager and excited to see where the next-gen stuff is headed and then to try and proceed to convince those people they are wrong to care about where the next-gen stuff is headed...just seems weird.

For example I could not care less about the current marketscape for pickup trucks, but I own one. Further I am not about to join a pickup truck forum where the vocal crowd is excited about the features of the 2010 models and features, etc, only to proceed to argue with those folks that just because I could not care less about the features of a 2010 model pickup that they too should stop caring about it.

If physx is nothing of value then history will bear that out as the number of games to incorporate it will be negligible. I won't know for another 3 yrs or so when I start to play these games of hot debate around here. :laugh:


When I think of PhysX, this is the thought that hits my mind-set: tools that allow art to meet technology:

http://developer.nvidia.com/object/apex.html

The dream to me is when developers have tools where the only thing they're limited by is their own imagination. I knew about APEX last summer, and it has been neat to learn how nVidia is building up their tool set for PhysX.


Edit: hehe, going to start to use the preview, too many typos while speed typing.

 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Keysplayr
It's entirely weird. Goes against a gamer's soul if you ask me. New technology? "Give it to me NOW" is what I'd expect to hear from every single solitary enthusiast gamer.

Not happening here. But they won't have a choice soon. ATI is still continuing with their same architecture and throwing tons of shaders at the problem. But they don't understand (or maybe they do, and they're working on a new arch for 3 years down the road, HOPEFULLY) that the architecture IS the problem. Even throwing double the shaders they have now into a core will only net them 320 effective shaders if Vec5 isn't properly coded for. And it won't be if it hasn't been by now.

Not true. Physics isn't exactly a scalar science, there's a lot of vector math involved. Maybe not vec5, but vec2 and vec3 definitely. If you have properly optimized code, you can combine these instructions into AMD's vec5 units, since they are superscalar. It's not like AMD needs to adopt NV's scalar architecture to successfully implement gpu-accelerated physics, they just need to write the appropriate software to support the functionality.
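munky's packing argument can be sketched with a toy first-fit bundler: independent vec3/vec2/scalar operations from a physics inner loop get combined into 5-wide bundles, so a vec5 unit need not sit 80% idle on non-vec5 code. This is only an illustration of the idea; real shader compilers are far more sophisticated, and the op mix below is made up:

```python
def pack_bundles(op_widths, bundle_width=5):
    """First-fit packing of independent ops (given by slot width)
    into fixed-width VLIW bundles. Returns (bundles, utilization)."""
    bundles = []  # each bundle is a list of op widths summing to <= bundle_width
    for w in op_widths:
        for b in bundles:
            if sum(b) + w <= bundle_width:
                b.append(w)   # op fits into an existing bundle
                break
        else:
            bundles.append([w])  # start a new bundle
    utilization = sum(op_widths) / (len(bundles) * bundle_width)
    return bundles, utilization

# A made-up physics op mix: two vec3 ops, one vec2 op, two scalars.
ops = [3, 3, 2, 1, 1]

bundles, packed_util = pack_bundles(ops)  # packs into 2 completely full bundles
# Baseline: issuing each op in its own bundle (no packing) wastes slots.
scalar_util = sum(ops) / (len(ops) * 5)   # 10 of 25 slots used
```

With this mix, packing fills both bundles completely, versus 40% utilization when each op occupies its own bundle, which is munky's point that the software, not the architecture, decides how well vec5 hardware runs physics code.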
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: munky
Originally posted by: Keysplayr
It's entirely weird. Goes against a gamer's soul if you ask me. New technology? "Give it to me NOW" is what I'd expect to hear from every single solitary enthusiast gamer.

Not happening here. But they won't have a choice soon. ATI is still continuing with their same architecture and throwing tons of shaders at the problem. But they don't understand (or maybe they do, and they're working on a new arch for 3 years down the road, HOPEFULLY) that the architecture IS the problem. Even throwing double the shaders they have now into a core will only net them 320 effective shaders if Vec5 isn't properly coded for. And it won't be if it hasn't been by now.

Not true. Physics isn't exactly a scalar science, there's a lot of vector math involved. Maybe not vec5, but vec2 and vec3 definitely. If you have properly optimized code, you can combine these instructions into AMD's vec5 units, since they are superscalar. It's not like AMD needs to adopt NV's scalar architecture to successfully implement gpu-accelerated physics, they just need to write the appropriate software to support the functionality.

Gamers love new technology, but smart people hate proprietary technology.

PhysX is proprietary. The NV people seem to be saying that they offered it to AMD for free (which does not make sense). The AMD people are saying that NV won't give them PhysX. The truth probably lies in the middle.

I remember when NV came out with something called "C for graphics" when their FX5800 cards could not run DX9. In a sense, this is a similar situation. The current NV cards are apparently to blame for a castrated DX10, and now they throw this proprietary PhysX at us.

You'd think that NV would learn from their mistakes by now. They need to work with MS and others to develop OPEN standards.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Originally posted by: SickBeast
Originally posted by: munky
Originally posted by: Keysplayr
It's entirely weird. Goes against a gamer's soul if you ask me. New technology? "Give it to me NOW" is what I'd expect to hear from every single solitary enthusiast gamer.

Not happening here. But they won't have a choice soon. ATI is still continuing with their same architecture and throwing tons of shaders at the problem. But they don't understand (or maybe they do, and they're working on a new arch for 3 years down the road, HOPEFULLY) that the architecture IS the problem. Even throwing double the shaders they have now into a core will only net them 320 effective shaders if Vec5 isn't properly coded for. And it won't be if it hasn't been by now.

Not true. Physics isn't exactly a scalar science, there's a lot of vector math involved. Maybe not vec5, but vec2 and vec3 definitely. If you have properly optimized code, you can combine these instructions into AMD's vec5 units, since they are superscalar. It's not like AMD needs to adopt NV's scalar architecture to successfully implement gpu-accelerated physics, they just need to write the appropriate software to support the functionality.

Gamers love new technology, but smart people hate proprietary technology.

PhysX is proprietary. The NV people seem to be saying that they offered it to AMD for free (which does not make sense). The AMD people are saying that NV won't give them PhysX. The truth probably lies in the middle.

I remember when NV came out with something called "C for graphics" when their FX5800 cards could not run DX9. In a sense, this is a similar situation. The current NV cards are apparently to blame for a castrated DX10, and now they throw this proprietary PhysX at us.

You'd think that NV would learn from their mistakes by now. They need to work with MS and others to develop OPEN standards.

Is Havok proprietary?


 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Originally posted by: akugami
Originally posted by: Keysplayr
Quote from Scali: "I can also see reasons why nVidia's architecture would run better, as OpenCL closely matches Cuda's design, and Cuda's design is based around the nVidia architecture. ATi has a completely different architecture, and has had to add local memory to the 4000-series just to get the featureset right for OpenCL. I doubt that their 'afterthought' design is anywhere near as efficient as nVidia's is."

This is pretty much the reason I am a little confused as to why ATI fans spurn CUDA/PhysX, and are embracing OpenCL. If OpenCL is closely matched to CUDA's design, how well do you think DirectX Compute will run on ATI GPU's (if it does at all, meaning offloads to CPU) compared to Nvidia's current architecture, not to mention GT300?

I think that most people are somewhat agnostic towards PhysX. I personally don't care too much one way or another. I think most people (excepting fanboys and anti-fanboys) can see the potential but also see that PhysX is not in any way guaranteed to win out. If it does, it does and we'll buy hardware that supports it. If it doesn't, then it doesn't and we'll buy whatever hardware supports the proper physics acceleration (and other GPGPU) standards.

This also doesn't discount the fact that for ATI, as a business decision, it was the right move to not support PhysX. Technologies live and die on not only whether they are good or not but on business reasons as well.

Therein lies the rub: nVidia fanboys (not saying you) simply refuse to acknowledge that there are valid points to ATI not using PhysX, and that there are valid reasons for gamers (and others dependent on GPUs) not to fawn over PhysX like it's the best thing since sliced bread, so to speak.

PhysX is currently the best physics solution but it's still too early in the game to crown a winner. PhysX is a bit underwhelming at the moment, contrary to what the fanboys say, but the potential is there. If one is totally unbiased though, one has to see that there is the potential for PhysX to rule the roost but there is also a great chance that it just falls flat in a year or two when Havok hits back. Same thing with CUDA. It might fall to something developed by Microsoft. Again, business dictates the success of technologies as much as how good the technology is. There have been many potentially great technologies that have been beaten by seemingly inferior rivals for business reasons. The fanboys refuse to believe that.

I'm actually more interested in nVidia GPU's for video encoding than physics acceleration. However that too is still early. Unlike physics acceleration however, it is much more mature and likely to succeed IMHO.

And as an aside, I don't believe nVidia has really designed a GPU for PhysX yet. I think some of their GPU design was meant for their GPGPU uses which also helped PhysX. nVidia didn't buy Ageia until early 2008 and likely most of the design work on what would be put into the GT200 GPU cores was already set in stone. I think PhysX will get a kick in the rear in the next iteration of nVidia's GPU (not the GT300) as they truly start to integrate what they bought from Ageia into their GPU designs.

Originally posted by: Scali
nVidia doesn't NEED PhysX to sell their products. nVidia's products are successful enough on their own. And that's where the 'danger' lies. PhysX will 'sneak into' the market because it piggy-backs onto the sales of nVidia GPUs. Which is why more than 50% of all gamers already have support for PhysX. Since PhysX is free for use unlike Havok, it's very tempting for developers to use it in their games. And since they can then add extra effects with little extra effort for the 50+% of their audience that owns nVidia hardware (and through TWIMTBP nVidia will actually help you add these effects to your games), it is tempting for developers to do so. It can give them a bit of extra flash over competing games and boost sales.

I beg to differ. nVidia's products are wildly successful now, but the landscape is set to change dramatically in the next two years. First, Intel is heading into the market, and while it would be extremely hard for them to gain market share from hardcore gamers, they can easily use their CPU business for their GPUs to piggyback on. And we all know what physics product Intel will be supporting. Second, both Intel and AMD will be moving toward integrated CPU/GPUs, in which the multi-core processor contains not only two or more CPU cores but likely at least one GPU core. As processes get smaller, one can even imagine multiple CPU and GPU cores in one package. This cuts nVidia out completely.

From the above perspective, I'd say nVidia might not need PhysX now but they definitely want and need nVidia owned technologies in the market if they wish to stay relevant long term. That is assuming they don't go the Transmeta route and put out an x86 emulated CPU/GPU. Multiple x86 emulated cores along with a GPU.

Let's discuss the reasoning why ATI didn't.

For ATI to support PhysX they would have had to support Cuda. People say it was the right move not to support it -- why?

 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Originally posted by: Keysplayr
Originally posted by: Idontcare
Some of these arguments against physx are interesting. Imagine if all 3D games were rendered in grayscale only but then one of the two GPU makers figured out a way to render in color.

Would you turn it down just because it (color) wasn't necessary to play out the games at the time? Would you refuse to accept that playing a game in color just simply looks cooler and feels more awesome than playing in BW?

I'll admit I don't play any of the games you all talk about here, I play games about 2-3 yrs from the leading edge. NWN2 was my most recent game just to give you an idea. But when I look at the videos of these games and I see the special effects involved from physx I can't help but be a little bit in awe at just how darn cool it looks to me.

To debate whether flittering cloth or funky looking raindrops are neat or not just seems weird to me in an enthusiast forum. You don't have to be excited about it, but to go to a forum where most people are enthusiasts meaning they are eager and excited to see where the next-gen stuff is headed and then to try and proceed to convince those people they are wrong to care about where the next-gen stuff is headed...just seems weird.

For example I could not care less about the current marketscape for pickup trucks, but I own one. Further I am not about to join a pickup truck forum where the vocal crowd is excited about the features of the 2010 models and features, etc, only to proceed to argue with those folks that just because I could not care less about the features of a 2010 model pickup that they too should stop caring about it.

If physx is nothing of value then history will bear that out as the number of games to incorporate it will be negligible. I won't know for another 3 yrs or so when I start to play these games of hot debate around here. :laugh:

It's entirely weird. Goes against a gamer's soul if you ask me. New technology? "Give it to me NOW" is what I'd expect to hear from every single solitary enthusiast gamer.

Not happening here. But they won't have a choice soon. ATI is still continuing with their same architecture and throwing tons of shaders at the problem. But they don't understand (or maybe they do, and they're working on a new arch for 3 years down the road, HOPEFULLY) that the architecture IS the problem. Even throwing double the shaders they have now into a core will only net them 320 effective shaders if Vec5 isn't properly coded for. And it won't be if it hasn't been by now.

No matter how much you tell me Physx is some kind of must-have new technology, my opinion is that, as of now, it is not.

If everything else were equal -- if every single feature AMD currently has were available on an equivalent Nvidia card, if they cost pretty much the same, if the performance was pretty much the same, if the bundles/warranty/power usage characteristics (if that's important to you) were the same -- but on top of it Nvidia cards supported Physx, then why not get an Nvidia card? It would make perfect sense.

But the truth is that AMD and Nvidia both have some unique features. There is certain technology that you get with an AMD card that you do not get with Nvidia, and vice versa. Depending on your resolution, the games you are most interested in, and the price point, one company's card may be a far better choice than the other. From what I've seen of Physx, from what I've read of Physx, and from my limited first-hand experience with Physx, I would not pay more for an Nvidia card in its current state. If the Nvidia card was the better buy for my needs, then I'd buy it, but I would not buy it because of Physx.

In the future, that could very well change. If Physx continues to mature, games are built from the ground up with its capabilities in mind -- and those capabilities truly offer a much better gaming experience that you just can't have at the same level without Physx -- and its competitors fail to impress or never materialize, then that could change. Physx could be a 'must-have'. I just do not see it as that right now. And by the time it could be, I doubt we'll want to run the games that are current at that time on today's video hardware.

If Physx were a must-have now, I don't think it would need so much cheerleading from the pro-Nvidia camp. I would think it would easily stand on its own. When C2D launched, it did not need people associated with Intel or Intel fanboys to convince those with no affiliation, or AMD fanboys, that they needed to experience computing on a C2D. Obviously it's easier to compare benchmarks, and it's pretty hard to argue with numbers, so it's not a true apples-to-apples comparison. But it's easy to see what Physx currently has to offer; there are articles and videos all over. It's easy to form an opinion on it just the same, so that's why I chose that as a comparison.



 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: Wreckage
Originally posted by: SickBeast


Gamers love new technology, but smart people hate proprietary technology.

DirectX is proprietary.

Yeah I don't get the argument/statement either. The games you buy to play on your hardware are certainly proprietary.

That Crysis game you bought and now never play isn't helping you one bit in playing Far Cry 2 or whatever.

We plunk down money on individual games that have little relation to the next game we buy but when it comes to the technology in the GPU then we suddenly care whether it is proprietary?

Beast maybe you meant to say one thing but wrote it in a way that is being easily misinterpreted? I'm not following the logic here.

If you want to play FarCry2 you have no choice but to buy this proprietary thing called "FarCry2, the game". Without that piece of proprietary technology you can't play FarCry2.

(insert different game name as desired if FarCry2 isn't hip enough to be relevant to the audience)
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: Wreckage
Originally posted by: SickBeast


Gamers love new technology, but smart people hate proprietary technology.

DirectX is proprietary.

Yes, however it is not as exclusive as PhysX is, in the sense that DX will run on all modern GPU hardware; you simply have to purchase a copy of Windows. PhysX will run on NV hardware, and that's all.

I would personally much rather have everyone using OpenGL over DX, but that is for another discussion.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: Wreckage
Originally posted by: SirPauly

Is Havok proprietary?

As much as PhysX is.

It's no wonder that neither technology is taking off. It's similar to Blu-ray vs. HD DVD. The best bet for the consumer is to wait it out for a standard to be established. Otherwise it's a complete waste of money for the people who chose the wrong standard.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Wreckage
Originally posted by: SickBeast


Gamers love new technology, but smart people hate proprietary technology.

DirectX is proprietary.

Not in the same way that PhysX and CUDA are. DX is hardware-agnostic; it runs fine on any HW as long as the required support is there. CUDA only runs on NV hardware, and that's a big difference. It goes along the same reasoning why many people refuse to buy the iPhone: it's also locked into Apple's proprietary SW and accessories.
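The distinction Munky is drawing can be sketched in pseudocode. This is purely illustrative (none of these names are real engine or driver APIs): a DX-style path gates on capability, regardless of vendor, while a PhysX/CUDA-style path gates on the GPU maker itself, so every non-NV card falls back to the CPU no matter how capable it is.

```python
# Illustrative sketch only, not real APIs: contrasting a capability check
# (DX-style, vendor-agnostic) with a vendor check (PhysX/CUDA-style).

def pick_physics_backend(gpu_vendor: str, has_required_features: bool) -> str:
    """Return which physics path a hypothetical game would take."""
    # DX-style gate: any vendor works, as long as the feature support is there.
    if not has_required_features:
        return "no hardware support"
    # PhysX/CUDA-style gate: acceleration keys on the vendor, not the capability.
    if gpu_vendor == "NVIDIA":
        return "GPU-accelerated PhysX"
    # Everyone else gets the CPU fallback, whatever their GPU could do.
    return "CPU physics fallback"

for vendor in ("NVIDIA", "ATI", "Intel"):
    print(vendor, "->", pick_physics_backend(vendor, True))
```

Same capable hardware, different outcome, purely because of the vendor branch: that's the lock-in complaint in a nutshell.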
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: Idontcare
Originally posted by: Wreckage
Originally posted by: SickBeast


Gamers love new technology, but smart people hate proprietary technology.

DirectX is proprietary.

Yeah I don't get the argument/statement either. The games you buy to play on your hardware are certainly proprietary.

That Crysis game you bought and now never play isn't helping you one bit in playing Far Cry 2 or whatever.

We plunk down money on individual games that have little relation to the next game we buy but when it comes to the technology in the GPU then we suddenly care whether it is proprietary?

Beast maybe you meant to say one thing but wrote it in a way that is being easily misinterpreted? I'm not following the logic here.

If you want to play FarCry2 you have no choice but to buy this proprietary thing called "FarCry2, the game". Without that piece of proprietary technology you can't play FarCry2.

(insert different game name as desired if FarCry2 isn't hip enough to be relevant to the audience)

By proprietary I meant that PhysX is a closed standard which will only run on NV hardware. I'm somewhat shocked that you all did not comprehend that immediately.

IME, anything proprietary (or closed, whatever you want to call it) on a GPU has been a failure, with very few exceptions (Glide being the big one).

C for graphics was a failure.
DX10.1 was a failure.
PhysX is so far a failure.
DX8.1 was a failure.
HDR lighting was a failure early on.

I will say that each of these technologies so far has gone on to become a success (aside from C for graphics AFAIK). The good news is that PhysX will probably eventually find its way into some sort of open standard.

Even Glide was a failure if you look at things over the long term.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: SlowSpyder
If Physx was a must-have now, I don't think it would need so much cheerleading from the pro-Nvidia camp.

One could just as easily say that if PhysX was not such a threat it would not need so much bashing from the anti-NVIDIA camp.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: SickBeast
Originally posted by: Wreckage
Originally posted by: SickBeast


Gamers love new technology, but smart people hate proprietary technology.

DirectX is proprietary.

Yes, however it is not as exclusive as PhysX is, in the sense that DX will run on all modern GPU hardware; you simply have to purchase a copy of Windows. PhysX will run on NV hardware, and that's all.

I would personally much rather have everyone using OpenGL over DX, but that is for another discussion.

LOL! I love how you add "you simply have to purchase a copy of Windows". Which invalidates your whole statement.

Well anyone can run PhysX, you simply have to buy an NVIDIA card.

Ha!!!
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Originally posted by: Wreckage
I think the ATI fans are worried that PhysX could be the final nail in the ATI coffin. That's why they are so concerned with it. It's free and it enhances games, yet they are rallying against it.

Whatever. Good luck with that.

I'm happy with it. Game developers keep signing up for it. 67% of the video cards sold last quarter support it. Pure success if you ask me.

Check out this PhysX video for Sacred 2. Awesome! What's not to love?
http://www.youtube.com/watch?v=jTrEnFCoYNE

Actually, Wreckage, I did like that demo. But Meteor was way better.

 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: Wreckage
Originally posted by: SickBeast
Originally posted by: Wreckage
Originally posted by: SickBeast


Gamers love new technology, but smart people hate proprietary technology.

DirectX is proprietary.

Yes, however it is not as exclusive as PhysX is, in the sense that DX will run on all modern GPU hardware; you simply have to purchase a copy of Windows. PhysX will run on NV hardware, and that's all.

I would personally much rather have everyone using OpenGL over DX, but that is for another discussion.

LOL! I love how you add "you simply have to purchase a copy of Windows". Which invalidates your whole statement.

Well anyone can run PhysX, you simply have to buy an NVIDIA card.

Ha!!!

O RLY?

DX runs on all graphics cards. PhysX only runs on NV cards.

Please don't make me write that a third time. Read Munky's post while you're at it.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Originally posted by: Wreckage
Originally posted by: SlowSpyder
If Physx was a must-have now, I don't think it would need so much cheerleading from the pro-Nvidia camp.

One could just as easily say that if PhysX was not such a threat it would not need so much bashing from the anti-NVIDIA camp.

I don't believe many here have bashed Physx. I think a lot of people have stated that they think the technology has great potential, but that where it is right now does not make them feel it is a 'must-have'.

Just because someone honestly believes Physx is not yet at a point where they need/want it for gaming, or isn't impressed with what it currently has to offer, doesn't mean they are bashing it.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Originally posted by: Wreckage
Originally posted by: SlowSpyder
If Physx was a must-have now, I don't think it would need so much cheerleading from the pro-Nvidia camp.

One could just as easily say that if PhysX was not such a threat it would not need so much bashing from the anti-NVIDIA camp.

That's not a fair statement. The other argument would be that NV marketing PhysX on forums pisses ATI guys off. Rightly so.

Right now NV has the lead in physics. But right now doesn't matter. DX11 is what matters, and that brings free physics to all. Or are ya saying AMD, Havok, Intel and developers aren't ready for change?