
DX11 questionnaire from nVidia

Page 5

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Why would the issue be the CPU? Most of the effects in Batman seem to be possible with previous tech such as the fog effects and stuff like caution tape that breaks when you cross it. I think there was also some extra glass breaking.
It is the difference between a normal graphics effect and a physics-based one. The fog rolling off of your body in Batman is a physics-based effect; the fog by itself is a simple graphics effect. The papers being tossed around at your feet when you walk over them are a physics-based effect; debris randomly flying around is a simple graphics effect. That the 58xx parts can very, very easily render the visuals involved is not in question in any way, but handling the calculations that let those graphics effects interact with the physical world around them is a task that must currently fall on the CPU in AMD-equipped systems.

Don't get me wrong, it increases the immersion of the player into the game but these effects seem possible before PhysX.
They are absolutely possible, but they are extremely processor-intensive. This is where something like PhysX helps out: GPUs have more than an order of magnitude more power for handling these calculations than CPUs do. This could easily be done by another API; someone creating something under OpenCL is the most likely solution. But there is no doubt that GPUs will remain vastly superior to CPUs for physics-based calculations for a long while, particularly compared to desktop x86 offerings.

Also, anyone who wants full eye candy usually has a tricked out system anyways so needing a more powerful CPU vs needing an extra video card for PhysX isn't going to be much of a barrier for those who really want the extra effects. I just question whether the effects were possible only with PhysX like nVidia claims.
You can turn them on if you'd like. The numbers I have seen indicate that a 9800GT will be faster with PhysX on than an i7 paired with a 5870, by a decent amount too. The calculations are too intensive for the FP capabilities of current processors.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,606
134
106
Originally posted by: BenSkywalker
Why would the issue be the CPU? Most of the effects in Batman seem to be possible with previous tech such as the fog effects and stuff like caution tape that breaks when you cross it. I think there was also some extra glass breaking.
It is the difference between a normal graphics effect and a physics-based one. The fog rolling off of your body in Batman is a physics-based effect; the fog by itself is a simple graphics effect. The papers being tossed around at your feet when you walk over them are a physics-based effect; debris randomly flying around is a simple graphics effect. That the 58xx parts can very, very easily render the visuals involved is not in question in any way, but handling the calculations that let those graphics effects interact with the physical world around them is a task that must currently fall on the CPU in AMD-equipped systems.

Don't get me wrong, it increases the immersion of the player into the game but these effects seem possible before PhysX.
They are absolutely possible, but they are extremely processor-intensive. This is where something like PhysX helps out: GPUs have more than an order of magnitude more power for handling these calculations than CPUs do. This could easily be done by another API; someone creating something under OpenCL is the most likely solution. But there is no doubt that GPUs will remain vastly superior to CPUs for physics-based calculations for a long while, particularly compared to desktop x86 offerings.

Also, anyone who wants full eye candy usually has a tricked out system anyways so needing a more powerful CPU vs needing an extra video card for PhysX isn't going to be much of a barrier for those who really want the extra effects. I just question whether the effects were possible only with PhysX like nVidia claims.
You can turn them on if you'd like. The numbers I have seen indicate that a 9800GT will be faster with PhysX on than an i7 paired with a 5870, by a decent amount too. The calculations are too intensive for the FP capabilities of current processors.
I hope you aren't trying to imply that Radeon cards can't do those physics effects if they are programmed in a language that Radeon cards can run. Because they can.

 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: GaiaHunter
Originally posted by: BenSkywalker
Why would the issue be the CPU? Most of the effects in Batman seem to be possible with previous tech such as the fog effects and stuff like caution tape that breaks when you cross it. I think there was also some extra glass breaking.
It is the difference between a normal graphics effect and a physics-based one. The fog rolling off of your body in Batman is a physics-based effect; the fog by itself is a simple graphics effect. The papers being tossed around at your feet when you walk over them are a physics-based effect; debris randomly flying around is a simple graphics effect. That the 58xx parts can very, very easily render the visuals involved is not in question in any way, but handling the calculations that let those graphics effects interact with the physical world around them is a task that must currently fall on the CPU in AMD-equipped systems.

Don't get me wrong, it increases the immersion of the player into the game but these effects seem possible before PhysX.
They are absolutely possible, but they are extremely processor-intensive. This is where something like PhysX helps out: GPUs have more than an order of magnitude more power for handling these calculations than CPUs do. This could easily be done by another API; someone creating something under OpenCL is the most likely solution. But there is no doubt that GPUs will remain vastly superior to CPUs for physics-based calculations for a long while, particularly compared to desktop x86 offerings.

Also, anyone who wants full eye candy usually has a tricked out system anyways so needing a more powerful CPU vs needing an extra video card for PhysX isn't going to be much of a barrier for those who really want the extra effects. I just question whether the effects were possible only with PhysX like nVidia claims.
You can turn them on if you'd like. The numbers I have seen indicate that a 9800GT will be faster with PhysX on than an i7 paired with a 5870, by a decent amount too. The calculations are too intensive for the FP capabilities of current processors.
I hope you aren't trying to imply that Radeon cards can't do those physics effects if they are programmed in a language that Radeon cards can run. Because they can.
To say that they "should" be able to when programmed correctly is more accurate. Let's see how a 5870 fares with DX11 and physics programmed through OpenCL before we claim a Radeon card can or cannot run these effects. I think they can too, but how easy/difficult is it to get them coded properly? We'll see soon enough, I suppose.
 

JimmiG

Platinum Member
Feb 24, 2005
2,024
112
106
Nvidia is just trying to squeeze as much profit as possible from their acquisition of Ageia. Once games begin to use the various open standards for GPU-assisted physics, PhysX won't matter - if developers are going to spend the time and money to implement hardware physics, they'll choose an API that works with both ATI and Nvidia cards. Same reason that 3dfx Glide died. Proprietary APIs just don't seem to survive for long.

So Nvidia wants to use PhysX as a sales argument until the very end. They essentially paid the full price for getting hardware physics noticed. Without Nvidia pushing PhysX, I'm not sure other hardware physics APIs would have emerged so soon. Ageia's own PhysX cards were not exactly taking the gaming market by storm.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,606
134
106
Originally posted by: Keysplayr
Originally posted by: GaiaHunter
Originally posted by: BenSkywalker
Why would the issue be the CPU? Most of the effects in Batman seem to be possible with previous tech such as the fog effects and stuff like caution tape that breaks when you cross it. I think there was also some extra glass breaking.
It is the difference between a normal graphics effect and a physics-based one. The fog rolling off of your body in Batman is a physics-based effect; the fog by itself is a simple graphics effect. The papers being tossed around at your feet when you walk over them are a physics-based effect; debris randomly flying around is a simple graphics effect. That the 58xx parts can very, very easily render the visuals involved is not in question in any way, but handling the calculations that let those graphics effects interact with the physical world around them is a task that must currently fall on the CPU in AMD-equipped systems.

Don't get me wrong, it increases the immersion of the player into the game but these effects seem possible before PhysX.
They are absolutely possible, but they are extremely processor-intensive. This is where something like PhysX helps out: GPUs have more than an order of magnitude more power for handling these calculations than CPUs do. This could easily be done by another API; someone creating something under OpenCL is the most likely solution. But there is no doubt that GPUs will remain vastly superior to CPUs for physics-based calculations for a long while, particularly compared to desktop x86 offerings.

Also, anyone who wants full eye candy usually has a tricked out system anyways so needing a more powerful CPU vs needing an extra video card for PhysX isn't going to be much of a barrier for those who really want the extra effects. I just question whether the effects were possible only with PhysX like nVidia claims.
You can turn them on if you'd like. The numbers I have seen indicate that a 9800GT will be faster with PhysX on than an i7 paired with a 5870, by a decent amount too. The calculations are too intensive for the FP capabilities of current processors.
I hope you aren't trying to imply that Radeon cards can't do those physics effects if they are programmed in a language that Radeon cards can run. Because they can.
To say that they "should" be able to when programmed correctly is more accurate. Let's see how a 5870 fares with DX11 and physics programmed through OpenCL before we claim a Radeon card can or cannot run these effects. I think they can too, but how easy/difficult is it to get them coded properly? We'll see soon enough, I suppose.
Considering both Nvidia and ATI demoed their graphics cards doing PhysX when Ageia came out (I remember a setup of three X1950XTXs, for example), I would say it isn't rocket science.

How they compare in speed is a bit hard to say, though.

Although it would be fun if, say, ATI had adopted PhysX, ATI cards performed well, and then a new PhysX driver release made them perform worse...

You can imagine the ruckus in places like these forums :)

My post was just pointing out that there are no known hardware limitations that would prevent AMD/ATI cards from producing physics effects.
 

Schmide

Diamond Member
Mar 7, 2002
5,375
292
126
Originally posted by: cusideabelincoln
Holy crap, I think this is the first poster to correctly deduce my username. After using this name for damn near a decade... all I have to say is wow! Most people at least need a hint, and the vast majority need a complete explanation.
To be honest, until the question was asked, I had always read it without really looking at it as

cusidea be lincoln
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: Schmide
Originally posted by: cusideabelincoln
Holy crap, I think this is the first poster to correctly deduce my username. After using this name for damn near a decade... all I have to say is wow! Most people at least need a hint, and the vast majority need a complete explanation.
To be honest, until the question was asked, I had always read it without really looking at it as

cusidea be lincoln
I always just thought it read "cause I the Abe Lincoln" cus i de Abe Lincoln.

 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: GaiaHunter
Originally posted by: Keysplayr
Originally posted by: GaiaHunter
Originally posted by: BenSkywalker
Why would the issue be the CPU? Most of the effects in Batman seem to be possible with previous tech such as the fog effects and stuff like caution tape that breaks when you cross it. I think there was also some extra glass breaking.
It is the difference between a normal graphics effect and a physics-based one. The fog rolling off of your body in Batman is a physics-based effect; the fog by itself is a simple graphics effect. The papers being tossed around at your feet when you walk over them are a physics-based effect; debris randomly flying around is a simple graphics effect. That the 58xx parts can very, very easily render the visuals involved is not in question in any way, but handling the calculations that let those graphics effects interact with the physical world around them is a task that must currently fall on the CPU in AMD-equipped systems.

Don't get me wrong, it increases the immersion of the player into the game but these effects seem possible before PhysX.
They are absolutely possible, but they are extremely processor-intensive. This is where something like PhysX helps out: GPUs have more than an order of magnitude more power for handling these calculations than CPUs do. This could easily be done by another API; someone creating something under OpenCL is the most likely solution. But there is no doubt that GPUs will remain vastly superior to CPUs for physics-based calculations for a long while, particularly compared to desktop x86 offerings.

Also, anyone who wants full eye candy usually has a tricked out system anyways so needing a more powerful CPU vs needing an extra video card for PhysX isn't going to be much of a barrier for those who really want the extra effects. I just question whether the effects were possible only with PhysX like nVidia claims.
You can turn them on if you'd like. The numbers I have seen indicate that a 9800GT will be faster with PhysX on than an i7 paired with a 5870, by a decent amount too. The calculations are too intensive for the FP capabilities of current processors.
I hope you aren't trying to imply that Radeon cards can't do those physics effects if they are programmed in a language that Radeon cards can run. Because they can.
To say that they "should" be able to when programmed correctly is more accurate. Let's see how a 5870 fares with DX11 and physics programmed through OpenCL before we claim a Radeon card can or cannot run these effects. I think they can too, but how easy/difficult is it to get them coded properly? We'll see soon enough, I suppose.
Considering both Nvidia and ATI demoed their graphics cards doing PhysX when Ageia came out (I remember a setup of three X1950XTXs, for example), I would say it isn't rocket science.

How they compare in speed is a bit hard to say, though.

Although it would be fun if, say, ATI had adopted PhysX, ATI cards performed well, and then a new PhysX driver release made them perform worse...

You can imagine the ruckus in places like these forums :)

My post was just pointing out that there are no known hardware limitations that would prevent AMD/ATI cards from producing physics effects.
I see what you're pointing out. And by the same token, if there aren't any limitations, I think we would have seen physics on ATI hardware implemented a long time ago.
Whether it be Havok, Bullet, whatever - even ATI's own. Nobody pursued it. Nobody made it happen. Maybe they just weren't ready, or were just waiting on OpenCL. Who knows.

And, there will always be a ruckus over anything and everything. It's like Manchester United in here sometimes. :)
 

Kuzi

Senior member
Sep 16, 2007
572
0
0
Originally posted by: Keysplayr
To say that they "should" be able to when programmed correctly is more accurate. Let's see how a 5870 fares with DX11 and physics programmed through OpenCL before we claim a Radeon card can or cannot run these effects. I think they can too, but how easy/difficult is it to get them coded properly? We'll see soon enough, I suppose.
Of course Keys, Fermi "should" be able to run physics when programmed correctly too. The question is how easy or difficult it will be to do that, and how much of a performance hit (if any) nV hardware will incur running through OpenCL instead of PhysX. Remember, in the early accelerated 3D days, Glide games ran faster than Direct3D on 3dfx hardware. Glide had much better developer support than PhysX has now, but it still disappeared in the end.

A year from now, PhysX will most likely be done with. So your worry about ATI hardware running DX11, OpenCL, physics, etc. also applies to Nvidia. I guess we'll know next year when Fermi is released.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,606
134
106
Originally posted by: Keysplayr

I see what you're pointing out. And by the same token, if there aren't any limitations, I think we would have seen physics on ATI hardware implemented a long time ago.
Whether it be Havok, Bullet, whatever - even ATI's own. Nobody pursued it. Nobody made it happen. Maybe they just weren't ready, or were just waiting on OpenCL. Who knows.
I don't agree that we would have seen something, or that not seeing physics on ATI hardware means anything other than AMD/ATI waiting for a vendor-agnostic implementation.

In reality, only now is there a title using PhysX that draws some interest.

They can wait.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: Kuzi
Originally posted by: Keysplayr
To say that they "should" be able to when programmed correctly is more accurate. Let's see how a 5870 fares with DX11 and physics programmed through OpenCL before we claim a Radeon card can or cannot run these effects. I think they can too, but how easy/difficult is it to get them coded properly? We'll see soon enough, I suppose.
Of course Keys, Fermi "should" be able to run physics when programmed correctly too. The question is how easy or difficult it will be to do that, and how much of a performance hit (if any) nV hardware will incur running through OpenCL instead of PhysX. Remember, in the early accelerated 3D days, Glide games ran faster than Direct3D on 3dfx hardware. Glide had much better developer support than PhysX has now, but it still disappeared in the end.

A year from now, PhysX will most likely be done with. So your worry about ATI hardware running DX11, OpenCL, physics, etc. also applies to Nvidia. I guess we'll know next year when Fermi is released.
OpenCL, according to many articles we've read, is very similar to OpenCL albeit a bit more advanced. According to those articles (which I can find if you haven't seen them) it's actually a trivial thing to modify the CUDA programming language to OpenCL. It's actually already done as you can see OpenCL drivers are available for Win7. PhysX won't be going away. 3Dfx was financially destitute and was consumed. Nvidia doesn't seem to be in that sort of trouble: a $7.5 billion market cap, almost a billion of it in cash. There is another company, though, that actually made the top 20 most likely to go bankrupt in 2010. And it doesn't start with an N. I'm not saying this to aggravate you. This is common knowledge to all. Just referencing other sources, and you brought up 3Dfx going under.
PhysX will run atop OpenCL is all.
Do you expect Havok, or Bullet physics to disappear because OpenCL arrives?
A year from now may be nothing like what we see today. In fact it may look a bit alien to us. VERY large changes are happening all across the industry. Should make for some very interesting conversations.


 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: GaiaHunter
Originally posted by: Keysplayr

I see what you're pointing out. And by the same token, if there aren't any limitations, I think we would have seen physics on ATI hardware implemented a long time ago.
Whether it be Havok, Bullet, whatever - even ATI's own. Nobody pursued it. Nobody made it happen. Maybe they just weren't ready, or were just waiting on OpenCL. Who knows.
I don't agree that we would have seen something, or that not seeing physics on ATI hardware means anything other than AMD/ATI waiting for a vendor-agnostic implementation.

In reality, only now is there a title using PhysX that draws some interest.

They can wait.
You could be very right. It could also be the exact reverse. We don't know yet. But like I said, all will become clearer in a few months after Win7 retail gets out there and established. Heck, I bet more people run Win7 RC than they do Vista. LOL.

 

waffleironhead

Diamond Member
Aug 10, 2005
6,553
88
101
Originally posted by: cusideabelincoln
Originally posted by: waffleironhead
Originally posted by: SirPauly
Originally posted by: SlowSpyder Yea, maybe AMD should have taken a page out of Nvidia's book regarding their 8800 -> 9800 transition. Now that's innovation! :thumbsup: :D
They did and were very similar actually!

When ATI moved to the RV670, they offered similar performance to the R600 but at much lower price points.

When nVidia moved to the G92, they offered similar performance to the G80 but at much lower price points.
I think you missed the sarcasm...

He wasn't being sarcastic.

With the transition from the HD2000 to the HD3000 series, AMD actually significantly lowered power consumption and lowered price by making the GPUs on a smaller manufacturing process.

The 9800GT is simply a rebadged 8800GT. There is no difference between the cards at all, besides the name. The same can be said for the 512MB 9800GTX+ to GTS250 transition: you weren't guaranteed to get a GTS250 with the new board and lower power consumption, so some cards were simply re-badged. And now OEMs can have the GTS240, which is a rebadged 9800GT, which is a rebadged 8800GT.

edit: why is my text italicized?... Strange that the quote feature left an unclosed markup, because I definitely didn't erase it.
You missed the point as well. nV was saying that, unlike their own ongoing innovation, AMD just keeps doing shrinks:

"We are confident that when the Fermi-based GeForce products ship that we will maintain our performance crown. Even today our GeForce GTX 285 significantly outperforms the HD 5870 in next gen games like Batman Arkham Asylum with physics. In fact, even our GeForce GTS 250 outperforms it. That's fundamentally the difference between us and AMD. We focus on graphics plus cool effects like physics and 3D stereo, whereas AMD just makes incremental changes to their graphics cards."

He was being sarcastic. Shrinks are not innovation.
 

konakona

Diamond Member
May 6, 2004
6,285
1
0
Feature-wise I consider them even, for the stuff that matters. Eyefinity looks cool, but I would think most people only have enough $$$ for a one-monitor setup. Sure enough, there are no DX11 games out there now, just like there are no more than a handful of games that have PhysX in them. Not interested in Batman (didn't even like the movie), but Cryostasis is something I would like to try sometime soon.

The only 'cool' thing I tried was 3D Vision; they had it (with NV logo and all) hooked up to a large TV (60'' IIRC) at the local Fry's. I spent about 10 minutes on it, switching back and forth between on and off, and my verdict was a total meh - thoroughly unimpressive. The goggles were as clunky as ever, and my friend with glasses couldn't even use them. No way am I buying one video card over another for that gimmick, if that's all they've got to offer; hopefully it works better on actual games.
 

konakona

Diamond Member
May 6, 2004
6,285
1
0
Originally posted by: Keysplayr
Originally posted by: Astrallite
In other words, NVIDIA innovation: Batman, Arkham Asylum.
Is that all you think they've done in the last 3 years? :D
I see you fold. But I would wager 90%+ of gamers couldn't care less, and fewer than half even know what it is to begin with. I don't see anyone (read: gamers, the majority who buy these discrete video cards) really putting CUDA to good use. I guess it's about as useless as DX11 is at this point.

PhysX is probably about the only thing that **might** matter, and even then, for the time being it is far from a deciding factor for most.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: konakona
Originally posted by: Keysplayr
Originally posted by: Astrallite
In other words, NVIDIA innovation: Batman, Arkham Asylum.
Is that all you think they've done in the last 3 years? :D
I see you fold. But I would wager 90%+ of gamers couldn't care less, and fewer than half even know what it is to begin with. I don't see anyone (read: gamers, the majority who buy these discrete video cards) really putting CUDA to good use. I guess it's about as useless as DX11 is at this point.

PhysX is probably about the only thing that **might** matter, and even then, for the time being it is far from a deciding factor for most.
Nvidia isn't "just" about gaming anymore. And when I say that, it doesn't "lessen" their gaming focus. They sought to augment their product so it could penetrate a much wider market than just gaming. They now practically have a supercomputer on a chip.

You mentioned Batman:AA and Folding. Millions fold worldwide who don't have a clue about gaming either. They just know they can help fight cancer faster if they use GPUs in addition to their CPUs. I'm not going to go through the whole list, man; that would take hours. Just visit Nvidia's site in the CUDA section. All I'm sayin' is, "It ain't just Batman and Folding."
I will totally agree with you "now" that gamers won't put CUDA to good use. But Microsoft will "for" them with Win7 and beyond: DirectCompute, OpenCL. Soon, it may not even be a conscious thing for the user to utilize GPU power.

 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: konakona
Feature-wise I consider them even, for the stuff that matters. Eyefinity looks cool, but I would think most people only have enough $$$ for a one-monitor setup. Sure enough, there are no DX11 games out there now, just like there are no more than a handful of games that have PhysX in them. Not interested in Batman (didn't even like the movie), but Cryostasis is something I would like to try sometime soon.

The only 'cool' thing I tried was 3D Vision; they had it (with NV logo and all) hooked up to a large TV (60'' IIRC) at the local Fry's. I spent about 10 minutes on it, switching back and forth between on and off, and my verdict was a total meh - thoroughly unimpressive. The goggles were as clunky as ever, and my friend with glasses couldn't even use them. No way am I buying one video card over another for that gimmick, if that's all they've got to offer; hopefully it works better on actual games.
So all you looked at was the 3D test built into the 3D Vision driver? They didn't let anyone try any games? That sux.

 

konakona

Diamond Member
May 6, 2004
6,285
1
0
Originally posted by: Keysplayr
Originally posted by: konakona
Feature-wise I consider them even, for the stuff that matters. Eyefinity looks cool, but I would think most people only have enough $$$ for a one-monitor setup. Sure enough, there are no DX11 games out there now, just like there are no more than a handful of games that have PhysX in them. Not interested in Batman (didn't even like the movie), but Cryostasis is something I would like to try sometime soon.

The only 'cool' thing I tried was 3D Vision; they had it (with NV logo and all) hooked up to a large TV (60'' IIRC) at the local Fry's. I spent about 10 minutes on it, switching back and forth between on and off, and my verdict was a total meh - thoroughly unimpressive. The goggles were as clunky as ever, and my friend with glasses couldn't even use them. No way am I buying one video card over another for that gimmick, if that's all they've got to offer; hopefully it works better on actual games.
So all you looked at was the 3D test built into the 3D Vision driver? They didn't let anyone try any games? That sux.
Right. I'm not saying 3D Vision is useless; I haven't seen what it can do at home with the games I'm familiar with. They (Fry's, or whoever runs the ad campaign) really could have done a better job, though.
 

konakona

Diamond Member
May 6, 2004
6,285
1
0
Originally posted by: Keysplayr
Originally posted by: konakona
Originally posted by: Keysplayr
Originally posted by: Astrallite
In other words, NVIDIA innovation: Batman, Arkham Asylum.
Is that all you think they've done in the last 3 years? :D
I see you fold. But I would wager 90%+ of gamers couldn't care less, and fewer than half even know what it is to begin with. I don't see anyone (read: gamers, the majority who buy these discrete video cards) really putting CUDA to good use. I guess it's about as useless as DX11 is at this point.

PhysX is probably about the only thing that **might** matter, and even then, for the time being it is far from a deciding factor for most.
Nvidia isn't "just" about gaming anymore. And when I say that, it doesn't "lessen" their gaming focus. They sought to augment their product so it could penetrate a much wider market than just gaming. They now practically have a supercomputer on a chip.

You mentioned Batman:AA and Folding. Millions fold worldwide who don't have a clue about gaming either. They just know they can help fight cancer faster if they use GPUs in addition to their CPUs. I'm not going to go through the whole list, man; that would take hours. Just visit Nvidia's site in the CUDA section. All I'm sayin' is, "It ain't just Batman and Folding."
I will totally agree with you "now" that gamers won't put CUDA to good use. But Microsoft will "for" them with Win7 and beyond: DirectCompute, OpenCL. Soon, it may not even be a conscious thing for the user to utilize GPU power.
I am actually quite excited about their Tegra stuff, and probably not alone in that. They should make an iPod killer with it.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,606
134
106
Originally posted by: Keysplayr

Nvidia doesn't seem to be in that sort of trouble: a $7.5 billion market cap, almost a billion of it in cash.
That is true, but money can be lost quite fast. The loss of the chipset market won't do any good either.


There is another company though that actually made the top 20 most likely to go bankrupt in 2010. And it doesn't start with an N. I'm not saying this to aggravate you. This is common knowledge to all. Just referencing other sources and you brought up 3Dfx going under.
I still think it isn't that easy for AMD to just disappear - their technology is valuable, and even that wouldn't necessarily mean ATI splitting from AMD.

Most likely, AMD will still be around for a few years.

If it isn't, well, I'm not sure how much NVIDIA would benefit - the juggernaut is still there, and I bet it doesn't like NVIDIA's juggernaut-like manners.


PhysX will run atop OpenCL is all.
Do you expect Havok, or Bullet physics to disappear because OpenCL arrives?
A year from now may be nothing like what we see today. In fact it may look a bit alien to us. VERY large changes are happening all across the industry. Should make for some very interesting conversations.
I guess people are just expecting, considering the devs are so lazy they even have to phone AMD or NVIDIA to implement AA :p, that if the devs have a path that works for both vendors, they will choose that one instead of a vendor-specific API.




 

Kuzi

Senior member
Sep 16, 2007
572
0
0
Originally posted by: Keysplayr
OpenCL, according to many articles we've read, is very similar to OpenCL albeit a bit more advanced. According to those articles (which I can find if you haven't seen them) it's actually a trivial thing to modify the CUDA programming language to OpenCL. It's actually already done as you can see OpenCL drivers are available for Win7. PhysX won't be going away. 3Dfx was financially destitute and was consumed.
OK, I assume you meant CUDA and OpenCL are similar; maybe they are, I'm not sure. CUDA is proprietary (closed); OpenCL is not. And OpenCL drivers being available for Win7 - what does that have to do with CUDA?

Nvidia doesn't seem to be in that sort of trouble: a $7.5 billion market cap, almost a billion of it in cash. There is another company, though, that actually made the top 20 most likely to go bankrupt in 2010. And it doesn't start with an N. I'm not saying this to aggravate you. This is common knowledge to all. Just referencing other sources, and you brought up 3Dfx going under.
I was talking about Glide disappearing, not 3dfx or nV for that matter. After years of Glide developer support, everyone moved to OpenGL/DirectX, because proprietary standards like Glide and PhysX are dead ends. I never said Nvidia was going under; I don't think so at all.

As for AMD/ATI going bankrupt: if it happens, people will have to make do with buying Fermi cards for $1000.

PhysX will run atop OpenCL is all.
Do you expect Havok, or Bullet physics to disappear because OpenCL arrives?
A year from now may be nothing like what we see today. In fact it may look a bit alien to us. VERY large changes are happening all across the industry. Should make for some very interesting conversations.
PhysX will run atop OpenCL? So if ATI, Intel and S3 have supporting OpenCL drivers, does that mean PhysX will run on their hardware as well? If it does, then I agree, PhysX may live on.

Bullet Physics will run on OpenCL and DirectX 11, open standards that support all hardware. But it does not matter whether developers use the Bullet physics engine, Havok, or something else; what matters is that whatever physics engine they decide to use runs through OpenCL and DX11.

Will Havok and Bullet disappear? I don't know, and I don't care; let the developers decide which engine they prefer to use for physics.
 

Shmee

Memory and Storage, Graphics Cards
Super Moderator
Sep 13, 2008
5,349
997
126
Talking about feature sets, has ATI come up with something that folds well yet? Just wondering.
 
