Possible AMD support for CUDA coming

Mem

Lifer
Apr 23, 2000
21,476
13
81
I thought the comment below the statement was interesting.


Ben - Let's be very clear - AMD is NOT adopting Nvidia's closed, proprietary development environment that only works on their GPUs.

We ARE continuing our long standing practice of supporting industry standards that free developers to target all hardware platforms - and it's worth noting that OpenCL is for both CPUs and GPUs and our upcoming SDK will support both, so we aren't driving developers toward one architecture or the other, unlike our competitors.

Again, developers should be free to target the processor architecture that best supports their application and ATI Stream technology from AMD gives them exactly that. Thx. for listening! Best, Gary
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
So far Havok is not getting off the ground, so this may be the start of AMD adopting not only CUDA but PhysX as well.

With more games supporting it, they may not have much choice.

I really can't see any other reason for AMD to adopt CUDA.

Although there are something like 100 applications that now support CUDA.
http://www.nvidia.com/object/cuda_home.html#
 

Kuzi

Senior member
Sep 16, 2007
572
0
0
With OpenCL 1.0 already released, Havok physics GPU acceleration will start picking up in the near future, especially when Intel releases Larrabee.

CUDA will reach a dead end. It reminds me of the 3DFX days with their Glide API, was good for a little while, until standard APIs like Direct3D and OpenGL caught up. Nvidia will have no choice but to dump their proprietary CUDA and move on with everyone else.

It also reminds me of the first Nvidia 3D chip, the NV1, and its use of "proprietary quadratic texture mapping". It had pretty good 3D performance for the time; I had one of those Diamond Edge cards. But...

"Although the NV1 was technologically superior to other chips of that era from a number of perspectives, the proprietary quadratic texture mapping of the NV1 was its death sentence. When Microsoft finalized Direct3D not too long after the NV1 had reached store shelves, polygons had been chosen as the standard primitive, and despite NVIDIA's and Diamond's best efforts, developers were no longer willing to develop for the NV1."

This is from an article at Firing Squad.

OpenCL is already finalized, and I see a repeat in history again.
 

reallyscrued

Platinum Member
Jul 28, 2004
2,618
5
81
Originally posted by: Kuzi

OpenCL is already finalized, and I see a repeat in history again.

Yay for ATI users! Does that mean I will be able to play Cryostasis on my 4890 faster than 11 fps?
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: Kuzi


OpenCL is already finalized, and I see a repeat in history again.

OpenGL has been finalized for several years and yet most games use the proprietary DirectX instead. So I guess it depends on which history you look at.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Originally posted by: reallyscrued
Originally posted by: Kuzi

OpenCL is already finalized, and I see a repeat in history again.

Yay for ATI users! Does that mean I will be able to play Cryostasis on my 4890 faster than 11 fps?

Depends on how ATI hardware handles OpenCL.
 

nismotigerwvu

Golden Member
May 13, 2004
1,568
33
91
Originally posted by: Wreckage
Originally posted by: Kuzi


OpenCL is already finalized, and I see a repeat in history again.

OpenGL has been finalized for several years and yet most games use the proprietary DirectX instead. So I guess it depends on which history you look at.

How about we look at the history of vendor-specific versus vendor-neutral APIs? OpenGL/DirectX all but eliminated Glide/RRedline/MSI, etc.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: nismotigerwvu
Originally posted by: Wreckage
Originally posted by: Kuzi


OpenCL is already finalized, and I see a repeat in history again.

OpenGL has been finalized for several years and yet most games use the proprietary DirectX instead. So I guess it depends on which history you look at.

How about we look at the history of vendor-specific versus vendor-neutral APIs? OpenGL/DirectX all but eliminated Glide/RRedline/MSI, etc.

The thing is PhysX has zero competition right now on the GPU front. This may hold true for a long time.

Not to mention that Havok is also proprietary.

As for OpenCL it will run slower than CUDA on NVIDIA cards which right now hold the majority of the market. Although the same can be said for Brook+
 

Kuzi

Senior member
Sep 16, 2007
572
0
0
Originally posted by: Wreckage
Originally posted by: Kuzi

OpenCL is already finalized, and I see a repeat in history again.

OpenGL has been finalized for several years and yet most games use the proprietary DirectX instead. So I guess it depends on which history you look at.

OpenGL is more than a decade old, and for some time it was more efficient than DirectX; many developers preferred it. But DirectX caught up and is arguably better now, so almost everyone uses it instead of OpenGL.

Video card vendors support OpenGL and DirectX, so a developer doesn't need to worry about their program/game not running on certain hardware, whether they use DirectX or OpenGL.

CUDA is different because the developer knows if they develop for CUDA, it will only run on NV hardware. If they develop for DirectX, OpenGL, or OpenCL, it will run on NV, ATI and Intel hardware.

As to which hardware will be better for OpenCL, we'll have to wait and see. I think Larrabee will do better at certain stuff, NV better at others.

Originally posted by: Wreckage
The thing is PhysX has zero competition right now on the GPU front. This may hold true for a long time.

Not to mention that Havok is also proprietary.

As for OpenCL it will run slower than CUDA on NVIDIA cards which right now hold the majority of the market. Although the same can be said for Brook+

OpenCL programming will be used to accelerate Havok Physics, and it runs on all hardware, not just NV hardware.

Yes, OpenCL may run slower than CUDA, but Nvidia will have no choice; it's the developers who choose which API to support. Maybe certain software can have both a CUDA path and an OpenCL path, like in the past when certain software had both OpenGL and DirectX paths.

Direct3D ran slower than Glide on 3DFX cards, but in the end all Glide support was dropped in favor of DirectX.

IIRC, 3DFX had the majority of the market at the time too, and that didn't make any difference.
 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
OpenGL has been finalized for several years and yet most games use the proprietary DirectX instead
OpenGL and DirectX are both "open" in the sense that any company can adopt them. OpenCL and DirectX 11 are also open in that same regard. CUDA/PhysX is the odd man out; it only works on 1 company's hardware. CUDA is a lot closer to Glide than it is to DirectX, and that's exactly why it will go away.

I'm going to bet that the winning standard will be DirectX's GPGPU. Getting third-party support is one of those things Microsoft is really good at doing. They pulled DirectX out of their asses and it managed to overtake Glide and OpenGL. Even professional software like AutoCAD and Photoshop now supports DirectX. I have no doubt in my mind that they'll be just as successful with DirectX GPGPU.
 

Kuzi

Senior member
Sep 16, 2007
572
0
0
I do not know the capabilities of DirectX 11 in terms of parallel computing, but I agree it will probably take over in the end (DX12, DX13, etc.). OpenCL would still live on in Apple systems (like OpenGL), because Mac OS X obviously doesn't have DirectX but does support OpenCL.
 

AyashiKaibutsu

Diamond Member
Jan 24, 2004
9,306
4
81
Originally posted by: ShawnD1
OpenGL has been finalized for several years and yet most games use the proprietary DirectX instead
OpenGL and DirectX are both "open" in the sense that any company can adopt them. OpenCL and DirectX 11 are also open in that same regard. CUDA/PhysX is the odd man out; it only works on 1 company's hardware. CUDA is a lot closer to Glide than it is to DirectX, and that's exactly why it will go away.

CUDA/PhysX is "open" in the sense that any company can adopt it too. ATI chose not to adopt it because they know they're far behind in GPGPU and would underperform compared to NVIDIA. They're basically holding the industry hostage while they play catch-up.
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
I am really hoping that OpenCL becomes the standard; it is just better for everyone that way. I am also hoping OpenGL 3.1 will start making people who switched to DirectX take another look. Again, it would be better for everyone.


If you put the three open APIs together (OpenCL for physics, OpenGL for graphics, OpenAL for sound), you have a winning combination that would work on every platform and CPU/GPU, from consoles to PC to OS X.
 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
Originally posted by: AyashiKaibutsu
Originally posted by: ShawnD1
OpenGL has been finalized for several years and yet most games use the proprietary DirectX instead
OpenGL and DirectX are both "open" in the sense that any company can adopt them. OpenCL and DirectX 11 are also open in that same regard. CUDA/PhysX is the odd man out; it only works on 1 company's hardware. CUDA is a lot closer to Glide than it is to DirectX, and that's exactly why it will go away.

CUDA/PhysX is "open" in the sense that any company can adopt it too. ATI chose not to adopt it because they know they're far behind in GPGPU and would underperform compared to NVIDIA. They're basically holding the industry hostage while they play catch-up.

While ATI could adopt CUDA, licensing fees are involved. What's to stop Nvidia from changing the fees once a CUDA market has been established? Microsoft won't do that since they want developers to stick with their API for various reasons (and DirectX is currently free). SGI's website says to contact Microsoft about OpenGL licenses, so I assume that means SGI has little to no control over the licensing fees (Microsoft has them by the balls?). Nvidia, in contrast to Microsoft's DirectX and SGI's OpenGL, offers licensing with not much assurance that they won't screw you on the cost.

As for being behind in GPGPU, how did you come to that conclusion? If I'm not mistaken, ATI's strong point in most games is shaders. ATI puts a lot of work into GPU shaders and they've been ahead of Nvidia for a long time because of it. As an example of this, F@H has had an ATI client since the Radeon X1*** series, but Nvidia only gained support since the GeForce 8 series. How long is that gap between ATI having working shaders and Nvidia having working shaders? Like 2 years?
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Originally posted by: Wreckage

OpenGL has been finalized for several years and yet most games use the proprietary DirectX instead.
Uh, what? OpenGL is an evolving standard and it's no more finalized than Direct3D is.

Not only is the core spec constantly moving forward, but IHVs implement their own extensions as new features are made available due to new hardware.

OpenGL will never be finalized unless development of the API itself freezes.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: BFG10K
Originally posted by: Wreckage

OpenGL has been finalized for several years and yet most games use the proprietary DirectX instead.
Uh, what? OpenGL is an evolving standard and it's no more finalized than Direct3D is.

Not only is the core spec constantly moving forward, but IHVs implement their own extensions as new features are made available due to new hardware.

OpenGL will never be finalized unless development of the API itself freezes.

"Finalized" in the same way that OpenCL is. In other words not in beta. :roll:
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: ShawnD1

Microsoft won't do that since they want developers to stick with their API for various reasons (and DirectX is currently free).

As long as you run it on Windows, which makes it closed. They don't offer a version for Linux/Mac.

It's funny how people are blind to this.

Now one could argue that because Microsoft has a majority of the market it's not a big deal. However, NVIDIA also has a majority of the market so the same argument could be made.
 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
Originally posted by: Wreckage
Now one could argue that because Microsoft has a majority of the market it's not a big deal. However, NVIDIA also has a majority of the market so the same argument could be made.

Microsoft has literally 100% of the games market while Nvidia only has 66%. Saying that DirectX is restrictive in some way is like saying that speaking English restricts me to only speak to 100% of the people in my country.

Or developers could pick CUDA because it allows them to port Crysis to Apple computers that have integrated graphics and wouldn't be able to run the game anyway. Or maybe they'll make a Linux client that will be too unpopular for retail stores to carry and won't be available on Steam because Steam is a Windows-only download service. Come on, let's get serious.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: ShawnD1


Microsoft has literally 100% of the games market while Nvidia only has 66%. Saying that DirectX is restrictive in some way is like saying that speaking English restricts me to only speak to 100% of the people in my country.
You can game on a Mac and Linux or even a Wii or PS3.

Or developers could pick CUDA because it allows them to port Crysis to Apple computers that have integrated graphics and wouldn't be able to run the game anyway. Or maybe they'll make a Linux client that will be too unpopular for retail stores to carry and won't be available on Steam because Steam is a Windows-only download service. Come on, let's get serious.
You are basically saying it's OK for Microsoft to have monopoly, but it's bad if CUDA does the exact same thing. :confused:

Hypocrisy.


The thing is CUDA was offered up to ATI. I don't see Microsoft offering it up to Linux or even Mac. So really you are in favor of the greater of 2 evils.
 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
Originally posted by: Wreckage
Originally posted by: ShawnD1

Microsoft has literally 100% of the games market while Nvidia only has 66%. Saying that DirectX is restrictive in some way is like saying that speaking English restricts me to only speak to 100% of the people in my country.
You can game on a Mac and Linux or even a Wii or PS3.
Mac? You mean it's actually possible to run Crysis on GeForce 9400M integrated graphics? It doesn't matter which GPGPU API you use on the Mac; it'll suck either way. Linux doesn't have any games either: Best Buy's website has 0 Linux products, and Walmart's website has 18 Linux products, all of them books. The Wii falls under the same category as the Mac; the hardware is too slow to even think about CUDA or GPGPU. The PS3 also has the Mac syndrome of slow hardware: you can't run CUDA on a GeForce 7 (the PS3 uses a GeForce 7). CUDA needs a GeForce 8 or better.

As you can see, the Windows PC is the only platform with games and GPGPU potential. You could write your entire game in DirectX video, sound, and GPGPU without really limiting yourself in any way. Porting it to Mac is not a concern since the game won't run fast enough, Linux is not a concern because nobody will buy it anyway, PS3 is so complicated that you would need to rewrite the whole damn game even if you used OpenGL to begin with, and porting it to Xbox 360 just means leaving out the GPGPU stuff while keeping the video and sound the same.

A somewhat appropriate analogy would be like saying that my car limits me to only driving where there are roads. Yes it's true that much of the world does not have roads, but I live and spend time in places that do have roads, so that doesn't matter. DirectX GPGPU limits you to Windows only, but Windows is the only one that has any games or strong commercial support, so who cares? If Linux had commercial support from companies like Autodesk and Adobe, one could make a strong argument for OpenCL or even CUDA. It would also help if Apple computers had the proper hardware to run any game made in the past 20 years. Unfortunately neither of those are true, so there's really nothing to lose by choosing to write your software with DirectX.



Or developers could pick CUDA because it allows them to port Crysis to Apple computers that have integrated graphics and wouldn't be able to run the game anyway. Or maybe they'll make a Linux client that will be too unpopular for retail stores to carry and won't be available on Steam because Steam is a Windows-only download service. Come on, let's get serious.
You are basically saying it's OK for Microsoft to have monopoly, but it's bad if CUDA does the exact same thing. :confused:

Hypocrisy.


The thing is CUDA was offered up to ATI. I don't see Microsoft offering it up to Linux or even Mac. So really you are in favor of the greater of 2 evils.
Bitching about a free API having a monopoly is nonsense. Do you really think programmers are strong armed into writing their games in DirectX? Does Microsoft have men in black suits threatening to kill people if they use OpenGL, OpenCL, or CUDA over DirectX? There's a reason VLC media player runs faster in DirectX than it does in OpenGL. There's a reason Google Earth is faster in DirectX than OpenGL. There's a reason World of Warcraft renders things incorrectly in OpenGL. In a Windows environment, DirectX is a better API. Those programmers can start using OpenGL or some other cross platform language any time they want, but they choose not to.

CUDA is a different matter. Depending on licensing from a rival company when other free alternatives are on the table (DirectX, OpenCL) is just crazy. Nobody does that. This would be on the same level as Microsoft making PS3 exclusive games.
"Buy our product even though it directly supports our competitors!"
 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
Originally posted by: Wreckage
Originally posted by: ShawnD1

Mac? You mean it's actually possible to run Crysis on GeForce 9400M integrated graphics?

http://store.apple.com/us/product/TW387ZM/A

You clearly don't know what you're talking about, so I will just ignore you.

I love how you didn't even read your own link.
"System Requirements: Mac Pro"

I need a $3000 computer before I can buy a $450 video card on a platform that makes up 5% of the computer market? Developers must be idiots if they don't jump on the opportunity to develop for that. When I said it was a total waste of time to care about Mac compatibility, I was dead wrong.