Is ATI clueless about GPU computing?

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
I am not a person who cares who makes my hardware; I go for what does the most for the least money. A few months ago I went ahead and got a few ATI cards because I needed to do some work with tessellation. Now that the work is over I need to move on to other things, and one of those is using some applications on several systems that implement GPU computing. I have waited patiently for ATI to catch up to Nvidia here. They keep making promises, but I am not seeing anything come to fruition. So today I ended up having to pull 6 ATI cards and replace them with Nvidia cards that support the applications. I held off on using some of these applications; I had hoped ATI would step up and show they cared about the professional market, but it seems that is not the case. I just couldn't wait any longer for ATI. There are too many current applications using CUDA that I can benefit from, rather than waiting for something from ATI that may never exist.

Now I know some will say ATI has the Stream SDK and it is the programmers' fault for not using it. I thought that myself at first, until I started asking around about why the programs were not using ATI. The reason I got over and over was support. Nvidia has made it a priority to help developers use CUDA, where ATI often lets emails and support requests go unanswered. I really don't think they get it. When you have industry leaders like ILM supporting CUDA but not ATI or OpenCL, that should be a wake-up call. At SIGGRAPH last week Nvidia dominated ATI with CUDA. ATI had nothing of interest to show.

Some of the big CUDA apps:

V-Ray RT
The emphasis this year will be on V-Ray RT for GPUs. Developed on the basis of V-Ray RT, this technology inherits the great interactivity and flexibility of the RT engine and brings it to the GPU to achieve a 15-20 times faster rendering process and lighting and shading setup.

V-Ray RT on GPUs will be presented on the new 3DBOXX Extreme Edition with 4 GPUs, provided with the kind support of BOXX Technologies. (Those 4 GPUs are Nvidia cards.)
Octane Render
http://www.refractivesoftware.com/

iray
http://www.mentalimages.com/products/iray

shatter
http://www.nshatter.com/index.html

Arion
http://randomcontrol.com/arion

Indigo renderer
http://www.indigorenderer.com/

pflow
http://www.zhangy.com/main/blog/id/20

3dcoat
http://www.3d-coat.com/3d-coat-33/

bullet
http://code.google.com/p/bullet/

Adobe has the Mercury engine in Premiere and After Effects.
Sapphire 5 uses NVIDIA's CUDA system for GPU acceleration.

nuke
http://www.thefoundry.co.uk/

ILM sums it up nicely:
ILM moved forward using CUDA to develop Plume, which is both a 3D fluid simulator/solver and a rendering application. They mostly use the Nvidia Quadro FX 5800 pro graphics cards because of the 4GB of memory. The technology is deployed both to individual artists' workstations and a GPU-based render farm.

One of the things you can quantify, continues Maury, is that since CUDA and the Nvidia Quadro pro graphics cards are being used as a general-purpose 3D solution, it is possible to compare the GPU to the CPU technology. Previously, “We'd have one to two iterations a day, maximum. With this solution we saw a ten-fold speed increase, and that was just on the simulation side, not even the rendering side. We went from these overnight simulation runs to having four, five, maybe six different versions a day of the same shot to show. One of the best quotes I heard from a supervisor was, ‘Hey, just do another version. Show me something else, something different.’ That was something we could not do before.”

“Using traditional CPU-based rendering technology,” says Spina, “you typically get one opportunity to render a full resolution simulation, because it takes so long. Thus, historically, it has stripped all the creativity away, because production schedules are so tight. Now, a single frame that used to take a week can be rendered in a single hour. Even pre-visualization scenes that traditionally are low-resolution are now closer to final renders because they can get that quality now.”
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,550
136
Sadly developer support is probably ATI's weakest area. You see it with Linux support. You see it with them on GPGPU. You see it on physics support.

While I personally believe that none of these areas I mentioned above are at a stage where the average consumer, even the hardcore gamer, should care about them, it is nevertheless a fact that programmers and developers such as yourself will find nVidia better/easier to work with.
 

jvroig

Platinum Member
Nov 4, 2009
2,394
1
81
It's no secret that nVidia has more resources than ATi. ATi being the graphics arm of AMD, it's not like they have the entire resources of AMD at their disposal (and AMD the CPU company has its own things to fund).

NV has performed great for the past few years, and they've been rewarded with a lot of marketshare and cash to show for it, which they use not only for PC gaming vidcard R&D but for fanning out to other areas as well, which includes strong dev relations, GPGPU, PhysX/buying Ageia, and Tegra.

ATi doesn't have those resources. With limited money to spend, the best course of action is to spend it where it counts (that is, where ROI currently seems greatest). That's what they are doing: focusing on video cards for gaming.

So it's not that ATi hates GPGPU. I know it doesn't change anything about your work, and knowing this won't make your ATi card suddenly develop software tools by itself and reach GPGPU feature parity with nV cards, but this is the answer to your question, "Is ATI clueless about GPU computing?"

They can only invest so much in it. The result is second-class to nVidia. ATi would love to offer you the exact same thing nV has, but with their limited resources, they have other priorities that take precedence. It's not that they are clueless, or they don't think there's a market. Their bean counters have done the math and the executives made a decision that, since ROI here isn't enough to invest the money needed to rival nV's offerings, they can only do what they are currently doing, despite being outclassed by nVidia as far as GPU computing is concerned.
 

jvroig

Platinum Member
Nov 4, 2009
2,394
1
81
Sadly developer support is probably ATI's weakest area. You see it with Linux support.
Yes, I definitely see it.

For basic 2D needs, it works well enough. Right now I'm typing this post on a Linux machine using FF, running with an ATi card. But if you actually want a modern ATI card (like my 4770) to do anything more (3D, etc.), it's not as painless as with GeForce cards (like my previous card, an 8600GTS).
 

coolVariable

Diamond Member
May 18, 2001
3,724
0
76
Pfft. You even see it with their Windows drivers, which are worthless!!!
Really? They still can't figure out how to program a driver so it doesn't cause issues with HDMI when waking from sleep?
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,550
136
Pfft. You even see it with their windows drivers which are worthless!!!
Really? They still can't figure out how to program a driver so it doesn't cause issues with HDMI when waking from sleep?

Actually, with Windows they are fairly even with nVidia as far as gaming goes. I don't know about the HDMI issues as I don't use HDMI. But worthless? Wow. Inflammatory. I can point to worse issues from nVidia drivers.

ATI's issues are mainly in budding areas or niche areas like Linux and GPGPU. They do have issues here and there with mainstream usage but they're largely equal with nVidia on that front.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
There are many factors. One is their hardware, which is not as well equipped for GPGPU work. Two is focus: first they were promoting Brook, then OpenCL, and now DirectCompute. Their lack of focus diluted their program. Three is most likely limited resources to spend on the project; AMD is a CPU company first and that is their priority.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Sadly developer support is probably ATI's weakest area. You see it with Linux support. You see it with them on GPGPU. You see it on physics support.

It is their corporate culture. You see it with AMD, period. Compiler support for AMD CPUs? They just sit back and hope the DOJ will force Intel to optimize their own compiler for AMD's chips.

ATI isn't the outlier; it's the whole damn company that has always had this weird "build it and they will come" 'tude towards compiler/software/developer support (and marketing, for that matter).
 
Mar 11, 2004
23,444
5,846
146
Seems to me that ATi isn't especially interested in that market right now, and so their support is lagging. I'm sure there's resource issues as well, and they seem to have their focus in the consumer space at the moment.

I'm curious if and how much things change with some of the forthcoming tech, namely integrating the GPU with the CPU. It would seem that they would need to have solid support in order for that to give them the performance edge that it should lead to. Then again, maybe they're relying on their superior GPU processing capability to give them the edge versus Intel's offerings.

I would think, though, that since AMD's pro line of CPUs seems to be a pretty big part of their business, they will at some point put some extra focus on mixing CPU and GPU processing to take advantage of the strengths of each. Right now, they might view it as competing with themselves and taking away from a major market if they were to introduce extensive GPU compute support.

My guess is, they're playing it cautiously for now, and possibly letting the market develop a lot more, and with the products they have coming in the future they will almost certainly be taking aim directly at this issue.

The last thing I wonder is, maybe AMD is considering a different approach. Since it would make sense for them to try to offer a more complete solution, they could be aiming to pair Opteron with GPU hardware that is more focused on that market, and this way they'll separate the professional and consumer sides. This could allow each one to be more flexible in meeting the needs of either market. I would say it's highly doubtful, since they're essentially moving in the complete opposite direction (integration).

Of course all of that is conjecture on my part, I don't have any direct insight to offer.
 

Ben90

Platinum Member
Jun 14, 2009
2,866
3
0
Actually, with Windows they are fairly even with nVidia as far as gaming goes. I don't know about the HDMI issues as I don't use HDMI. But worthless? Wow. Inflammatory. I can point to worse issues from nVidia drivers.

ATI's issues are mainly in budding areas or niche areas like Linux and GPGPU. They do have issues here and there with mainstream usage but they're largely equal with nVidia on that front.
Both sides have their issues, but in my experience the ATi driver is far, far behind Nvidia's.

I've actually downgraded from a HD 4890 to a 8800GT and haven't looked back. Due to the driver I actually display more FPS from the older, less powerful 8800GT than the 4890. I will actually repeat that since I think it's kinda funny and might make someone mad, even though it's 100% true and I have the benchmarks to back it up.

Under my settings, a 8800GT 512 displays more FPS than ANY ATi card.

As for the topic, yes they could use a little more work, such as actually releasing a runtime, but jvroig brought up a great point, which is that ATi doesn't have the cash flow Nvidia has.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
It is their corporate culture. You see it with AMD period. Compiler support for AMD cpus? They just sit back and hope the DOJ will force Intel to optimize their own compiler for AMD's chips.

ATI isn't the outlier, its the whole damn company that has always had this weird "build it and they will come" 'tude towards compiler/software/developer support (and marketing for that matter).

This goes back to the Opteron and K7 days. They relied on 3rd-party chipsets for their server-class CPU. Just amazing how they hope 3rd parties will pick up their slack.
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
Nvidia should be commended; they simply offer more than just their hardware. No matter what you think of 3D/PhysX/CUDA (and Ambient Occlusion in the driver, often left out), the fact that it's there and Nvidia gives you the choice to use it is better than no choice at all; in most circumstances it leads to a more compelling gaming experience. For example, the fact that I can still use my GTS250 as a PhysX / folding / CUDA card when I upgrade extends the life and value of the card to me as the consumer.

I have found ATI drivers to be fine at what they do, but there's just more that could be done. Per-game profiles should be in by now, as should something 'extra'; I noted Ambient Occlusion before, and that is another example of doing more than just games.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Both sides have their issues, but in my experience the ATi driver is far, far behind Nvidia's.

I've actually downgraded from a HD 4890 to a 8800GT and haven't looked back. Due to the driver I actually display more FPS from the older, less powerful 8800GT than the 4890. I will actually repeat that since I think it's kinda funny and might make someone mad, even though it's 100% true and I have the benchmarks to back it up.

Under my settings, a 8800GT 512 displays more FPS than ANY ATi card.

As for the topic, yes they could use a little more work, such as actually releasing a runtime, but jvroig brought up a great point, which is that ATi doesn't have the cash flow Nvidia has.

I hope that you mean on Linux.
 

coolVariable

Diamond Member
May 18, 2001
3,724
0
76
Actually with Windows they are fairly even with nVidia as far as gaming goes. I don't know about the HDMI issues as I don't use HDMI. But worthless? Wow. Flammatory. I can point to worse issues from nVidia drivers.

ATI's issues are mainly in budding areas or niche areas like Linux and GPGPU. They do have issues here and there with mainstream usage but they're largely equal with nVidia on that front.

Sorry, but a GRAPHICS card company that has problems with a 7-year-old standard (HDMI, specifically the EDID portion) SUCKS in my book.
Nothing inflammatory about that.
 

Ben90

Platinum Member
Jun 14, 2009
2,866
3
0
I hope that you mean on Linux.
Nope, under Windows. I purposely worded that a little weirdly. The maximum refresh rate an ATi card will allow on my monitor is 120Hz. By changing things like the front porch I am able to "overclock" my monitor up to 140Hz using Nvidia drivers, which is not possible with ATi.

While the 4890 undoubtedly renders at a higher framerate, the 8800GT is ultimately displaying more frames per second.
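For anyone curious, the arithmetic behind that trick is simple: refresh rate is the pixel clock divided by the total pixels per frame (visible plus blanking), so trimming the front porch and other blanking intervals raises the refresh rate at the same pixel clock. A rough sketch with made-up timing numbers (not taken from any real modeline or EDID):

```python
# Refresh rate = pixel clock / (horizontal total * vertical total).
# The timing values below are illustrative only, not from a real monitor.

def refresh_hz(pixel_clock_hz, h_total, v_total):
    """Refresh rate implied by a video mode's pixel clock and totals."""
    return pixel_clock_hz / (h_total * v_total)

# A hypothetical mode with generous blanking:
stock = refresh_hz(250_000_000, 1840, 1080)   # ~125.8 Hz
# Shrinking the porches reduces the totals, so the same pixel clock
# sweeps out more frames per second:
tight = refresh_hz(250_000_000, 1760, 1066)   # ~133.3 Hz

print(f"{stock:.1f} Hz -> {tight:.1f} Hz")
```

Whether a driver lets you enter such custom timings at all is exactly the difference being described here.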
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Nope, under Windows. I purposely worded that a little weirdly. The maximum refresh rate an ATi card will allow on my monitor is 120Hz. By changing things like the front porch I am able to "overclock" my monitor up to 140Hz using Nvidia drivers, which is not possible with ATi.

While the 4890 undoubtedly renders at a higher framerate, the 8800GT is ultimately displaying more frames per second.

Yeah, I understand, but that's some serious downgrade. Why didn't you get a GTX 260 instead?
 

Chiropteran

Diamond Member
Nov 14, 2003
9,811
110
106
Nope, under Windows. I purposely worded that a little weirdly. The maximum refresh rate an ATi card will allow on my monitor is 120Hz. By changing things like the front porch I am able to "overclock" my monitor up to 140Hz using Nvidia drivers, which is not possible with ATi.

While the 4890 undoubtedly renders at a higher framerate, the 8800GT is ultimately displaying more frames per second.

Nonsense. On the ATI card you can attach 3 monitors, while I'm fairly sure the 8800 GT only supports a maximum of 2. 2 × 140 = 280 "frames" per second, while 3 monitors at 120 on ATI would be 360 "frames".

Totally useless info beside the point, I just wanted to point out your error ;)


Also, on the subject of whether ATI or nVidia has less buggy drivers, there is a ton of speculation and personal experience being related here, but no good hard facts or statistics. Here is a valid statistic:
http://www.downloadsquad.com/2008/03/28/29-of-windows-vista-crashes-caused-by-nvidia-drivers/
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Yup, as a developer I've had some pretty poor experiences with ATi's devrel.

ATi just doesn't seem to have the resources to get GPGPU going.
I'll just point out the Catalyst Release notes again:
OpenCL™ 1.0 conformance tests have not been run with this version of the ATI Catalyst™ Driver Suite in conjunction with the ATI Stream SDK v2.2. If you require an OpenCL™ 1.0 conformant driver, we recommend that you install the ATI Catalyst™ 10.5 Driver Suite.

So they didn't even bother to test whether this driver works properly with the Stream SDK, and there are no guarantees on OpenCL.
Okay, that by itself is bad enough... but what REALLY pisses me off is that instead of hiring more developers/testers and getting their software department ready for prime-time GPGPU support, ATi invests in marketing and spreads the following lies:

Our promise to nurture open industry standards
PC game developers have a right to the best possible gaming platform. AMD’s Gaming Evolved program confirms our promise to deliver the most innovative technologies, tools, and industry support to maintain the PC platform as world’s premier gaming environment. We will participate in the development and cultivation of OpenCL and OpenGL industry standards, and we will move quickly to move our innovations into the industry standards whenever feasible.

I'm just fed up with AMD. They've been hacking away at nVidia for their 'proprietary' Cuda/PhysX software for years now, and they kept promising us Stream, OpenCL, Havok, Bullet etc... and what do we have today? A set of untested OpenCL drivers, no apps, no physics, no nothing.

In the meantime, nVidia has been improving Cuda a lot, has supported developers very well, and also supports OpenCL, Bullet and whatnot. With nVidia you can have your cake and eat it too.
 

brybir

Senior member
Jun 18, 2009
241
0
0
Yup, as a developer I've had some pretty poor experiences with ATi's devrel.

ATi just doesn't seem to have the resources to get GPGPU going.
I'll just point out the Catalyst Release notes again:


So they didn't bother to even test if this driver works properly with the Stream SDK, and no guarantees on OpenCL.
Okay, that by itself is bad enough... but what REALLY pisses me off... instead of ATi hiring more developers/testers and getting their software department ready for prime-time GPGPU support... they invest in marketing and spread the following lies:



I'm just fed up with AMD. They've been hacking away at nVidia for their 'proprietary' Cuda/PhysX software for years now, and they kept promising us Stream, OpenCL, Havok, Bullet etc... and what do we have today? A set of untested OpenCL drivers, no apps, no physics, no nothing.

In the meantime, nVidia has been improving Cuda a lot, has supported developers very well, and also supports OpenCL, Bullet and whatnot. With nVidia you can have your cake and eat it too.

I think, as others have pointed out, that ATI is taking a cautious approach to HPC/GPGPU computing. As it stands there are many competing options for developers, e.g. CUDA, OpenCL, various physics programs, etc. I would imagine their thinking is that once MS gets behind DirectCompute and starts pushing it, it will carry industry momentum and the other options like CUDA will become interesting Wikipedia articles.

I do not know what will happen. GPGPU is still in its infancy, and soon you will be able to buy a CPU+GPU combo that may have some interesting applications in render and server farms, once software that actively exploits the advantages of a GPU starts to appear for even mainstream server workloads.

If MS pulls off DirectCompute well and makes its integration with DirectX and MS APIs easy for developers, it is almost certain to gain significant traction. My guess, again, is that ATI started down the Stream and OpenCL path, but knows what MS has coming in the future and is going to get behind that.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
I would imagine their thinking would be that once MS gets behind and starts releasing DirectCompute, they expect that it will carry industry momentum and the other options like CUDA will become interesting Wikipedia articles.

DirectCompute was released as part of DX11 many months ago.

GPGPU is not in its infancy. It's been around for quite a few years, and GPUs can now process pretty much standard C++. I think it's quite mature.
Problem is, there's only one company currently supporting it: nVidia.
OpenCL and DC are in their infancy, which is probably why developers are using Cuda instead (e.g. Adobe).
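For readers who haven't touched any of these APIs: Cuda, OpenCL, and DirectCompute all expose the same basic model, bulk data-parallel arithmetic over large arrays. A minimal sketch of the classic SAXPY kernel, written here as a plain Python loop purely as a CPU stand-in (the real GPU versions require a vendor runtime):

```python
# The classic SAXPY kernel (y = a*x + y) written as a plain Python loop.
# In Cuda/OpenCL/DirectCompute, each iteration of this loop would instead
# be assigned to its own GPU hardware thread, which is where the speedups
# in apps like V-Ray RT and Plume come from.

def saxpy(a, x, y):
    """Return a*x + y elementwise for two equal-length sequences."""
    return [a * xi + yi for xi, yi in zip(x, y)]

print(saxpy(2.0, [0.0, 1.0, 2.0, 3.0], [1.0, 1.0, 1.0, 1.0]))
# [1.0, 3.0, 5.0, 7.0]
```

The APIs differ mainly in how much host-side setup they demand around a kernel like this, which is what the maturity argument in this thread is about.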
 

Cogman

Lifer
Sep 19, 2000
10,284
138
106
DirectCompute was released as part of DX11 many months ago.

GPGPU is not in its infancy. It's been around for quite a few years, and GPUs can now process pretty much standard C++. I think it's quite mature.
Problem is, there's only one company currently supporting it: nVidia.
OpenCL and DC are in their infancy. Which is probably why developers are using Cuda instead (eg Adobe).

Well, standardized GPGPU computing is in its infancy. There have been a couple of vendor-specific GPGPU languages in the past; CUDA is by far the one that has become the most popular.

I have hope for OpenCL. If we are lucky, it will take off for both vendors.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
I have hope for OpenCL. If we are lucky, it will take off for both vendors.

Problem with OpenCL is that it's barely as powerful as the initial Cuda release.
DirectCompute is possibly even worse.
They're still very basic C-ish languages (DC especially is pretty much standard HLSL), requiring a lot of special attention from the developer (lots of API calls to set up a context, compile objects, etc.).
If OpenCL evolves the way OpenGL has, then I don't think it will ever get to the point where Cuda is today.
 

PingviN

Golden Member
Nov 3, 2009
1,848
13
81
It's not that strange that AMD won't pour resources into GPGPU. As a company not doing too well financially, it would be a pretty daft move to invest in a field where there is tough competition. I'm sure it's not ignorance on AMD's part; it's just that everything costs money, and as a company bleeding dollar bills... well, it's not an odd call to keep on doing what they do best. Given time, AMD will - of course - adapt to the development and work more on GPGPU, but not as long as their financial situation is as it is.

It's all about economics, really.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
It's not that strange that AMD won't pour resources into GPGPU. As a company not doing too well financially, it would be a pretty daft move to invest in a field where there is tough competition.

This part I would agree with... but the thing is that AMD itself might not be totally convinced of that.
AMD marketing has been boasting about the GPGPU features of their GPUs for quite a while now, and with their upcoming Fusion products, it would appear that they again want to market the GPGPU side of things.