CUDA nVidia Exclusive?

tjcinnamon

Member
Aug 17, 2006
I am seriously considering picking up a GTX 295 sometime soon. I don't do a ton of gaming but I do a fair amount of video editing and multimedia.

Therefore CUDA-enabled technology is important to me.

I was looking at the HD4870s in a CrossFire setup and they are a pretty powerful combo, although from what I have seen they are still worse than the GTX 295.

One thing I noticed in the specs of the HD4870 is the number of stream processors it has (700 per card). This is much, much more than the GTX 295. From what I understand about parallel processing, these stream processors do the actual work, so the more stream processors the better.

So my question is: do you think ATI cards will be able to take advantage of CUDA (even though it's proprietary)? I know they could use OpenCL, but CUDA seems like it's going to be the most used API.

Are there drivers out there that make Radeons "CUDA enabled"? I think I saw some modded drivers that allow ATI cards to use PhysX, but I could be mistaken.

Any thoughts? Thanks,
JOe K.
 

Klinky1984

Member
Nov 21, 2007
Why is CUDA important to you for multimedia? Are you going to be programming your own multimedia software? I am not personally aware of any worthwhile CUDA video editing apps (there is Badaboom, but that's conversion, not editing, and it has its own limitations).

CUDA is unique to nVidia. OpenCL is coming around; however, the bottom line is that it's too early in the game to really say which is going to be the dominant standard. CUDA is "popular" in the dev crowd, but hardly mainstream.

The stream processors used on the HD4850/4870 aren't quite the same as what you'd find on the GTXs, so you can't really compare them directly.

Perhaps you need to come up with an actual program you're going to use CUDA for and decide whether the benefits are worth it (say it's 2-3x as fast using CUDA); if so, then you'll have to grab an nVidia card. But I am not quite sure why you need CUDA for video editing.
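
For what it's worth, here's roughly what "CUDA-accelerated" means at the code level - a purely hypothetical per-pixel brightness filter of my own, not code from Badaboom or any shipping app, just to show the kind of embarrassingly parallel work these programs push onto the GPU:

#include <cuda_runtime.h>

// Each GPU thread brightens exactly one pixel of an 8-bit grayscale frame.
__global__ void brighten(unsigned char *frame, int numPixels, int delta)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < numPixels) {
        int v = frame[i] + delta;
        frame[i] = (unsigned char)(v < 0 ? 0 : (v > 255 ? 255 : v));
    }
}

// Host side: copy the frame over, launch thousands of threads at once,
// copy the result back. Any speedup comes from touching every pixel in parallel.
void brightenFrameOnGpu(unsigned char *hostFrame, int numPixels, int delta)
{
    unsigned char *devFrame;
    cudaMalloc((void **)&devFrame, numPixels);
    cudaMemcpy(devFrame, hostFrame, numPixels, cudaMemcpyHostToDevice);

    int threads = 256;
    int blocks = (numPixels + threads - 1) / threads;
    brighten<<<blocks, threads>>>(devFrame, numPixels, delta);

    cudaMemcpy(hostFrame, devFrame, numPixels, cudaMemcpyDeviceToHost);
    cudaFree(devFrame);
}

Whether that kind of offload matters for your workflow is exactly the question - filters like this fly on a GPU, but the rest of an editing app (timeline, decoding, I/O) doesn't magically get faster.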
 

Keysplayr

Elite Member
Jan 16, 2003
You can't go by shader count. ATI's shaders are vastly different from Nvidia's shaders; one architecture simply isn't comparable to the other.
To answer your question: no. It doesn't seem that CUDA on ATI cards will see any noteworthy form of support. F@H runs on ATI cards, but I'm not so sure that is CUDA. At any rate, the Nvidia architecture seems way more potent in almost anything besides gaming. ATI tried to launch "Stream" a few months back, but I haven't heard much on it lately. May have fizzled.

There are more applications than Badaboom for encoding/transcoding.

Here is a quick list:

· ArcSoft Total Media Theater - DVD player - CUDA-accelerated DVD upscaling... standard def to Blu-ray-like HD quality

· Badaboom - CUDA transcoder

· CoreCodec CoreAVC - CUDA-accelerated video decoder

· CyberLink PowerDirector 7 - Video Editing with CUDA-enabled filters and video encoding acceleration

· Folding@home - distributed computing application

· GPUGRID - distributed computing application

· MotionDSP vReveal - real-time video enhancement / auto-fixing of personal video content

· Nero MoveIt - CUDA transcoder announced at CeBIT, shipping in April

· Pegasys TMPGEnc - transcoder using CUDA-accelerated video effects

· SETI@home - distributed computing application

The CyberLink PowerDirector 7 line is the one of interest, as it states Video Editing.
 

taltamir

Lifer
Mar 21, 2004
NGOHQ is working on porting CUDA to ATI... at first they got some support, but then both companies got hostile and started sabotaging it. So if it does happen it will be a while... OpenCL will probably come first, and will actually be officially supported.

As was said, there is a limited number of CUDA-capable apps; make sure yours is on the list.

Also, as said, ATI's 700 shaders are not the same as Nvidia's shaders, so don't go by the count.
 

tommo123

Platinum Member
Sep 25, 2005
Originally posted by: taltamir
NGOHQ is working on porting CUDA to ATI... at first they got some support, but then both companies got hostile and started sabotaging it. So if it does happen it will be a while... OpenCL will probably come first, and will actually be officially supported.

As was said, there is a limited number of CUDA-capable apps; make sure yours is on the list.

Also, as said, ATI's 700 shaders are not the same as Nvidia's shaders, so don't go by the count.

I can't see why either would sabotage it. Well, I suppose I can see a pro and a con for ATI. Pro: supporting CUDA (with hacked drivers or whatever) might end up with more of their cards being sold, without the burden of officially supporting CUDA. Con: it helps CUDA take off.

For nVidia, though, I can only see a tiny con in helping ATI sell more cards. They really want CUDA to take off, and I think they wouldn't care about ATi sales as long as it means CUDA is mainstream by the time Intel arrives with Larrabee. If anything, nVidia seem nonchalant about Intel entering their world. AMD seemed cocky about their position above Intel with the X2 939, but then Conroe arrived. Well, we all know what happened in that story.

If I were nVidia, I'd try to make CUDA work on ATi cards - even if it meant doing it unofficially. Anything to gain more users before the giant enters the game.
 

SSChevy2001

Senior member
Jul 9, 2008
ATi has 900 shaders, of which only 800 are usable.

CyberLink PowerDirector 7 also supports ATi Stream.
 

tjcinnamon

Member
Aug 17, 2006
I would be using CUDA for vReveal and the RapiHD Accelerator for Adobe Premiere Pro. After Effects utilizes it as well, and future Adobe Creative Suites will use it.
 

nosfe

Senior member
Aug 8, 2007
CUDA is used in Adobe products only in a couple of filters, and it won't become a main feature because Apple is pushing OpenCL; since many users of Adobe products are Apple users, Adobe will most likely implement OpenCL over CUDA.
 

Klinky1984

Member
Nov 21, 2007
Originally posted by: tjcinnamon
Would OpenCL utilize the massive number of stream processors on the 4870?

Yes, I imagine it would, but remember they are not the same as nVidia's stream processors - they are a bit simpler, so you cannot compare them directly.

Programs will need to be re-programmed to work with OpenCL, and that is probably not going to happen for a long while. I would not base current-generation purchases on something that is probably a couple of generations away from getting off the ground. By that time you'll have a wider selection of choices, and there will be more data out there to make it easier to know what to purchase. Right now you're trying to look into a crystal ball, and by the time the future is here your card is going to be too old to take advantage of the new features.
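
To give a rough idea of what "re-programmed" actually means here, this is a hypothetical vector-add written against the CUDA runtime API, with the OpenCL side only sketched in comments (my own toy example, nothing from a real app):

#include <cuda_runtime.h>

// CUDA kernel: its own dialect (__global__, blockIdx, threadIdx).
__global__ void vecAdd(const float *a, const float *b, float *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

void addWithCuda(const float *a, const float *b, float *c, int n)
{
    // CUDA host code: allocate, copy, launch with <<< >>>, copy back.
    size_t bytes = n * sizeof(float);
    float *da, *db, *dc;
    cudaMalloc((void **)&da, bytes);
    cudaMalloc((void **)&db, bytes);
    cudaMalloc((void **)&dc, bytes);
    cudaMemcpy(da, a, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, b, bytes, cudaMemcpyHostToDevice);
    vecAdd<<<(n + 255) / 256, 256>>>(da, db, dc, n);
    cudaMemcpy(c, dc, bytes, cudaMemcpyDeviceToHost);
    cudaFree(da); cudaFree(db); cudaFree(dc);

    // The OpenCL version needs a different kernel dialect (__kernel /
    // get_global_id) compiled at runtime, plus its own host calls:
    // clCreateContext, clCreateCommandQueue, clCreateProgramWithSource,
    // clBuildProgram, clCreateKernel, clSetKernelArg, clEnqueueNDRangeKernel.
    // None of the code above carries over as-is, which is why existing
    // CUDA apps have to be ported rather than just recompiled.
}

So "supports OpenCL" won't be a free checkbox for these vendors; it's a second code path they have to write and maintain.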

RapiHD appears to be an nVidia Quadro CX exclusive product - you have to buy the bundle, which is $2,000. Seems a little pricey. Bottom line is that it's very doubtful it would work with an ATi card or a lower-end nVidia card (or that you can even buy it separately).

vReveal seems a little gimmicky - the video previews are small and they don't even have a product available yet.

CyberLink PowerDirector 7 appears to mainly use CUDA/Stream for encoding. Of course, you have to like this program and want to use it to see the benefit. If you do, then it doesn't look like it matters much, as it supports both Stream and CUDA. Which one performs better might be a bit difficult to tell. Check "NVIDIA CUDA performance - Video effect rendering with Cyberlink PowerDirector 7 Ultra"; it looks like for some effects it can boost rendering speeds. Not all of the gains are amazing, but the benefit is there.

At this point CUDA wouldn't be a deal breaker for video editing - unless you have a specific niche or task that would greatly benefit. I think you'd do better investing in a powerful CPU than in a CUDA-enabled GPU. Say I had the choice of upgrading to a high-end i7 CPU vs a Quadro CX - I'd take the high-end i7.
 

nosfe

Senior member
Aug 8, 2007
Well, CS4 applications primarily use OpenGL for graphics acceleration, so most of the graphics cards on the market will be able to accelerate stuff in CS4 (obviously weak cards like a Radeon 9250 or a GeForce 6200 won't be very good at this). Like I said, CUDA is only used in a couple of CS4 applications (I don't know which exactly; I do know that Photoshop isn't one of them), and even among those that use it, it's only used for a couple of filters - good luck finding out which ones exactly :p At the end of the day, GPGPU is very much in its infancy right now, so there's no point in picking sides yet.

Yes, OpenCL will be able to use the power of both ATI and nVidia cards, which is why I personally want it to "win" the GPGPU "wars". But like I said, don't base your buying decisions on this stuff yet - unless, of course, you know that the applications you're working with make use of CUDA/Stream/OpenCL/whatever.
 

Denithor

Diamond Member
Apr 11, 2004
Read the first portion of this page. Focus on the chart there at the top.

ATi cards of this generation feature 160 "full" stream processors that, under the right conditions, can each issue up to five independent instructions at once - accounting for the "800 stream processors" they list. If you look at the chart you'll see how this works: when the instructions are independent of each other they can be packed together and the card acts like it has more processors, but when the instructions depend on one another they are handled in sequence and the card only functions like it has 160 processors.

Compare that to nVidia's G92 architecture (9800GT - 112 SP / 9800GTX - 128 SP) and GT200 architecture (GTX 260 - 192/216 SP / GTX 280 and 285 - 240 SP), where the stream processors "are what they are" - each one handles instructions in sequential order as received, with no packing of multiple instructions.

So it really comes down to how a game's shaders are coded. If the code is entirely sequential, the nVidia gear will win hands down thanks to the higher SP count (192 to 240 vs 160), but when a game is coded with a lot of independent work the 4800-series cards scale well and act like they've got closer to 800 SPs instead of 160 - and under those conditions they will kick the nVidia hardware around pretty convincingly. You can see this kind of performance in a few games where the 4870 smashes the GTX 280 (like here in BioShock).
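
To put the same idea in code terms, here's a purely hypothetical C-style sketch (nothing from an actual game or driver) of the two cases in that chart:

// Case A: five independent multiplies. A VLIW5 shader unit can, in the
// best case, issue all five at once, which is when the "800 SP" figure
// is realistic.
void independentOps(float *out, const float *in, float gain)
{
    out[0] = in[0] * gain;
    out[1] = in[1] * gain;
    out[2] = in[2] * gain;
    out[3] = in[3] * gain;
    out[4] = in[4] * gain;
}

// Case B: a dependent chain - every step needs the previous result, so the
// five ALUs in a VLIW5 unit can't be filled and the card behaves more like
// it has 160 processors. A scalar nVidia-style SP loses nothing here, since
// it only ever issues one operation per clock anyway.
float dependentChain(const float *a, float k)
{
    float x = a[0] * k;
    x = x + a[1];
    x = x * a[2];
    x = x + a[3];
    x = x * a[4];
    return x;
}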
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: SSChevy2001
ATi has 900 shaders, of which only 800 are usable.

CyberLink PowerDirector 7 also supports ATi Stream.

Yeah, Stream didn't get a very good quality write-up. It's fast, but there were reports by reviewers of really bad IQ in the form of unrendered blocks on screen. I'm assuming ATI fixed this? This is why I mentioned Stream fizzling. I haven't heard news of any progress with it. Have you? If so, post it up, man!!

What has 900 shaders with only 800 usable? RV770? First I'm hearing of this. Interesting.
 

SSChevy2001

Senior member
Jul 9, 2008
Originally posted by: keysplayr2003
Originally posted by: SSChevy2001
ATi has 900 shaders, of which only 800 are usable.

CyberLink PowerDirector 7 also supports ATi Stream.

Yeah, Stream didn't get a very good quality write-up. It's fast, but there were reports by reviewers of really bad IQ in the form of unrendered blocks on screen. I'm assuming ATI fixed this? This is why I mentioned Stream fizzling. I haven't heard news of any progress with it. Have you? If so, post it up, man!!

What has 900 shaders with only 800 usable? RV770? First I'm hearing of this. Interesting.
You're talking about the Avivo video converter? From my personal experience the Avivo video converter is a POS, which is why I didn't bother to even mention it. I haven't heard much on it, and personally I doubt it will ever get finished.

I'm actually thinking about picking up PowerDirector 7 to test out, since I can use it with both my 8800GTS 512 and my 4870 1GB.

Yeah, the RV770 has 900 SPs; it would be nice if somehow they could be unlocked.
http://www.guru3d.com/news/ati...900-stream-processors/
 

thilanliyan

Lifer
Jun 21, 2005
Originally posted by: keysplayr2003
Originally posted by: SSChevy2001
ATi has 900 shaders, of which only 800 are usable.

CyberLink PowerDirector 7 also supports ATi Stream.

Yeah, Stream didn't get a very good quality write-up. It's fast, but there were reports by reviewers of really bad IQ in the form of unrendered blocks on screen. I'm assuming ATI fixed this? This is why I mentioned Stream fizzling. I haven't heard news of any progress with it. Have you? If so, post it up, man!!

They seem to be working on it. I hope both companies end up using one standard.
http://techreport.com/discussions.x/16576