
Is it better to go with Nvidia or AMD for Photoshop?


Keysplayr

Elite Member
Jan 16, 2003
So you call something you can't dispute a strawman. Fine with me. Where are your usage ratings? Where are your charts showing usage dropping for CUDA in favor of OpenCL?
Just because one (OpenCL) increases from next to nothing in usage does not mean that usage decreases for the extremely widely used CUDA. Is that what logic tells you? Or do you have charts? Show me those charts and I'll consider them.
And again, a vendor-locked API doesn't matter when the vendor also supports everything else. Call it a strawman all you like. It's an argument you can't overcome no matter how you try or what you call it. Sorry.
And just some reading for you:
http://developer.nvidia.com/cuda-tools-ecosystem
CUDA is widespread and growing every day.

Some more for you:
http://developer.nvidia.com/content/opencl

Strawman..... LOL.
 

nismotigerwvu

Golden Member
May 13, 2004

"A straw man is a type of argument and is an informal fallacy based on misrepresentation of an opponent's position. To "attack a straw man" is to create the illusion of having refuted a proposition by replacing it with a superficially similar yet unequivalent proposition (the "straw man"), and refuting it, without ever having actually refuted the original position."


NismoTigerWVU said:
But you'd have to dig pretty deep to find "proprietary stuff" that was vendor locked and became standard. Just as Glide, RRedline, and all the vendor-specific APIs of the mid '90s gave way to OpenGL and DirectX, so too will CUDA. Sure, there is the possibility of NV opening the standard to Intel and AMD, but that would only happen after it's far too late. You might even be able to make a case for it already being too late with the momentum OpenCL has picked up recently.

Behold the strawman:
Keysplayr said:
What does it matter if it's vendor locked if the vendor also supports everything else?
Last I checked, Nvidia well supports OpenCL and chairs the OpenCL committee.
So thinking that using CUDA is the only way to use Nvidia GPUs would be an error in one's understanding.

Not once did I say that CUDA was the only way (or best, or anything else) to use NV chips. However, you decided to attack this idea rather than the position I actually presented: that vendor-locked APIs tend to give way to vendor-neutral ones. Now, no one can predict the future, but judging by what's happened in the past I'm comfortable with my opinion. That said, it's just that, my opinion, and you are more than entitled to your own, and, if you really felt the need to, you are more than free to refute mine as well. However, you'd actually have to refute what I said, not what would be more convenient for me to have said.

Lastly, we get it, you really like Nvidia, and you know what, that's cool. You're open and honest about this and leave no doubt that any opinion you express has this bias. However, this may actually be counter-productive for what you may be trying to achieve, as little episodes like this ("let's use a strawman and make it look like he thinks CUDA isn't supported/used in industry, LOLZ, see how big of an idiot he is") are highly unlikely to shift anyone else's views more in line with your own. That is, unless you're actually an epic AMD troll, in for the girthy long troll of spending years masquerading as an NV troll to taint their public image. Before you go nuts, that last sentence was only a joke... or was it :D
 

Keysplayr

Elite Member
Jan 16, 2003
You said vendor locked, and I have shown you how that makes no difference.
What you have shown me is a lack of interest in acknowledging that, and the dictionary definition of "strawman".
By you stating that being vendor locked makes a difference, it begs the question "how" when the vendor supports all other methods. This insinuates that being vendor locked renders the device a non-choice. So you didn't have to actually say that CUDA was the only way to use an Nvidia GPU. To me, that is what your prior comments implied.
 

Homeles

Platinum Member
Dec 9, 2011
Keysplayr said:
You said vendor locked, and I have shown you how that makes no difference.
What you have shown me is a lack of interest in acknowledging that, and the dictionary definition of "strawman".
By you stating that being vendor locked makes a difference, it begs the question "how" when the vendor supports all other methods. This insinuates that being vendor locked renders the device a non-choice. So you didn't have to actually say that CUDA was the only way to use an Nvidia GPU. To me, that is what your prior comments implied.
He's only made one comment in this thread.
 

Atreidin

Senior member
Mar 31, 2011
Keysplayr said:
You said vendor locked, and I have shown you how that makes no difference.
What you have shown me is a lack of interest in acknowledging that, and the dictionary definition of "strawman".
By you stating that being vendor locked makes a difference, it begs the question "how" when the vendor supports all other methods. This insinuates that being vendor locked renders the device a non-choice. So you didn't have to actually say that CUDA was the only way to use an Nvidia GPU. To me, that is what your prior comments implied.

Just a helpful tip, since you apparently didn't know what a strawman was, here is also the definition of "begging the question": http://begthequestion.info/
 

Atreidin

Senior member
Mar 31, 2011
Keysplayr said:
You said vendor locked, and I have shown you how that makes no difference.
What you have shown me is a lack of interest in acknowledging that, and the dictionary definition of "strawman".
By you stating that being vendor locked makes a difference, it begs the question "how" when the vendor supports all other methods. This insinuates that being vendor locked renders the device a non-choice. So you didn't have to actually say that CUDA was the only way to use an Nvidia GPU. To me, that is what your prior comments implied.

Also, I'm at a loss as to how you could have possibly interpreted his post that way, unless you didn't read the words and just immediately went into "defend Nvidia" mode.
 

AnandThenMan

Diamond Member
Nov 11, 2004
Also, I'm at a loss as to how you could have possibly interpreted his post that way, unless you didn't read the words and just immediately went into "defend Nvidia" mode.
I knew as soon as I mentioned the value of standards that Keys would respond the way he did. Utterly predictable, but unfortunate, because it ends up adding nothing to the discussion. Again, standards are what brought us here in the first place; there would be no usable Internet without them. I can't understand why anyone would argue against them if they actually cared about technology.

Back to Adobe and Photoshop, Adobe has a nice page up that lists the improvements in CS6.
Mercury Graphics Engine

The Mercury Graphics Engine (MGE) represents features that use video card processor, or GPU, acceleration. In Photoshop CS6, this new engine delivers near-instant results when editing with key tools such as Liquify, Warp, Lighting Effects, and the Oil Paint filter. The new MGE delivers unprecedented responsiveness for a fluid feel as you work.


MGE is new to Photoshop CS6 and uses both the OpenGL and OpenCL frameworks. It does not use the proprietary CUDA framework from nVidia.
I sincerely hope Nvidia fully supports OpenCL and dumps CUDA for good.
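
(A purely illustrative sketch, not Adobe's actual MGE code: the per-pixel work that tools like Liquify or Oil Paint offload can be expressed as an OpenCL C kernel like the one below, and the same source runs on any vendor's OpenCL driver, GPU or CPU, which is why a framework-neutral engine doesn't need CUDA. The kernel name and the gain parameter are made up for the example.)

/* Hypothetical example only -- not Adobe's code. One work-item per pixel:
 * convert to float, scale, and clamp back to 8-bit with saturation. */
__kernel void brighten(__global const uchar4 *src,
                       __global uchar4 *dst,
                       const float gain)
{
    size_t i = get_global_id(0);
    float4 px = convert_float4(src[i]) * gain;
    dst[i] = convert_uchar4_sat(px);
}

A host program would build this with clBuildProgram and launch it with clEnqueueNDRangeKernel over the image's pixel count; that host-side API is identical on Nvidia, AMD, and Intel.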
 

nismotigerwvu

Golden Member
May 13, 2004
I knew as soon as I mentioned the value of standards that Keys would respond the way he did. Utterly predictable, but unfortunate, because it ends up adding nothing to the discussion. Again, standards are what brought us here in the first place; there would be no usable Internet without them. I can't understand why anyone would argue against them if they actually cared about technology.

Back to Adobe and Photoshop, Adobe has a nice page up that lists the improvements in CS6.

I sincerely hope Nvidia fully supports OpenCL and dumps CUDA for good.

No need to beat a dead horse on this one or let it derail further, but I don't know if I'd agree that dropping CUDA support would be a good thing (assuming that's what you mean by dumping it). In my opinion, it would be wise to promote/optimize for OpenCL for mainstream usage, but maintain CUDA support for niche, custom-written work (think academia). That, and it would be a PR nightmare if they burnt those who've adopted the platform. Kudos to Adobe for making their flagship product more desirable and supporting a greater share of the market here.
 

Obsoleet

Platinum Member
Oct 2, 2007
No need to beat a dead horse on this one or let it derail further, but I don't know if I'd agree that dropping CUDA support would be a good thing (assuming that's what you mean by dumping it). In my opinion, it would be wise to promote/optimize for OpenCL for mainstream usage, but maintain CUDA support for niche, custom-written work (think academia). That, and it would be a PR nightmare if they burnt those who've adopted the platform. Kudos to Adobe for making their flagship product more desirable and supporting a greater share of the market here.

So dropping CUDA was a bad thing, but yay for supporting a greater share of the market? I'd say yay for supporting an open standard and dropping the proprietary one completely. Technical functionality being equal, open standards are ALWAYS better than closed; there's no way around it.
That applies to everyone, but for me personally, if I were designing a product I'd pick the open standard over the proprietary one even if it were slightly behind. At least I'm not locked into a standard that a single company could fold tomorrow just by deciding to.

Check the links provided. Read carefully then make a choice based on your needs.

Those links don't detail that Nvidia offers better CS6 OpenCL acceleration. They just say it's "supported". So is Intel integrated.

For CS5, it's CUDA, so nvidia, yeah.
Premiere and CS6 have OpenCL, which only AMD cards have certified support for. There are features in CS6 that are supported by Nvidia cards, but not all of them.
If you read their blogs, it lays it all out.
AMD is best for Adobe CS6.
AMD supports all features including OpenCL.
OpenCL is not supported for nvidia cards in CS6.
So AMD is the way to go.

In my experience, this is true for CS6.
 

nismotigerwvu

Golden Member
May 13, 2004
So dropping CUDA was a bad thing, but yay for supporting a greater share of the market? I'd say yay for supporting an open standard. Technical functionality being equal, open standards are ALWAYS better than closed; there's no way around it.
That applies to everyone, but for me personally, if I were designing a product I'd pick the open standard over the proprietary one even if it were slightly behind. At least I'm not locked into a standard that a single company could fold tomorrow just by deciding to.

Oh man, I forgot to quote the post above mine on that, but I was referring to the quote that went something along the lines of "I hope NV drops CUDA". I was referring to NV dropping support and basically pulling the rug out from underneath anyone using it now. Now, I feel that if NV shifted their emphasis to OpenCL while maintaining (if not necessarily advancing) CUDA, they would be better off (as would the industry as a whole). Open standards gaining traction over vendor-locked incumbents is a great thing.

Edit: Well I guess I did quote it, but hopefully I clarified it a bit better here
 

Obsoleet

Platinum Member
Oct 2, 2007
I hear ya, I get tired of waiting years and years to see things move on though. I see the sentiment but I prefer to cut the snake off at the head as CS6 did. Just end it (proprietary CUDA) now and get the long wait over with. This is a big step towards that.
 

AnandThenMan

Diamond Member
Nov 11, 2004
Dropping CUDA means Adobe not doing any further development on it; they will (and do) still support their CUDA-enabled products, so no rug-pulling needs to occur.
 

nismotigerwvu

Golden Member
May 13, 2004
Dropping CUDA means Adobe not doing any further development on it; they will (and do) still support their CUDA-enabled products, so no rug-pulling needs to occur.

Oh, well in that case, I agree that Adobe dropping support of CUDA is a great thing. I thought you meant NV (at the driver level, for example), which is why I had some pause. Resources spent maintaining legacy code only useful to one vendor are wasted. To be honest, if I were project manager at Adobe (project manager of Photoshop, whoa, wouldn't that be an awesome position), even if NV offered to do all the coding, I think I'd still pull the plug to minimize potential bug issues and to maintain a consistent product.
 

AnandThenMan

Diamond Member
Nov 11, 2004
If you mean CS5, yes. But the rug has been pulled, The Way It Should Be- http://forums.adobe.com/message/4289204
Not sure if I would use that term, but I am happy to see Adobe has embraced OpenCL/GL to the extent they have. What I find really cool is that even on a lower-end notebook you still get the performance improvements. And there is no reason these features won't fully work on Nvidia hardware; in fact, I fully expect them to shortly.
Oh, well in that case, I agree that Adobe dropping support of CUDA is a great thing. I thought you meant NV (at the driver level for example), which is why I had some pause.
No I don't want to see anything like that.
 

Dark Shroud

Golden Member
Mar 26, 2010
And there is no reason these features won't fully work on Nvidia hardware, in fact I fully expect them to shortly.

Actually, if you look at the hardware page, Adobe has certified a lot of Nvidia cards for CS6.

Adobe switching to OpenCL/GL couldn't have come at a better time, with the way GPGPU was stripped from the current GTX 600 cards. So no one has to pay for Quadro/Tesla cards just to get CUDA muscle.
 

gorobei

Diamond Member
Jan 7, 2007
For those that didn't read the Tom's Hardware article on OpenCL with the Russell Williams (Adobe's head of Photoshop science development) interview, he generally indicates that they are pushing OpenCL now with CS6 and even more so with CS7.

OpenGL is fine for anything requiring a vector function, but more of the Photoshop tools will benefit from OpenCL on the CPU/APU due to lower latency, less memory transfer, and the fact that most of the tools don't need the full horsepower of a discrete GPU.

They seem to be very keyed on giving mobile users more OpenCL-accelerated tools, given that they are the ones least likely to have powerful GPUs onboard.

In all likelihood there will be very little difference in GPU OpenCL performance between vendors on Adobe products down the line, since they are probably targeting Intel IGPs as the lowest common denominator.
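
(A rough, purely illustrative sketch of the "less memory transfer" point, not anything from Adobe: OpenCL lets an application ask each device whether it shares memory with the host via CL_DEVICE_HOST_UNIFIED_MEMORY, which is exactly the property that lets an APU/iGPU skip the bus copy a discrete card needs before any kernel runs. Assumes an OpenCL 1.1+ SDK is installed; the wording of the output is made up.)

/* Illustrative sketch only. Devices that report host-unified memory can
 * skip the PCIe copy a discrete GPU needs before any kernel runs, which
 * is why small filters can come out ahead on an APU/iGPU. */
#include <stdio.h>
#include <CL/cl.h>

int main(void)
{
    cl_platform_id plat;
    cl_uint np = 0;
    if (clGetPlatformIDs(1, &plat, &np) != CL_SUCCESS || np == 0)
        return 1;

    cl_device_id devs[8];
    cl_uint nd = 0;
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_ALL, 8, devs, &nd);

    for (cl_uint i = 0; i < nd; i++) {
        char name[256] = {0};
        cl_bool unified = CL_FALSE;
        clGetDeviceInfo(devs[i], CL_DEVICE_NAME, sizeof(name), name, NULL);
        clGetDeviceInfo(devs[i], CL_DEVICE_HOST_UNIFIED_MEMORY,
                        sizeof(unified), &unified, NULL);
        printf("%s: %s\n", name,
               unified ? "shares host memory (no transfer cost)"
                       : "discrete memory (copy over the bus first)");
    }
    return 0;
}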
 

f1sherman

Platinum Member
Apr 5, 2011
I sincerely hope Nvidia fully supports OpenCL and dumps CUDA for good.


Because CUDA is not NVIDIA's main comparative advantage and the very foundation of its hugely successful professional business?

:colbert:

Guess what the green stands for:

[attached image: cuda3a2uje.png]
 

SolMiester

Diamond Member
Dec 19, 2004
nVidia has to be hurting about this. It might be a while before they get up to speed. They'll either have to take resources from elsewhere (CUDA/PhysX?) or add an additional layer of support. We all know how much nVidia likes to promote their own closed standards.

Oh, and if you are using CS5, nVidia might be a bit better. Splitting hairs here, but that is often the case.

??, if he is using CS5 (CUDA), then surely it's a lot better, as AMD doesn't accelerate CS5... no?
 

nismotigerwvu

Golden Member
May 13, 2004
1,568
33
91
f1sherman said:
Because CUDA is not NVIDIA's main comparative advantage and the very foundation of its hugely successful professional business?

:colbert:

Guess what the green stands for:

[attached image: cuda3a2uje.png]

Actually, if you read through the thread, you'd see that he was referring to Photoshop, not CUDA in general.
 

nismotigerwvu

Golden Member
May 13, 2004
OpenCL is run through CUDA on CUDA hardware.

....and that matters because? I fail to see how an added abstraction layer is anything worth caring about. I mean, if you wanted to make a "cuda cuda cuda" comment, you could say that everything runs through "CUDA", since the shader cores are called "CUDA cores" on their cards. There's quite a difference between simply using something and benefiting from using something.
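
(To illustrate the abstraction-layer point with a hypothetical sketch, nothing from Adobe or Nvidia: an OpenCL application only ever calls the vendor-neutral API, so whether NVIDIA's implementation sits on top of the CUDA driver underneath is invisible to it. Assumes an OpenCL SDK and ICD loader are installed; the platform names in the comment are just typical examples.)

/* Sketch: application code talks only to the vendor-neutral OpenCL API;
 * the vendor's back end (CUDA driver or otherwise) never appears here. */
#include <stdio.h>
#include <CL/cl.h>

int main(void)
{
    cl_platform_id platforms[8];
    cl_uint n = 0;
    if (clGetPlatformIDs(8, platforms, &n) != CL_SUCCESS)
        return 1;

    for (cl_uint i = 0; i < n; i++) {
        char name[256] = {0};
        clGetPlatformInfo(platforms[i], CL_PLATFORM_NAME,
                          sizeof(name), name, NULL);
        /* Typically reports names such as "NVIDIA CUDA", "AMD Accelerated
           Parallel Processing", or "Intel(R) OpenCL" -- all reached through
           the exact same calls. */
        printf("Platform %u: %s\n", i, name);
    }
    return 0;
}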
 

f1sherman

Platinum Member
Apr 5, 2011
OpenCL is run through CUDA on CUDA hardware.



Exactly. You can't drop CUDA.

That's like saying AMD should drop GCN because it's proprietary.
The only difference being that Nvidia actually has a well-developed and widely accepted tool set and programming environment.
 

Lorne

Senior member
Feb 5, 2001
With CS5.5 or CS6, OpenCL tries to use all the available resources it's programmed to search for at this time. So as programmers learn how to manipulate the code, updates will improve speeds in specific areas, even in simple rendering.
The CUDA rendering was just Nvidia's first step at this, because their programmers knew where to manipulate the code.

Only single GPUs are supported at this time, but OpenCL opens the ability for programmers to code for multiple GPUs, SLI/CrossFire, and even heterogeneous mixed brands (like the way AMD graphics card users have added an extra lower-end Nvidia card (for you FANBOYS, I mean a lower-end model number, not that Nvidia is lower end) to get the PhysX effects for gaming).

With CS6, just get the fastest card you can afford from whichever brand you like; that's all that matters.
 