Is "compute" different from "graphics processing"?

cbrunny

Diamond Member
Oct 12, 2007
6,791
406
126
I'm not sure that I actually understand this whole Bitcoin thing. I don't really have an interest in Bitcoin mining, except that I hope it goes away soon. The used GPU market is woefully unrealistic for the average gamer because of Bitcoin.

So, what is the difference between "compute" like that which is used for Bitcoin mining, and the processing of graphics or whatever is needed to play games? Are they the same thing? And if not (which is my assumption), why aren't there more mainstream "compute" cards?

It seems to me that Bitcoin mining creates a premium on enthusiast GPUs, when really it is compute cards that should carry this premium instead.

But then again, I don't understand Bitcoin, as I've said. It seems to me it's just printing your own money. Even if it does get harder and harder to acquire a bitcoin, I don't understand how this translates to anything other than people printing their own money.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Yes, it's quite different. Compute support also raises the complexity of the GPUs.

HPC compute cards require high double precision (DP) throughput, while gaming cards mainly run single precision (SP) and have limited DP speed.

A great gaming card can be very fast in gaming, while a fast compute card can be anything, even unable to output graphics at all.
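A quick way to see the SP/DP distinction in code (a CPU-side Python illustration, not GPU code): round-tripping a value through 32-bit single precision loses accuracy that 64-bit double precision keeps.

```python
import struct

x = 0.1  # Python floats are 64-bit doubles internally

# Round-trip through 32-bit single precision (the format gaming cards are fast at)
sp = struct.unpack("f", struct.pack("f", x))[0]

# SP keeps ~7 significant decimal digits, DP ~15-16; HPC codes like fluid
# dynamics often need the latter, which is why DP throughput matters there.
sp_error = abs(sp - x)
```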
 
Feb 25, 2011
16,790
1,469
126
So buy nVidia - the BTC miners are all using AMD cards.

I've always bought whatever seemed like the best price/performance within my budget, AMD or nVidia, and I've never been dissatisfied. Some people get all fanboy-ish, and some people play a specific game that is better optimized for X cards, but overall? Doesn't matter.

The one exception was getting Eyefinity, which was awesome (once I got used to the motion sickness), but nVidia has that now.
 

cbrunny

Diamond Member
Oct 12, 2007
6,791
406
126
Hmm... interesting. I don't have a particular issue with nVidia. But I've recently upgraded to Eyefinity and have a woefully underpowered GPU as a result (7970 & FX-8350). I've been considering pairing a 7970, 7950, or 280X with it to give it a bit more life, but pricing is discouraging. With all the talk about Bitcoin's value & the subsequent rush to mine, it has me wondering about these sorts of things.
 

NTMBK

Lifer
Nov 14, 2011
10,237
5,020
136
If I remember correctly, AMD cards are good for cryptocurrency mining because they have better integer performance than NVidia cards. "Mining" is basically just mangling integer hashes all day long, so it loves that stuff.

Traditional "compute" applications are HPC stuff, like fluid dynamics, and they tend to go in for lots of double precision floating point arithmetic. Very different thing.

(I do wonder how well a Xeon Phi would do at Litecoin mining...)
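To make "mangling integer hashes all day long" concrete, here is a toy proof-of-work sketch in Python. The header bytes and difficulty are made up for illustration and this is not the real Bitcoin block format; only the double-SHA256-until-below-target loop matches how mining actually works:

```python
import hashlib

def double_sha256(data: bytes) -> bytes:
    # Bitcoin hashes everything twice with SHA-256
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def mine(header: bytes, difficulty_bits: int, max_nonce: int = 1_000_000):
    # Search for a nonce whose digest, read as a 256-bit integer,
    # falls below the target (i.e. starts with `difficulty_bits` zero bits).
    target = 1 << (256 - difficulty_bits)
    for nonce in range(max_nonce):
        digest = double_sha256(header + nonce.to_bytes(4, "little"))
        if int.from_bytes(digest, "big") < target:
            return nonce, digest
    return None

# Toy header and a low difficulty so this finishes instantly on a CPU
result = mine(b"toy block header", difficulty_bits=12)
```

Real hardware does nothing cleverer than this loop, just trillions of iterations per second, which is why raw integer hash throughput is all that matters.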
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
So, what is the difference between "compute" like that which is used for Bitcoin mining, and the processing of graphics or whatever is needed to play games?
Almost nothing. As it turns out, all the math processing required for 3D graphics is also useful in a general compute context. As a result with a small number of modifications you can take a GPU design intended for games and use it for compute workloads. Or more specifically, because the tasks are so similar, you can design a chip that excels at both at the same time.

At the end of the day it's all just math; piles and piles of highly parallel math.
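A toy illustration of that point (my own example, not from the post): the dot product at the heart of a diffuse lighting shader is the same operation a compute kernel might run over arbitrary data.

```python
def dot(a, b):
    # The same 3-component dot product shows up in both workloads
    return sum(x * y for x, y in zip(a, b))

# "Graphics": Lambertian diffuse lighting = dot(surface normal, light direction)
normal = (0.0, 1.0, 0.0)
light_dir = (0.0, 1.0, 0.0)
brightness = max(0.0, dot(normal, light_dir))  # surface facing the light: fully lit

# "Compute": identical math applied to arbitrary data, e.g. a similarity score
a = (1.0, 2.0, 3.0)
b = (4.0, 5.0, 6.0)
score = dot(a, b)
```

A GPU just runs this kind of arithmetic across thousands of pixels or data elements in parallel; the hardware doesn't care which one it is.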

And if not (which is my assumption), why aren't there more mainstream "compute" cards?
Well, there are and there aren't. There are in the sense that you can do compute work on any modern graphics card; as the Bitcoin guys have shown, you can buy a "consumer" level 7970 and get the job done. But if you're talking about professional cards with more specialized features, the reason comes down to what the market will bear. Professional compute cards are expensive because their user base is willing to pay $2000+, and they're willing to do that because the performance per dollar they're getting is in many cases unrivaled. From a development standpoint these cards do cost more to put together - the hardware and software validation is much deeper and more expensive than what's done for the consumer cards - so there are additional costs, but in the long run the profit margin on compute cards is much higher than on most consumer cards. Right now there's no need to bring prices down lower; a thousand bucks for a GTX Titan is already a steal for most buyers.
 

hyrule4927

Senior member
Feb 9, 2012
359
1
76
If I remember correctly, AMD cards are good for cryptocurrency mining because they have better integer performance than NVidia cards. "Mining" is basically just mangling integer hashes all day long, so it loves that stuff.

Traditional "compute" applications are HPC stuff, like fluid dynamics, and they tend to go in for lots of double precision floating point arithmetic. Very different thing.

(I do wonder how well a Xeon Phi would do at Litecoin mining...)

"Secondly, another difference favoring Bitcoin mining on AMD GPUs instead of Nvidia's is that the mining algorithm is based on SHA-256, which makes heavy use of the 32-bit integer right rotate operation. This operation can be implemented as a single hardware instruction on AMD GPUs (BIT_ALIGN_INT), but requires three separate hardware instructions to be emulated on Nvidia GPUs (2 shifts + 1 add). This alone gives AMD another 1.7x performance advantage (~1900 instructions instead of ~3250 to execute the SHA-256 compression function)."

Source
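The rotate operation the quote describes can be sketched in Python. `rotr` below is a generic 32-bit right rotate, and `big_sigma0` is one of SHA-256's actual Sigma functions, which calls it three times:

```python
MASK32 = 0xFFFFFFFF

def rotr(x: int, n: int) -> int:
    # 32-bit right rotate: one hardware instruction on AMD (BIT_ALIGN_INT),
    # but emulated as two shifts plus a combine on the Nvidia GPUs discussed
    return ((x >> n) | (x << (32 - n))) & MASK32

def big_sigma0(x: int) -> int:
    # SHA-256's Sigma0 uses three rotates per call; the compression function
    # runs dozens of such calls per 64-byte block, so the per-rotate
    # instruction count directly scales mining throughput.
    return rotr(x, 2) ^ rotr(x, 13) ^ rotr(x, 22)
```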
 

Emulex

Diamond Member
Jan 28, 2001
9,759
1
71
Nobody is using GPUs for mining any more. A dedicated SHA-256 USB board can smoke any video card by a long shot.

For the same power footprint you can do 100x more SHA-256 than any AMD/nVidia card.

Serious mining operations have the equivalent of 100,000x the best AMD GPU in a rack. Seriously.

Also, consumer cards are too highly powered and will pretty much self-destruct if you run them on compute tasks. Kind of like how consumer SATA SSDs are not rated for 24x7 100% utilization.

Some of those nuts are overclocking GTX 480s at TWICE the voltage of the Quadro 6000 I have, which runs at 0.9v across all power levels (folks run 480s at 1.7v) - that's 4x more thermal output than the Quadro. You just can't run that 24x7 and expect it to last a long time.

Compute also needs a ton of CPU->GPU bandwidth, whereas gaming tends to preload textures. Compute cards have 4-12GB per core, up to 4 cores, to handle monstrous compute and render workloads (even if a card has no video out, it can drive another video card for screen output).

Nobody in their right mind would mine Bitcoin with a GPU. You might as well piss money into the wind.
 

bononos

Diamond Member
Aug 21, 2011
3,889
158
106
It all comes down to: AMD for bitcoining, nVidia for gaming. 'Nuff said.

This is totally false and inflammatory. AMD has had a better price/performance ratio for gamers for a while now, even when uneven frame latency was an issue.
 

bononos

Diamond Member
Aug 21, 2011
3,889
158
106
......

Well, there are and there aren't. There are in the sense that you can do compute work on any modern graphics card; as the Bitcoin guys have shown, you can buy a "consumer" level 7970 and get the job done. But if you're talking about professional cards with more specialized features, the reason comes down to what the market will bear. Professional compute cards are expensive because their user base is willing to pay $2000+, and they're willing to do that because the performance per dollar they're getting is in many cases unrivaled. From a development standpoint these cards do cost more to put together - the hardware and software validation is much deeper and more expensive than what's done for the consumer cards - so there are additional costs, but in the long run the profit margin on compute cards is much higher than on most consumer cards. Right now there's no need to bring prices down lower; a thousand bucks for a GTX Titan is already a steal for most buyers.

Isn't there a significant difference between AMD and Nvidia in double precision/compute performance? I remember a lot of moaning and groaning about how Kepler was crippled so that it performed worse than the Fermi generation, for market segmentation.
Titan gave improved DP/compute, but it still performs somewhere around the 7970 GE.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Isn't there a significant difference between AMD and Nvidia in double precision/compute performance? I remember a lot of moaning and groaning about how Kepler was crippled so that it performed worse than the Fermi generation, for market segmentation.
Titan gave improved DP/compute, but it still performs somewhere around the 7970 GE.
It depends on whether we're talking about consumer cards or professional cards. Consumer-level Kepler cards (GK104/106/107) weren't engineered for high double precision performance, as a die space tradeoff. They were in a sense dedicated consumer parts, since most (but not all) professional compute workloads are double precision, whereas consumer workloads are single precision.

Otherwise the performance of Kepler versus anything else depends largely on the algorithms. Compared to Fermi there were some scenarios where consumer Kepler regressed, particularly anything where it was difficult to extract ILP (instruction-level parallelism).